The End of Rendering: Zoic Studios’ Aaron Sternlicht on Realtime Engines in VFX Production

Originally published on I Design Your Eyes on 1/6/2010.

Zoic created this Killzone 2 commercial spot entirely within the Killzone 2 engine.

The technology available to produce computer graphics is approaching a new horizon, and video games are part of the equation.

Creators in 3D animation and visual effects are used to lengthy, hardware-intensive render times for the highest quality product. But increasingly, productions are turning to realtime rendering engines, inspired by the video games industry, to aid in on-set production and to create previz animations. Soon, even the final product will be rendered in realtime.

Aaron Sternlicht, Zoic Studios’ Executive Producer of Games, has been producing video game trailers, commercials, and cinematics since the turn of the millennium. He has charted the growth of realtime engines in 3D animation production, and is now part of Zoic’s effort to incorporate realtime into television VFX production, using the studio’s new ZEUS pipeline (read about ZEUS here).

Sternlicht explains how realtime engines are currently used at Zoic, and discusses the future of the technology.

“The majority of what we do for in-engine realtime rendering is for in-game cinematics and commercials. We can take a large amount of the heavy lifting in CG production and put it into a game engine. It allows for quick prototyping, and allows us to make rapid changes on-the-fly. We found that changing cameras, scenes, set-ups, even lighting can be a fraction of the workload that it is in traditional CG.

“Right now, you do give up some levels of quality, but when you’re doing something that’s stylized, cel-shaded, cartoonish, or that doesn’t need to be on a photo-realistic level, it’s a great tool and a cost-effective one.

We’re going to be able to radically alter the cost structures of producing CG.

“Where we’re heading though, from a production standpoint, is being able to create a seamless production workflow, where you build the virtual set ahead of time; go to your greenscreen and motion capture shoot; and have realtime rendering of your characters, with lighting, within the virtual environment, shot by a professional DP, right there on-set. You can then send shots straight from the set to Editorial, and figure out exactly what you need to focus on for additional production — which can create incredible efficiencies.

“In relation to ZEUS, right now with [ABC’s sci-fi series] V, we’re able to composite greenscreen actors in realtime onto CG back plates that are coming straight out of the camera source. We’re getting all the camera and tracking data and compositing in realtime, right there. Now if you combine that with CG characters that can be realtime, in-engine rendered, you then can have live-action actors on greenscreen and CG characters fully lit, interacting and rendered all in realtime.
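To make the idea concrete, here is a minimal, hedged sketch of the kind of chroma-key composite being described: pulling a matte from a greenscreen plate and laying the actor over a CG backplate. It is illustrative Python/NumPy only, with invented function names and a deliberately crude key; it is not Zoic’s ZEUS implementation.

```python
# A crude greenscreen key and over-composite, for illustration only.
# Frames are float RGB arrays in [0, 1]; real pipelines key in realtime on the GPU.
import numpy as np

def green_screen_matte(fg, spill_threshold=0.15):
    """Estimate an alpha matte from how much green dominates red and blue."""
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    green_excess = g - np.maximum(r, b)              # large where the screen shows
    return 1.0 - np.clip(green_excess / spill_threshold, 0.0, 1.0)[..., np.newaxis]

def composite(fg, cg_backplate):
    """Key the greenscreen plate and lay it over the CG background."""
    alpha = green_screen_matte(fg)
    despilled = fg.copy()
    despilled[..., 1] = np.minimum(fg[..., 1],       # suppress green spill on the actor
                                   np.maximum(fg[..., 0], fg[..., 2]))
    return alpha * despilled + (1.0 - alpha) * cg_backplate

# Stand-in 1080p plates; in practice these would be the camera feed and the CG render.
fg_plate = np.random.rand(1080, 1920, 3).astype(np.float32)
cg_plate = np.random.rand(1080, 1920, 3).astype(np.float32)
frame = composite(fg_plate, cg_plate)
```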

“People have been talking about realtime VFX for the last 15 years, but now it’s something you’re seeing actually happening. With V we have a really good opportunity. We’re providing realtime solutions in ways that haven’t been done before.

“Now there’s been a threshold to producing full-CG episodic television. There has been a lot of interest in finding a solution to generate stylized, high-quality CG that can be produced inexpensively, or at least efficiently. A process that allows someone to kick out 22 minutes of scripted full-CG footage within a few weeks of production is very difficult to achieve right now, within budgetary realities.

“But with in-engine realtime productions, we can get a majority of our footage while we’re actually shooting the performance capture. This is where it gets really exciting, opening an entire new production workflow, and where I see the future of full CG productions.”

What game-based engines has Zoic used for realtime rendering?

“We’ve done several productions using the Unreal 3 engine. We’ve done productions with the Killzone 2 engine as well. We’re testing out different proprietary systems, including StudioGPU’s MachStudio Pro, which is being created specifically with this type of work in mind.

“If you’re doing a car spot, you can come in here and say ‘okay, I want to see the new Dodge driving through the salt flats.’ We get your car model and transfer that to an engine, in an environment that’s lit and realtime rendered, within a day. We even hand you a camera that a professional DP can actually shoot with on-site here, and you can produce final-quality footage within a couple of days. It’s pretty cool.”

How has the rise of realtime engines in professional production been influenced by the rise of amateur Machinima?

“I’ve been doing game trailers since 2000. I’ve been working with studios to design toolsets for in-game capture since then as well. What happened was, you had a mixture of the very apt and adept gamers who could go in and break code, or would use, say, the Unreal 2 engine to create their own content. Very cool, very exciting.

“Concurrently, you had companies like Electronic Arts, and Epic, and other game studios and publishers increasing the value of their product by creating tool sets to let you capture and produce quality game play — marketing cameras that are spline-based, where you can adjust lighting and cameras on-the-fly. This provided a foundation of toolsets and production flow that has evolved into today’s in-engine solutions.”

It’s truly remarkable how the quality level is going up in realtime engines, and where it’s going to be in the future.

How has this affected traditional producers of high-end software?

“It hasn’t really yet. There’s still a gap in quality. We can’t get the quality of a mental ray or RenderMan render out of a game engine right now.

“But the process is not just about realtime rendering, but also realtime workflow. For example, if we’re doing an Unreal 3 production, we may not be rendering in realtime. We’ll be using the engine to render, but instead of 30 or 60 frames a second, we may render one frame every 25 seconds, because we’re using all the CPU power to render out that high-quality image. That said, the workflow is fully realtime, where we’re able to adjust lighting, shading, camera animation, tessellation, displacement maps — all realtime, in-engine, even though the final product may be rendering out at a non-realtime rate.

“Some of these engines, like StudioGPU’s MachStudio Pro, are rendering out passes. We actually get a frame-buffered pass system out of the engine, so we can do secondary composites.
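Sternlicht’s point about a frame-buffered pass system can be illustrated with a small sketch: the engine writes separate passes per frame, and a secondary composite rebuilds the beauty image while grading each pass independently. The pass names and the simple additive recombination below are assumptions made for illustration, not StudioGPU’s actual output format.

```python
# Recombine per-frame render passes into a beauty image, with per-pass gains,
# so a look can be adjusted in the composite without re-rendering. Illustrative only.
import numpy as np

def recombine(passes, gains=None):
    """Sum named render passes, applying an optional gain to each."""
    gains = gains or {}
    beauty = np.zeros_like(next(iter(passes.values())))
    for name, layer in passes.items():
        beauty += gains.get(name, 1.0) * layer
    return np.clip(beauty, 0.0, 1.0)

h, w = 1080, 1920
passes = {name: np.random.rand(h, w, 3).astype(np.float32) * 0.25
          for name in ("diffuse", "specular", "reflection", "ambient")}

# Secondary composite: pull the specular down by half without touching the 3D scene.
graded = recombine(passes, gains={"specular": 0.5})
```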

“With the rise of GPU technology, it’s truly remarkable how the quality level is going up in realtime engines, and where it’s going to be in the future. Artists, rather than waiting on renders to figure out how their dynamic lighting is working, or how their subsurface scattering is working, will dial that in, in realtime, make adjustments, and never actually have to render to review. It’s really remarkable.”

So how many years until the new kids in VFX production don’t even know what “render time” means?

“I think we’re talking about the next five years. Obviously there will be issues of how far we can push this and push that; and we’re always going to come up with something that will add one more layer to the complexity of any given scene. That said, yes, we’re going to be able to radically alter the cost structures of producing CG, and very much allow it to be much more artist-driven. I think in the next five years… It’s all going to change.”

Read Zoic Studios’ ZEUS: A VFX Pipeline for the 21st Century.

Zoic Presents: The Creatures of ‘Fringe’ – Part 2

Originally published on I Design Your Eyes on 12/24/09.


This is the second part of a two-part interview with Zoic Studios senior compositor Johnathan R. Banta, about creatures designed for the Fox sci-fi drama Fringe. Be sure to read part one.

The Lionzard (from episode 1:16, “Unleashed”)

In this first-season episode, anarchists opposed to animal testing ransack a research laboratory, but get more than they bargained for when they unleash a ferocious transgenic creature. Later, Walter faces off against the creature in the sewers.

Banta says, “It was a lion-lizard combination, a chimera of a bunch of different creatures created in a lab. This also went through the ZBrush pipeline. There were no maquettes done for this particular one.

“This was a full-digital creature; luckily it did not interact too tightly with any of the actors. It was rigged up and had a muscle system that allowed for secondary dynamics. The textures and displacement maps were painted locally. There was some post lighting to add extra slime, with everything done inside the composite.

“It was actually very straightforward in its approach. The challenge of course was getting it to be lit properly and integrated in the shot. Compositing was a heavy challenge, as there was a lot of haze on the set, a lot of lens flares – not direct flares, but gradients from different lights and so forth. We did our best to match the color space of the original photography. I think it was very effective.

“Another challenge was the bits of slime; it had to have slobber coming off of it. So we actually shot some practical elements; we did some digital cloth elements, a combination of things.”


The Hand (from episode 1:12, “The No-Brainer”)

A seventeen-year-old is working at his computer and chatting on the phone, when a mysterious computer program executes. Strange images flash before his eyes, and the teen is drawn in, mesmerized. Something protrudes from the middle of the screen and impossibly takes the form of a hand. The unearthly appendage reaches forward without warning and grasps his face.

Banta explains: “This boy spends a little too much time on the computer, and a hand reaches out of the computer, grabs his face, and begins to jostle him around and melt his brain. Which is not unlike my experience as a youth.

“We made a series of maquettes and we photographed them, just different positions of the hand coming out; and we composited them into a couple of shots. At the same time the animation was being worked on in CG, so we could start previsualizing it and then composite it.

“A cloth simulation was used for the screen. The hand was coming out, and we would create several different morph targets based on that cloth simulation. There was a bone rig in there, so we could animate it grabbing the kid’s head. That’s some very effective work, especially when projecting the textures on. The side view of the hand coming out of the monitor is one of my favorite shots.
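As a rough illustration of the morph-target approach Banta describes, here is a minimal blendshape mix: each target stores per-vertex offsets from the base mesh, and animated weights blend them in. The vertex data, target names, and weights below are invented; in the actual work the targets came from the cloth simulation.

```python
# Minimal morph-target (blendshape) mixing; all data here is invented for illustration.
import numpy as np

def apply_morph_targets(base_verts, targets, weights):
    """Deformed verts = base + sum_i weight_i * (target_i - base)."""
    deformed = base_verts.copy()
    for name, target_verts in targets.items():
        deformed += weights.get(name, 0.0) * (target_verts - base_verts)
    return deformed

base = np.zeros((4, 3))                                # a tiny stand-in "mesh"
targets = {
    "screen_bulge": base + np.array([0.0, 0.0, 0.2]),  # push vertices forward
    "knuckles_out": base + np.array([0.0, 0.1, 0.3]),
}
frame_weights = {"screen_bulge": 1.0, "knuckles_out": 0.35}  # keyframed per frame
print(apply_morph_targets(base, targets, frame_weights))
```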

“What they had on set was a monitor made of plastic, and a greenscreen fabric with a slot in it [where the screen would be] – and they had some poor guy in a greenscreen suit shove his hand through and grab the kid on the head, and the kid wiggled around.

“So we had to paint back and remove the actor, whenever he was touching the kid; otherwise we would use a clean plate. But whenever he was touching the young actor, we would remove that hand and replace it.

“They were also flashing an interactive light on the young actor that was not accurate to what we were rendering. When the hand got close it would actually light up his face, because the hand was illuminated with television images. So we came up with a way of match-moving his animation, and using that to relight his performance. We had to match his animation for the hand to interact with him, but we also used that match move to relight his performance.”


The Tentacle Parasite (from episode 2:09, “Snakehead”)

A wet, shivering man frantically combs the streets of Boston’s Chinatown. Gaining refuge, he suffers incredible stomach pains. His rescuer puts on heavy gloves and uses shears to cut his shirt away. The man’s abdomen is distended and wriggling as something crawls around inside him. A squid-like parasite crawls out of the man’s mouth, and the rescuer retrieves it.

“Recently we just did yet another thing coming out of a poor guy’s mouth,” Banta says. “This time it wasn’t just a nice little potato-shaped slug; it was long and tentacled, had sharp bits and just looked pretty nasty to have shoved down your throat.”

But there was an additional challenge on this effect. “You were seeing the creature moving underneath the actor’s skin; the actor’s shirt was off, and he was wiggling around on the ground as he probably would if this were happening, like a dead fish. He was shifting all over the place, his skin was moving all over the place, and we had to actually take full control of that.

“So we did a match move. We went to our performance transfer system, which essentially takes tracking information from the original plate and assigns it to the match move. There are no specific camera set-ups; it’s just whatever they give us, and we grab every bit of information from the plate that we can, and use that to modify the 3D performances. These were then projected onto animation that we used to distend the belly and so forth, and up into the throat.
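The “performance transfer” system Banta mentions is proprietary, but the general idea can be sketched: per-frame 2D tracker motion from the plate nudges nearby vertices of the match-moved mesh so the CG deformation stays glued to the actor’s shifting skin. Everything below, including the falloff and the direct use of screen-space deltas, is an assumption made for illustration, not Zoic’s actual tool.

```python
# Conceptual sketch: offset match-moved vertices by nearby 2D tracker deltas.
# This is a simplification (screen-space deltas applied directly in camera space).
import numpy as np

def transfer_track_deltas(verts_2d, verts_3d, trackers, radius=0.1, gain=1.0):
    """Nudge 3D verts toward where the plate's trackers say the skin moved."""
    out = verts_3d.copy()
    for track_pos, track_delta in trackers:                 # 2D position and its frame delta
        dist = np.linalg.norm(verts_2d - track_pos, axis=1)
        weight = np.clip(1.0 - dist / radius, 0.0, 1.0)     # linear falloff around the tracker
        out[:, :2] += gain * weight[:, None] * track_delta
    return out

verts_2d = np.random.rand(500, 2)                           # projected mesh vertices
verts_3d = np.random.rand(500, 3)                           # match-moved mesh, camera space
trackers = [(np.array([0.5, 0.5]), np.array([0.01, -0.02]))]
deformed = transfer_track_deltas(verts_2d, verts_3d, trackers)
```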

“The creature had 18 tentacles. Ray Harryhausen, when he did an octopus, decided to take two of the tentacles off, because then he wouldn’t have to animate those and it would take less time. We didn’t have that luxury. There was no way to procedurally animate these things, and it had to interact with the guy’s face. So we had the exact same challenge we had with the slug coming out of the mouth: we had to take this actor and pull his face apart as well, and make his lips go wider. But this actor was moving a lot more, so the performance transfer and animation tracking was more challenging.

“But I’m very pleased with the results. We used fabric simulations for the different bits of slime again.”


Razor Butterflies (from episode 1:09, “The Dreamscape”)

A young executive arrives late to give a presentation. After he has finished and the boardroom empties, he collects his things, and spots a butterfly. It alights on his finger — and unexpectedly cuts him. The insect flutters by his neck — and cuts him again. After attacking a few more times, the creature disappears into an AC vent. The man peers into the vent just as a swarm of butterflies pours out. They surround him, cutting him all over his body — he runs in a mad panic, crashing through a plate glass window and falling to his death.

Banta says, “We tracked every camera in the scene and laid it out into one common environment, so we could reuse any lighting at any point in the scene. That gave us the ability to put the flock of razor-winged butterflies into the appropriate spot.

“A big challenge on its own was volume: controlling and dictating the flocking behavior, so the swarm would follow the actor, intersect with him in the appropriate parts and not intersect in others, and eventually chase him through the window where he would fall to his horrible demise.
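Flocking of this kind is classically handled with boids-style steering (separation, alignment, cohesion) plus a goal force pulling the swarm toward a target. The sketch below shows that pattern with a target standing in for the actor; the parameters and the simple global-neighborhood rules are assumptions, not the setup actually used on the episode.

```python
# Boids-style flocking with a goal force; parameters are arbitrary for illustration.
import numpy as np

def step_flock(pos, vel, goal, dt=1 / 24, sep_radius=0.5, max_speed=4.0):
    """Advance butterfly positions and velocities by one frame."""
    center, avg_vel = pos.mean(axis=0), vel.mean(axis=0)
    accel = 0.5 * (center - pos)                  # cohesion: drift toward the swarm
    accel += 0.3 * (avg_vel - vel)                # alignment: match neighbors' heading
    for i in range(len(pos)):                     # separation: push away from close neighbors
        diff = pos[i] - pos
        dist = np.linalg.norm(diff, axis=1)
        near = (dist > 1e-5) & (dist < sep_radius)
        accel[i] += (diff[near] / dist[near, None] ** 2).sum(axis=0)
    accel += 1.0 * (goal - pos)                   # goal: chase the actor
    vel = vel + accel * dt
    speed = np.linalg.norm(vel, axis=1, keepdims=True)
    vel = np.where(speed > max_speed, vel / speed * max_speed, vel)
    return pos + vel * dt, vel

pos, vel = np.random.randn(200, 3), np.zeros((200, 3))
actor = np.array([5.0, 1.5, 0.0])
for _ in range(48):                               # two seconds at 24 fps
    pos, vel = step_flock(pos, vel, actor)
```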

“There was one close-up of a butterfly resting on his finger — it flew into frame and landed, it was brilliant – that was pretty straightforward in its execution. More often than not the hard part was controlling the sheer number of flocking butterflies, especially given our standard turnaround time.”

Banta is thrilled to be creating otherworldly monsters for JJ Abrams’ Fringe. “I like doing these creatures; I hope we get to do more!”

Read Part 1

Ripomatics and Animatics: Storyboards for the 21st Century

Originally published on I Design Your Eyes on 12/11/09.

A screenshot of a “test” animatic produced by Zoic.

In the beginning was the storyboard, a series of illustrations displayed in sequence to pre-visualize a screenplay or teleplay, and to map out such elements as camera moves, blocking and effects. The modern storyboard was pioneered by one of the entertainment industry’s greatest innovators, Walt Disney, specifically for traditional cel animation. But the technique soon moved into feature film production, and later television, commercials, interactive media and video games — even web site design.

The next evolution in previsualization also came from animation. An animatic is a series of storyboard illustrations arranged on film or video, incorporating timing, simple movement, and sometimes dialogue and music. By making editing and story decisions at the animatic stage, animators can avoid the wasteful process of animating scenes that would eventually have been edited down or cut entirely.

More recently, ripomatics have evolved to help filmmakers design and express the look and feel of a project before any shooting or animating takes place. Originally developed in the commercial production industry, ripomatics are like animatics, but assembled from elements of previous films, television shows, and commercials; plus still images and other preexisting assets. A ripomatic for a television commercial might be composed entirely of clips from other commercials for similar products, combined with new music and messaging. They are often used to pitch projects to clients.

Zoic Studios is pioneering the next phase in storyboard evolution, offering a new kind of animated storyboard that lives halfway between existing animatics or ripomatics and a full 3D animated previsualization.

Zoic Studios compositor Levi Ahmu says, “Ripomatics were originally designed to make a moving storyboard. And when I got here [to Zoic], I thought it would be cool if we could enhance it a little bit.

A screenshot of an animatic created by Zoic for a commercial for Guerrilla Games’ Killzone 2, entitled “Bullet.”

“The problem with storyboards and making them move [is] the storyboard is very flat. By cutting up the storyboard into layers, you can give 3D motion to it, which is what you’re eventually going to be doing anyway. It gives artists and clients a better sense of what’s going to happen. It also helps you time things out better; you have actual motion in the storyboards, so you can get a more relative frame count of what the product will be.”

But even these animatics gave only what Ahmu calls a “vague representation” of the final product. “So what we ended up doing was creating these 3D environments in a 2D setting. We’re taking 2D cards and arranging them so they’ll represent a room or a street or any kind of environment; then having a virtual camera move through that environment. You can take the 2D actors from the storyboard and put them in this environment; and the advantage of doing it this way is you’ll be able to have a [virtual] camera, with lens properties and animation curves that are more easily equated to what the 3D artists will wind up having to do.
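The “2.5D” trick Ahmu describes boils down to placing flat cards at different depths and moving a virtual camera, so that nearer cards shift more than distant ones. Here is a small sketch of that parallax using a simple pinhole projection; the card depths, focal length, and card names are made up, and the actual work is done with After Effects’ 3D layers rather than hand-rolled code.

```python
# Pinhole projection of storyboard "cards" at different depths; a sideways camera
# move produces more parallax on near cards than far ones. Values are illustrative.
import numpy as np

def project(point_3d, cam_pos, focal_mm=35.0, sensor_mm=36.0):
    """Project a camera-space 3D point to normalized screen coordinates."""
    x, y, z = point_3d - cam_pos
    scale = focal_mm / sensor_mm
    return np.array([scale * x / z, scale * y / z])   # shift falls off with depth

cards = {
    "background_building": np.array([2.0, 1.0, 50.0]),
    "midground_car":       np.array([1.0, 0.0, 12.0]),
    "foreground_actor":    np.array([0.5, 0.0,  4.0]),
}
for cam_x in (0.0, 0.5, 1.0):                          # dolly the camera sideways
    cam = np.array([cam_x, 0.0, 0.0])
    print(cam_x, {name: project(p, cam) for name, p in cards.items()})
```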

“It’s all being done in Adobe After Effects, which is not at all what the software makers were intending. But the cool thing about doing it in After Effects is that you can put in particles, stuff you would never get in traditional previz, that enhances the experience.”

Some more elaborate ripomatics prepared by Zoic have included 3D vehicle models composed from 2D drawings; rough motion capture; and dialogue, sound effects and music.

Zoic executive producer Aaron Sternlicht, head of the studio’s Games Division, has supervised Ahmu in the production of a number of advanced ripomatics for a variety of clients over the last several years.

A screenshot of a ripomatic created for Pandemic Studios’ The Saboteur.

“It’s kind of like a 2½D ripomatic or animatic,” Sternlicht says. “We actually do all of our storyboards so that they’re laid out in layers, which actually allows us to get into production a lot more easily. We’re able to have an edit that is exciting, entertaining and really good to look at, for our clients to view within a few days, as opposed to having a rudimentary gray-shaded previz or just edited storyboards.

“The big reason we like working this way is that we’re able to have clients pretty much sign off on shot design, composition and pacing of camera work in 2D before we ever go to 3D. That allows us to be a lot more efficient once we go to 3D, and [to] give our artists a real clear path of what they’re supposed to be doing once we start building the scenes. So it’s a tremendous tool for us.

“Clients love it because they quickly get to see a massive leap from looking at storyboards to really understanding what the quality of the piece is going to be, the timing, and how exciting it might end up being. So we’re pretty psyched by the whole process.”

Ahmu agrees that clients are benefiting from the new technique. “As opposed to a traditional previz, which is all gray-shaded, and doesn’t have very much ambiance to it, a ripomatic the way we’ve been doing it can have stylized textures, rough animation, that will get the point across in such a way that it’s not like previz where it’s the first step. This is our goal, to have this motion, with these effects on top of it. You can get a rough idea of what the whole thing is supposed to be.”

Another screenshot of a “test” animatic produced by Zoic.

Sternlicht is quick to point out that advanced ripomatics not only better represent the final product, but also save both Zoic Studios and its clients time and money. Even a complex animatic composed of multiple, animated elements can be produced in only a few days. And because the client is able to sign off on so many elements of the final product while still in the 2D stage, Zoic saves time and effort, and can pass that savings along to the client.

Zoic has applied the technique to video game and commercial projects, and plans to offer advanced ripomatics to its feature film and television clients where appropriate. “We have just had more opportunities for video games to implement it,” Sternlicht explains, “because we often are responsible for direction and creative.

“I think it’s already being used [in TV and feature work]. The technique we’re using is a little more advanced than what is commonly done. But we’re really pushing our ripomatics more towards motion comics, than necessarily your standard edited storyboard. So, full animation of characters, full animation of vehicles, full animation of camera, full animation of effects. It’s really kind of the whole package.

“It’s part of our service. It’s part of working with Zoic and being creative.”