The End of Rendering: Zoic Studios’ Aaron Sternlicht on Realtime Engines in VFX Production

Originally published on IDesignYourEyes on 1/6/2010.

Zoic created this Killzone 2 commercial spot entirely within the Killzone 2 engine.

The technology available to produce computer graphics is approaching a new horizon, and video games are part of the equation.

Creators in 3D animation and visual effects are used to lengthy, hardware-intensive render times for the highest-quality product. But increasingly, productions are turning to realtime rendering engines inspired by the video game industry to aid on-set production and to create previz animations. Soon, even the final product will be rendered in realtime.

Aaron Sternlicht, Zoic Studios’ Executive Producer of Games, has been producing video game trailers, commercials, and cinematics since the turn of the millennium. He has charted the growth of realtime engines in 3D animation production, and is now part of Zoic’s effort to incorporate realtime into television VFX production, using the studio’s new ZEUS pipeline (see “Zoic Studios’ ZEUS: A VFX Pipeline for the 21st Century,” linked below).

Sternlicht explains how realtime engines are currently used at Zoic, and discusses the future of the technology.

“The majority of what we do for in-engine realtime rendering is for in-game cinematics and commercials. We can take a large amount of the heavy lifting in CG production and put it into a game engine. It allows for quick prototyping, and lets us make rapid changes on the fly. We’ve found that changing cameras, scenes, set-ups, even lighting can be a fraction of the workload that it is in traditional CG.

“Right now, you do give up some level of quality, but when you’re doing something that’s stylized, cel-shaded, cartoonish, or that doesn’t need to be photo-realistic, it’s a great tool and a cost-effective one.

We’re going to be able to radically alter the cost structures of producing CG.

“Where we’re heading though, from a production standpoint, is being able to create a seamless production workflow, where you build the virtual set ahead of time; go to your greenscreen and motion capture shoot; and have realtime rendering of your characters, with lighting, within the virtual environment, shot by a professional DP, right there on-set. You can then send shots straight from the set to Editorial, and figure out exactly what you need to focus on for additional production — which can create incredible efficiencies.

“In relation to ZEUS, right now with [ABC’s sci-fi series] V, we’re able to composite greenscreen actors in realtime onto CG back plates that are coming straight out of the camera source. We’re getting all the camera and tracking data and compositing in realtime, right there. Now if you combine that with CG characters that can be rendered realtime, in-engine, you can then have live-action actors on greenscreen and CG characters fully lit, interacting, and rendered all in realtime.
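Stripped to its core, that on-set composite is a chroma key followed by an “over” operation, run against a CG plate once per frame. The sketch below is a deliberately naive illustration of the idea in Python with NumPy; it is not Zoic’s pipeline, and a production keyer would produce soft, spill-suppressed mattes rather than a hard binary one.

```python
import numpy as np

def chroma_key_composite(fg, bg, green_dominance=1.3):
    """Naive greenscreen composite.

    fg: greenscreen camera frame; bg: CG backdrop. Both are float
    RGB arrays of shape (H, W, 3) with values in [0, 1].
    """
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # Treat a pixel as "screen" when green strongly dominates red and blue.
    screen = g > green_dominance * np.maximum(r, b)
    alpha = np.where(screen, 0.0, 1.0)[..., None]
    # Standard "over" operation: keyed actor over the CG plate.
    return fg * alpha + bg * (1.0 - alpha)
```

Running something like this per frame, against CG plates that follow the tracked camera, is what a realtime on-set composite amounts to.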

“People have been talking about realtime VFX for the last 15 years, but now it’s something you’re seeing actually happening. With V we have a really good opportunity. We’re providing realtime solutions in ways that haven’t been done before.

“Now, there’s been a threshold to producing full-CG episodic television. There has been a lot of interest in finding a solution to generate stylized, high-quality CG that can be produced inexpensively, or at least efficiently. A process that allows someone to kick out 22 minutes of scripted full-CG footage within a few weeks of production is very difficult to do right now, within budgetary realities.

“But with in-engine realtime productions, we can get a majority of our footage while we’re actually shooting the performance capture. This is where it gets really exciting, opening an entire new production workflow, and where I see the future of full CG productions.”

What game-based engines has Zoic used for realtime rendering?

“We’ve done several productions using the Unreal 3 engine. We’ve done productions with the Killzone 2 engine as well. We’re testing out different proprietary systems, including StudioGPU’s MachStudio Pro, which is being created specifically with this type of work in mind.

“If you’re doing a car spot, you can come in here and say ‘okay, I want to see the new Dodge driving through the salt flats.’ We take your car model and transfer it into an engine, in an environment that’s lit and realtime rendered, within a day. We even hand you a camera that a professional DP can actually shoot with on-site here, and you can produce final-quality footage within a couple of days. It’s pretty cool.”

How has the rise of realtime engines in professional production been influenced by the rise of amateur Machinima?

“I’ve been doing game trailers since 2000, and I’ve been working with studios to design toolsets for in-game capture since then as well. What happened was, you had a mixture of very apt and adept gamers who could go in and break code, or would use, say, the Unreal 2 engine, to create their own content. Very cool, very exciting.

“Concurrently, you had companies like Electronic Arts, Epic, and other game studios and publishers increasing the value of their product by creating toolsets that let you capture and produce quality gameplay: spline-based marketing cameras, where you can adjust lighting and cameras on the fly. This provided a foundation of toolsets and production flow that has evolved into today’s in-engine solutions.”
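The spline-based cameras Sternlicht describes come down to interpolating a smooth path through keyframed positions, re-evaluated every frame so that moving one key immediately reshapes the whole move. A minimal Catmull-Rom sketch follows; the function and key values are illustrative, not any engine’s actual API.

```python
import numpy as np

def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate a Catmull-Rom spline segment between p1 and p2 at t in [0, 1]."""
    t2, t3 = t * t, t * t * t
    return 0.5 * ((2.0 * p1)
                  + (-p0 + p2) * t
                  + (2.0 * p0 - 5.0 * p1 + 4.0 * p2 - p3) * t2
                  + (-p0 + 3.0 * p1 - 3.0 * p2 + p3) * t3)

# Four keyframed camera positions; the segment interpolates the middle
# two keys, while the outer keys shape the tangents at each end.
keys = [np.array(p, dtype=float)
        for p in [(0, 2, 10), (4, 2, 6), (8, 3, 2), (12, 5, 0)]]
# One camera position per frame for a 60-frame move between key 1 and key 2.
path = [catmull_rom(*keys, t) for t in np.linspace(0.0, 1.0, 60)]
```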

It’s truly remarkable how the quality level is going up in realtime engines, and where it’s going to be in the future.

How has this affected traditional producers of high-end software?

“It hasn’t really yet. There’s still a gap in quality. We can’t get the quality of a mental ray or RenderMan render out of a game engine right now.

“But the process is not just about realtime rendering; it’s also about realtime workflow. For example, if we’re doing an Unreal 3 production, we may not be rendering in realtime. We’ll be using the engine to render, but instead of 30 or 60 frames a second, we may render one frame every 25 seconds, because we’re using all the CPU power to render out that high-quality image. That said, the workflow is fully realtime: we’re able to adjust lighting, shading, camera animation, tessellation, displacement maps, all realtime, in-engine, even though the final product may be rendering out at a non-realtime rate.
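To put that rate in perspective (back-of-the-envelope arithmetic, not a figure from the interview): at one frame every 25 seconds, a 30-second spot at 30 fps is 900 frames, or roughly 6.25 machine-hours of rendering. The realtime part is the iteration on lighting, cameras, and shading, not the final write-out.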

“Some of these engines, like StudioGPU’s MachStudio Pro, are rendering out passes. We actually get a frame-buffered pass system out of the engine, so we can do secondary composites.
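Pass-based output means the engine writes each lighting component to its own frame buffer, so a compositor can rebalance the image downstream without a re-render. Below is a minimal sketch of the recombination step; the pass names and weights are invented for illustration, not MachStudio Pro’s actual output.

```python
import numpy as np

def recombine_passes(passes, weights=None):
    """Rebuild a beauty image from separately rendered passes.

    passes: dict mapping pass names ('diffuse', 'specular', ...) to
    float RGB arrays of identical shape. weights: optional per-pass
    multipliers a compositor can tweak without re-rendering.
    """
    weights = weights or {}
    beauty = np.zeros_like(next(iter(passes.values())))
    for name, image in passes.items():
        beauty += weights.get(name, 1.0) * image  # additive recombination
    return beauty

# e.g. pull the specular contribution down 40% in the composite,
# with no re-render needed:
# beauty = recombine_passes(renders, weights={"specular": 0.6})
```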

“With the rise of GPU technology, it’s truly remarkable how the quality level is going up in realtime engines, and where it’s going to be in the future. Artists, rather than waiting on renders to figure out how their dynamic lighting is working, or how their subsurface scattering is working, will dial that in, in realtime, make adjustments, and never actually have to render to review. It’s really remarkable.”

So how many years until the new kids in VFX production don’t even know what “render time” means?

“I think we’re talking about the next five years. Obviously there will be issues of how far we can push this and push that; and we’re always going to come up with something that will add one more layer to the complexity of any given scene. That said, yes, we’re going to be able to radically alter the cost structures of producing CG, and very much allow it to be much more artist-driven. I think in the next five years… It’s all going to change.”

Read Zoic Studios’ ZEUS: A VFX Pipeline for the 21st Century.