Planes, Trains & Automatic Weapons: Zoic Provides Explosive VFX for FOX’s Human Target

The passenger plane from Human Target

Based loosely on the DC comic series of the same name, Human Target is an action-drama starring Mark Valley (Boston Legal) as security expert Christopher Chance, with Chi McBride (Boston Public) and Jackie Earle Haley (Watchmen). It airs Wednesdays at 8pm on FOX.

Zoic Studios provided a number of visual effects shots for the series, including for the pilot episode. Zoic creative director Andrew Orloff discusses the studio’s work on Human Target.

“The question is, how do you do a super-sized action movie every week?” Orloff asks. The answer? Invisible effects, stunt enhancement, special effects and pyro enhancement. “There are all kinds of things, from a bullet train, to a HALO jump, to a large passenger airplane flying upside down in a storm, to a fight on a gondola suspended above a ravine. There are a lot of explosions – exploding boats, exploding trains, exploding buildings, and large set pieces.”

The largest set piece Zoic handled was for the pilot episode, which takes place almost entirely on a bullet train. Since America doesn’t have bullet trains, the team had to create the train station and landing; both the 3D train and the landing were designed and created at Zoic.

“When [the characters] get on the train, what they are really stepping into is a greenscreen with a hole in it,” Orloff explains. “Then when they are on the train, the outside we see through the windows is a plate, which we shot via helicopter.

“We flew out to central California from Van Nuys airport and flew at low altitude over the train tracks, making multiple passes going forward and back, and side to side. We used those helicopter plates to build the exteriors seen through the windows inside the train.

“We also shot a ton of aerial establishing shots, which was a fun thing to do. We planned out the helicopter day by going on Google Earth and identifying where all the train tracks are. It was supposed to be a bullet train from San Francisco to Los Angeles, so we were looking at the tracks around San Luis Obispo, and the ones a little more inland towards Tehachapi. We plotted out the course, and got our passes. Those were tracked in 3D, and the 3D train was put in on top of them.”

Another episode features a scene in which a CG passenger jet, flying through a storm, flips all the way over and then back again. “We couldn’t use any existing model of a passenger jet for legal reasons,” Orloff says. “So we had to take an existing jet model, modify it, and change up the existing engine configuration so it was more generic.

“We did a dozen shots of the plane at night, in clouds, with rain and lightning strikes, flipping over and right side up again, with smoke trailing back from it.” The production built a full-sized cockpit mock-up on a greenscreen stage, which could be rotated 360 degrees and turned upside down. This greenscreen footage was integrated into the CG airplane shots.

One episode portrayed a HALO jump. “It was interesting and challenging – we shot the main character on greenscreen, and added a whole aerial background, where we see clouds behind him. We enhanced the wind blowing in his face, and created a CG parachute that opens up and floats to the ground.”

Other VFX for Human Target are less spectacular, but just as important to creating the world of the show. In his review of the pilot episode, USA Today reviewer Robert Bianco wrote that the “confined-spaces fight on the train is a miniature marvel of its kind.” Orloff says there have been several confined gunfights on the show, and that it’s not safe to shoot with blanks in such tight quarters. As a result, Zoic creates and enhances muzzle flashes for the gunfight scenes, even for an underwater gunfight.

There were also a lot of set extensions. “There’s a big show where they escape from a building by climbing around in the ventilation and elevator shafts,” Orloff says, “and those were all shot on small set pieces, with greenscreen work extending the ventilation shafts up and down this 50-story building. The elevator shaft was a set with two floors of elevator; we extended it, and the characters were zip-lining down the elevator cables.”

There are many wire and rig removals, and other stunt enhancements, “like when they’re coming down the zipline in the elevator shaft. They’re using a homemade rig in the story, but it’s a real rig and we erase that. There’s also a motorcycle jump off these big steps, and there were wires holding the motorcycle upright; and we’re erasing that. They’re fighting on a gondola, and they’re getting knocked over and flying off; there are all kinds of rigs and harnesses keeping the actors from falling off the gondola, that we erase.

“We did an episode where we blew up a building. We used pyro and glass elements that were shot on our soundstage, along with special effects elements, to create the CG fire. We do miniature shoots sometimes: do a small explosion and comp it into a larger piece. In the pilot, we blew up the wall of an office building. We shot that with no explosion; then, on a separate day, we built a small quarter-scale version of that set and blew it up.

“It’s a really interesting show; it’s a variety of challenges. It’s a different thing every week. It’s all based on real world phenomena, and it’s important to the show that this exists in the real world. We did a shot where there’s a DC Metro station. It was shot in Vancouver in a hotel lobby, and they greenscreened one side; we made a subway tunnel on that side, and brought a CG train into it. It’s a lot of stuff like that — expanding the scope of Chance’s world, bringing him to different environments and helping with these various moving action set pieces.

“You have these really cool shots you’d expect in a feature film. In the pilot there’s a shot from outside the train car, where they’re running from car to car to car and you’re seeing it through the windows. And there is actually no train – all that stuff is put in. When they go through a tunnel, there’s no tunnel. We’re doing all that.

“It’s a fun show. There’s a lot of work that might go unnoticed, but it really contributes to the believability and the scope of what they’re trying to accomplish.”

More info: Human Target official website; “Give ‘Human Target’ a shot, and it could just be a bull’s-eye” on USAToday.

Zoic Brings Photo-real CG to Broadcast TV with ESPN NASCAR “Dominoes”

Originally published on IDesignYourEyes on 2/2/2010.

ESPN NASCAR "Dominoes" spot

To the opening riffs of Metallica’s “Master of Puppets,” two NASCAR drivers jostle for position at the front of the pack. One cuts off the other by the wall, and the rear car speeds up, smashing into the front car. As the front car drifts from the wall, the rear car makes its move, attempting an aggressive pass on the right. But it’s no good – he sideswipes the front car and spins out. He’s slammed by another car and flips high into the air, triggering a massive pile-up. And straight through the smoke and chaos of the pileup – a third driver makes his move and takes the lead. “It’s anybody’s race.”

The 30-second spot for ESPN (see it here), promoting the NASCAR Nationwide series, was created by advertising agency Wieden+Kennedy New York and Culver City, California’s Zoic Studios. The commercial is significant because, despite its unique and stylized black-and-white look, it appears to have been shot in live action. In fact, it’s entirely CG.

Zoic co-founder Loni Peristere, who directed the spot, talks about why the commercial was created digitally, and how Zoic was able to create the illusion of perfect realism.

“The question from Wieden+Kennedy was, ‘we have a project, two scripts, which take place on the track and would require significant action and stunt work. We’re trying to decide whether we should approach this from a live-action standpoint, or from an animation standpoint.’”

Wieden+Kennedy insisted the final product be photo-realistic; the agency did not want a commercial that looked like a video game.

But Wieden+Kennedy was insistent that the final product must appear perfectly photo-realistic. Peristere says the agency did not want a commercial that looked like a video game. “It was really important to them that it had the energy, grit and testosterone of the track. They were not interested in making a spot that didn’t have the reality of NASCAR.”

The agency was well aware how far CG realism has recently progressed. “Even in the last 12 months it has come a long way,” Peristere says. “With the advent of motion pictures like Avatar or The Curious Case of Benjamin Button, we are seeing the potential for photo-real characters, photo-real environments, and photo-real action. But could we actually achieve that for a commercial, and could we afford it? What would the timeline be?

“We got boards for both spots, and it became readily apparent why they were even asking this question – they had a 40-car pileup in the middle of the first spot, and a pretty significant crash in the second. Now when you looked at the second spot, you thought ‘well, from a production standpoint you could probably pull that off’; in fact we’d done something similar for Budweiser the year before. But the 40-car pileup featured just an enormous amount of damage to an enormous number of vehicles, which from a production standpoint would be very expensive.

“And the ability to control the lighting and the camera and the art direction would be limited in a live action production. You would be fighting against the sun, making you rush through the shots, allowing you limited control over your color palette. And you would have the expense of wrecking an enormous number of vehicles.”

Peristere discussed the project with other principals at Zoic – fellow co-founder Chris Jones, commercial creative director Leslie Ekker, commercial executive producer Erik Press, and CG supervisor Andy Wilkoff. “We thought it would be fun to rise to the challenge,” Peristere says. “We knew the team we had been building over the last several years had the potential to do incredible photo-realistic work. We’d seen large leaps in the realm of photo-real characters. We came back to Wieden+Kennedy and said ‘yes, yes we can.’”

ESPN NASCAR "Dominoes" spot

Deciding to do the spot in CG led to the first question – should the drivers’ faces be represented in the spot? Human characters are the most difficult thing to create realistically in CG. “From a directorial standpoint,” Peristere says, “I felt it was absolutely essential to see the drivers, to understand who they were, and to know what their motivations were so we had a personal connection to the race. I had the ever-present voice of [Buffy the Vampire Slayer and Firefly series creator] Joss Whedon in my head, who says ‘it’s all about the story; it’s all about the people.’

“We enlisted the help of some incredibly talented artists, including Brad Hayes, Brian White, and Michael Cliett.” Hayes and White had worked at Digital Domain on Benjamin Button and more recently on Tron Legacy, and had been a part of the development of a character-based VFX pipeline.

The technique used for “Dominoes” involved projecting the actual NASCAR drivers’ faces onto CG characters, allowing Peristere complete control over movement and lighting while still getting full, photo-realistic facial performances.

“Andy [Wilkoff] and I went to the very last race at Daytona, and after race day we met with the eight stars of our two commercials. We ran them through some technical setups, which involved a three-camera shoot against a greenscreen. I directed them through a series of emotions and actions that related to the story we were telling. We then took those performances back to Zoic, made editorial selects based on those performances, and gave them to Brad and Andy and the smart people to make something cool with.”

Dmitri Gueer, founder and senior editor of Zoic Editorial, was involved in the “Dominoes” spot from the pre-viz stage through the final product. He describes the editorial process as “non-stop,” and uses the facial performances as an example of Editorial’s involvement at each step.

“The pre-viz had the drivers, but we didn’t see their faces,” Gueer explains. “So the drivers were just a placeholder in the cut. When we later got the driver plates, we started picking the selects and placing them in the cut. Since the pre-viz already existed, you needed to find takes that worked for the placeholders.

“When you have the drivers’ faces mapped in the shots, it becomes apparent when we need to give them a little bit more time, or take a little time from them, because something’s not working out; and once you have a set of almost-final shots, the edit takes on a different spin. You need to pick the sweetest spots in the shots; you need to reestablish the pacing; you need to make sure there’s continuity from shot to shot; and that the edit comes together not just as a story, but also that it gels with the music and is captivating to watch.”

“We had the added complexity of a 40-car pileup,” Peristere says, “which involved extensive damage to CG vehicles, but which had to happen organically. That was hand-developed and designed by Brian White, another Digital Domain veteran with an intimate knowledge of physics and kinetics, who was able to use both animation-by-hand and procedural techniques to bring these cars into collision. You’ll see that every vehicle reacts and behaves just as a real car would as it impacts. When we have our big moment where we t-bone the hero car, you actually see it break where it should break, and that’s because Brian White made it so.”

I was looking to invoke the German Expressionist period, so I wanted these incredibly long shadows, with crushed blacks.

The spot also required an enormous smoke simulation. “Whenever these cars spin they generate tons of smoke. We worked closely with Zoic Vancouver, and a number of technical directors up in that office who specialize in smoke; they did the phenomenal nuclear explosion scene in the forthcoming movie The Crazies, for which they developed a lot of the pipeline we used here — which involves Maya fluid dynamics, along with some techniques in RealFlow 4 — so they could generate authentic smoke elements that give the illusion and sense of a full-scale car accident on a NASCAR track.

ESPN NASCAR "Dominoes" spot

“Kevin Struckman, Mike Rhone, and Trevor Adams all put in an incredible number of hours to make these smoke simulations incredibly spectacular, concluding with the hero car penetrating the giant smoke cloud, creating those beautiful little vortices that you see. That’s something that’s pretty tricky in a fluid simulation, and they were able to do a really nice job with that.”

In order for the spot to come together organically, there was an immense amount of compositing. “We brought in real smoke, spark, and pyro elements to underline the CG elements. Also, every single one of the 27 shots in this 30-second spot had hundreds of passes – lighting, reflections, highlights, lens flares, vignettes, grain – all of this stuff that had to be added as a secondary layer.”
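
To give a rough sense of what that layering looks like in practice, here is a minimal Nuke Python sketch that stacks a few hypothetical render passes over a beauty plate. The file paths and pass names are placeholders, and the real shots involved far more layers and hand-tuned work than this.

```python
# Hypothetical sketch of pass-stacking in Nuke's Python API (node names and
# file paths are placeholders, not Zoic's actual setup).
import nuke

beauty = nuke.nodes.Read(file="renders/shot010_beauty.%04d.exr")

# Additive passes such as reflections, highlights and lens flares are
# typically merged with a "plus" operation on top of the beauty render.
comp = beauty
for pass_name in ("reflection", "highlight", "flare"):
    layer = nuke.nodes.Read(file="renders/shot010_%s.%%04d.exr" % pass_name)
    merge = nuke.nodes.Merge2(operation="plus")
    merge.setInput(0, comp)   # B input: the composite so far
    merge.setInput(1, layer)  # A input: the additive pass
    comp = merge

# Secondary treatments: a vignette (multiplied) and grain on top.
vignette = nuke.nodes.Read(file="elements/vignette.%04d.exr")
mult = nuke.nodes.Merge2(operation="multiply")
mult.setInput(0, comp)
mult.setInput(1, vignette)

grain = nuke.nodes.Grain()
grain.setInput(0, mult)

out = nuke.nodes.Write(file="comp/shot010_comp.%04d.exr")
out.setInput(0, grain)
```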

The spot was rendered in full color, but the end product was always intended to be in a highly-stylized black-and-white. “That was a choice we made with Wieden+Kennedy, to create a style, a more graphic look. For me it was heading towards the films Alfred Hitchcock made in the 40s and 50s, and looking back even further to F.W. Murnau and Sunrise, and Fritz Lang and Metropolis. I was looking to invoke the German Expressionist period, so I wanted these incredibly long shadows, with crushed blacks. You’ll see a low sun – I call that the Ridley Scott sun, because Ridley Scott shoots at the magic hour all the time, and we wanted to put that in every shot. You’ll see these incredibly long film-noir shadows with bright brights, and black blacks.

Concept art for the ESPN NASCAR "Dominoes" spot

“Then we wanted to include the branding of Nationwide; so we applied the Nationwide presence as a design element. We had an illustrator, Eytan Zana, who did a phenomenal job setting the tone and palette.” Zana worked with Wieden+Kennedy, and with Derich Wittliff and Darrin Isono of Zoic’s design department, applying the Nationwide Pantone color to the stickers, the cars, and the track.

Peristere says, “I think overall, this black, white and blue we put together in the compositing really lends an original look to this spot that’s unlike anything we’ve seen before.”

Zoic VFX supervisor Steve Meyer handled the final finish, color grading and color treatment. “We wanted to have sort of a Raging Bull kind of look, high contrast black-and-white. So the compositors left things a little bit more on the flat side to give range; and then I took that, got the style Loni [Peristere] was looking for, and added some of those little nuances like the road rumble, the extra shake when something flies by camera, that kind of overall stuff.

“It’s a stylized look that you could attribute to real photography. I’ve been in the business for a bit, and it blows me away when I see it. Wow, that’s frickin’ all CG? It’s a very impressive spot. I was glad to be a part of it, because I think it’s going to have some legs.”

In the end, it was up to editor Gueer to assemble the finished shots into the final product. “It was a non-stop editorial process, from the beginning when Loni was assembling the story, to the time when we had all the final shots on the Flame. One of the things Steve [Meyer] did was add camera shakes to the shots, which made them look much better; but it changes the nature of what you’re seeing, even the slightest shake. You go well, wouldn’t it be better if we cut a few frames from this, or extended it by a few frames? When we had the final shots on the Flame, we literally did editorial on the Flame, making it better and better and tighter and tighter.”

“With this giant team of 40 some-odd people who worked on this spot, it’s certainly one of Zoic’s finest hours,” Peristere says, “and we’re incredibly proud to have put it together.”

People look at this spot and say “where did you guys shoot this?” Well, we didn’t shoot it!

Press is thankful to Wieden+Kennedy for trusting Zoic with the production of such an innovative and risk-taking spot. “They had faith in us and patience with us, and that was really great, because it really took that to produce this spot. It was a great experience on both sides. They gave us a lot of creative freedom, to really bring out the best in us. We pushed ourselves really hard on the level of realism and the level of detail.

“I mean this kind of work, this animation, the quality level, is something very new for broadcast,” he says. “The extent to which we have gone to produce this spot in a visual style, in CG animation, has really never been done before. It’s a full 100% photo-real CG spot.

“NASCAR is very concerned about representing their world accurately, which was a big challenge for all of us, both from an agency side and a production side. Down to the decals on the cars, and the physics of the accidents, what would really get damaged and what wouldn’t, where would skid marks be made on the track… So people look at this spot and say ‘where did you guys shoot this?’ Well, we didn’t shoot it!

“The music was Metallica – my understanding is they’ve never licensed their music for broadcast commercials before. That was exciting from the get go — definitely a driving force creatively, no pun intended, the kind of energy that brings to the spot.”

Press says the spot has exceeded everyone’s expectations. “We’ve seen that response all the way around, from the agency, from our colleagues in the advertising world, and from ourselves as well – it’s really some of our best work. We’ve really set the bar anew; there’s a new target for us now, which is fantastic.”

More info: ESPN NASCAR “Dominoes” on Zoic Studios; Wieden+Kennedy.

Zoic Studios’ ZEUS: A VFX Pipeline for the 21st Century

Originally published by IDesignYourEyes on 1/7/2010.

Actors Christopher Shyer and Morena Baccarin on the greenscreen set of ABC’s V; the virtual set is overlaid.

Visual effects professionals refer to the chain of processes and technologies used to produce an effects shot as a “pipeline,” a term borrowed both from traditional manufacturing and from computer architecture.

In the past year, Zoic Studios has developed a unique pipeline product called ZEUS. The showiest of ZEUS’ capabilities is to allow filmmakers on a greenscreen set to view the real-time rendered virtual set during shooting; but ZEUS does far more than that.

Zoic Studios pipeline supervisor Mike Romey explains that the pipeline that would become ZEUS was originally developed for the ABC science fiction series V. “We realized working on the pilot that we needed to create a huge number of virtual sets. [Read this for a discussion of the program’s VFX and its virtual sets.] That led us to try to find different components we could assemble and bind together, that could give us a pipeline that would let us successfully manage the volume of virtual set work we were doing for V. And, while ZEUS is a pipeline that was built to support virtual sets for V, it also fulfills the needs of our studio at large, for every aspect of production.

“One of its components is the Lightcraft virtual set tracking system, which is itself a pipeline of different components. These include InterSense motion tracking, specialized NVIDIA graphics cards for video I/O, and custom inertial sensors that provide rotational data from the camera.

“Out of the box, we liked the Lightcraft product the most. We proceeded to build a pipeline around it that could support it.

“Our studio uses a program called Shotgun, a general-purpose database system geared for project shot management, and we were able to tailor it to support the virtual set tracking technology. By coming up with custom tools, we were able to take the on-set data, use Shotgun as a means to manage it, then lean on Shotgun to retrieve the data for custom tools throughout our pipeline. When an artist needed to set up or lay out a scene, we built tools to query Shotgun for the current plate, the current composite that was done on set, the current asset, and the current tracking data; and align them all to the timecode based on editorial selects. Shotgun was where the data was all stored, but we used Autodesk Maya as the conduit for the 3D data – we were then able to make custom tools that transport all the layout scenes from Maya to The Foundry’s Nuke compositing software.”
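
As a rough illustration of that kind of Shotgun-driven lookup, the sketch below uses the shotgun_api3 Python package to pull a shot’s plate, on-set composite and tracking data before handing them to a layout tool. The entity fields (sg_plate_path and so on) are assumptions for illustration, not Zoic’s actual schema.

```python
# Minimal sketch of querying Shotgun for per-shot pipeline data.
# Field names (sg_plate_path, sg_tracking_data, ...) are hypothetical;
# a real studio schema would define its own custom fields.
import shotgun_api3

sg = shotgun_api3.Shotgun(
    "https://yourstudio.shotgunstudio.com",
    script_name="zeus_layout",   # placeholder script user
    api_key="xxxx",              # placeholder key
)

def fetch_shot_context(shot_code, project_id):
    """Return the plate, on-set composite and tracking data for a shot."""
    return sg.find_one(
        "Shot",
        filters=[
            ["project", "is", {"type": "Project", "id": project_id}],
            ["code", "is", shot_code],
        ],
        fields=[
            "sg_plate_path",       # current plate
            "sg_onset_comp_path",  # composite recorded on set
            "sg_tracking_data",    # camera tracking file (e.g. COLLADA)
            "sg_cut_in",
            "sg_cut_out",
        ],
    )

ctx = fetch_shot_context("vfx_101_020", project_id=42)
if ctx:
    print(ctx["sg_plate_path"], ctx["sg_cut_in"], ctx["sg_cut_out"])
```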

By offloading a lot of the 3D production onto 2D, we were able to cut the cost-per-shot.

Romey explains the rationale behind creating 3D scenes in Nuke. “When you look at these episodic shows, there’s a large volume of shots that are close-up, and a smaller percentage of establishing shots; so we could use Nuke’s compositing application to actually do our 3D rendering. In Maya we would be rendering a traditional raytrace pipeline; but in Nuke we could render a scanline pipeline, which didn’t have the same overhead. Also, this would give the compositing team immediate access to the tools they need to composite the shot faster, and it let them be responsible for a lot of the close-up shots. Then our 3D team would be responsible for the establishing shots, which we knew fell outside the quality constraints of a scanline render.
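
A minimal sketch of that Nuke-side setup, assuming a tracked camera and a piece of virtual-set geometry, might look like the following; node class names can vary slightly between Nuke versions, and the paths are placeholders.

```python
# Sketch of a Nuke 3D scene rendered with ScanlineRender instead of a
# Maya raytrace. Paths are placeholders; node class names (Camera2,
# ReadGeo2) can vary between Nuke versions.
import nuke

# Tracked camera exported from the on-set data.
cam = nuke.nodes.Camera2(file="tracking/shot010_camera.fbx", read_from_file=True)

# The virtual-set geometry.
geo = nuke.nodes.ReadGeo2(file="assets/virtual_set_interior.obj")

scene = nuke.nodes.Scene()
scene.setInput(0, geo)

# ScanlineRender inputs: 0 = background, 1 = scene/geometry, 2 = camera.
render = nuke.nodes.ScanlineRender()
render.setInput(1, scene)
render.setInput(2, cam)

# Composite the keyed greenscreen plate over the rendered virtual set.
plate = nuke.nodes.Read(file="plates/shot010_key.%04d.exr")
merge = nuke.nodes.Merge2(operation="over")
merge.setInput(0, render)  # B: rendered virtual set
merge.setInput(1, plate)   # A: keyed foreground
```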

“By offloading a lot of the 3D production onto 2D, we were able to cut the cost-per-shot, because we didn’t have to provide the 3D support necessary. That’s how the ZEUS pipeline evolved, with that premise – how do we meet our client’s costs and exceed their visual expectations, without breaking the bank? Throughout the ZEUS pipeline, with everything that we did, we tried to find methodologies that would shave off time, increase quality, and return a better product to the client.

“One of the avenues we R&Ded to cut costs was the I/O time. We found that we were doing many shots that required multiple plates. A new component we looked at was a product that had just been released, called Ki Pro from AJA.

“When I heard about this product, I immediately contacted AJA and explained our pipeline. We have a lot of on-set data – we have the tracking data being acquired, the greenscreen, a composite, and the potential for the key being acquired. The problem was that when we went back to production, the I/O time associated with managing all the different plates became astronomical.

“Instead of running a Panasonic D5 deck to record the footage, we could use the Ki Pro, which is essentially a tapeless deck, on-set to record directly to Apple ProRes codecs. The units were cost effective – they were about $4,000 per unit – so we could set up multiple units on stage, and trigger them to record, sync and build plates that all were the exact same length, which directly corresponded to our tracking data.”
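
The article doesn’t describe the Ki Pro control protocol, so the sketch below stays device-agnostic: it simply shows the kind of sanity check that workflow implies, confirming that every unit’s plate has the same length and start timecode as the tracking data. The PlateInfo records are assumed to come from whatever tool ingests the ProRes files.

```python
# Device-agnostic sketch: verify that the plates recorded by each on-set
# deck line up frame-for-frame with the camera tracking data. Nothing here
# talks to the actual Ki Pro hardware; the records are assumed inputs.
from dataclasses import dataclass

@dataclass
class PlateInfo:
    source: str          # e.g. "kipro_greenscreen", "tracking_data"
    start_timecode: str  # "HH:MM:SS:FF"
    frame_count: int

def plates_are_in_sync(plates):
    """True if every plate starts on the same timecode and has the same length."""
    reference = plates[0]
    for plate in plates[1:]:
        if plate.start_timecode != reference.start_timecode:
            print(f"{plate.source}: start TC {plate.start_timecode} "
                  f"!= {reference.start_timecode}")
            return False
        if plate.frame_count != reference.frame_count:
            print(f"{plate.source}: {plate.frame_count} frames "
                  f"!= {reference.frame_count}")
            return False
    return True

plates = [
    PlateInfo("kipro_greenscreen", "01:02:03:00", 1440),
    PlateInfo("kipro_onset_comp", "01:02:03:00", 1440),
    PlateInfo("tracking_data", "01:02:03:00", 1440),
]
print(plates_are_in_sync(plates))  # -> True
```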

We found methodologies that would shave off time, increase quality, and return a better product to the client.

Previously, the timecode would be lost when Editorial made their selects, and would have to be reestablished. “That became a very problematic process, which would take human intervention to do — there was a lot of possibility for human error. By introducing multiple Ki Pros into the pipeline, we could record each plate, and take that back home, make sure the layout was working, and then wait for the editorial select.” The timecode from the set was preserved.

“The ZEUS pipeline is really about a relationship of image sequence to timecode. Any time that relationship is broken, or becomes more convoluted or complicated to reestablish, it introduces more human error. By relieving the process of human error, we’re able to control our costs. We can offer this pipeline to clients who need the Apple ProRes 422 codec, and at the end of the day we can take the line item of I/O time and costs, and dramatically reduce it.”
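
To make that image-sequence-to-timecode relationship concrete, here is a small sketch that maps an editorial select’s timecodes back to frame offsets within the recorded plate, assuming non-drop-frame timecode at a fixed rate. It is illustrative arithmetic only, not a piece of the ZEUS pipeline.

```python
# Sketch: map an editorial select back to frames in the recorded plate and
# its tracking data, assuming non-drop-frame timecode at a fixed rate.
FPS = 24

def timecode_to_frames(tc, fps=FPS):
    """'HH:MM:SS:FF' -> absolute frame number."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def select_to_plate_range(plate_start_tc, select_in_tc, select_out_tc, fps=FPS):
    """Return (first_frame, last_frame) of the select within the plate."""
    plate_start = timecode_to_frames(plate_start_tc, fps)
    return (timecode_to_frames(select_in_tc, fps) - plate_start,
            timecode_to_frames(select_out_tc, fps) - plate_start)

# The same frame range can then be used to trim the camera tracking data,
# so layout and comp always see matching frames.
print(select_to_plate_range("01:02:03:00", "01:02:05:12", "01:02:09:00"))
# -> (60, 144) at 24 fps
```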

Another important component is Python, the general-purpose high-level programming language. “Our pipeline is growing faster than we can train people to use it. The reason we were able to build the ZEUS pipeline the way we have, and build it out within a month’s time, is because we opted to use tools like Python. It has given us the ability to quickly and iteratively develop tools that respond proactively to production.

“One case in point – when we first started working with the tracking data for V, we quickly realized it didn’t meet our needs. We were using open source formats such as COLLADA, which are XML scene files that store the timecode. We needed custom tools to trim, refine and ingest the COLLADA data into our Shotgun database, into the Maya cameras, and into the Nuke preferences and Nuke scenes. Python gave us the ability to do that. It’s the glue that binds our studio.
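
Since COLLADA is plain XML, a stripped-down version of that ingest step can be sketched with the Python standard library alone. The assumption that the animation sources hold the time and transform samples worth trimming is illustrative; a production tool would inspect the channel targets and do far more validation.

```python
# Rough sketch of pulling camera tracking samples out of a COLLADA (.dae)
# file. Which sources hold time vs. transform data is an assumption here;
# a real ingest tool would follow the animation channel targets.
import xml.etree.ElementTree as ET

NS = {"c": "http://www.collada.org/2005/11/COLLADASchema"}

def read_animation_samples(dae_path):
    """Return a dict of source-id -> list of floats from library_animations."""
    root = ET.parse(dae_path).getroot()
    samples = {}
    lib = root.find("c:library_animations", NS)
    if lib is None:
        return samples
    for source in lib.iter("{%s}source" % NS["c"]):
        floats = source.find("c:float_array", NS)
        if floats is not None and floats.text:
            samples[source.get("id")] = [float(v) for v in floats.text.split()]
    return samples

def trim(values, first_frame, last_frame):
    """Cut a sample list down to the editorial select before hand-off."""
    return values[first_frame:last_frame + 1]
```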

“While most components in our pipeline are interchangeable, I would argue that Python is the one component that is irreplaceable. The ability to iteratively make changes on the fly during an episode could not have been developed and deployed using other tools. It would not have been as successful, and I think it would have taken a larger development team. We don’t have a year to do production, like Avatar – we have weeks. And we don’t have a team of developers; we have one or two.

While most components in our pipeline are interchangeable, Python is the one component that is irreplaceable.

“We’re kind of new to the pipeline game. We’ve only been doing a large amount of pipeline development for two years. What we’ve done is take some rigid steps to carve out our pipeline in such a way that when we build a tool, it can be shared across the studio.”

Romey expects great things from ZEUS in the future. “We’re currently working on an entire episodic season using ZEUS. We’re working out the kinks. From time to time there are little issues and hiccups, but that’s traditional for developing and growing a pipeline. What we’ve found is that our studio is tackling more advanced technical topics – we’re doing things like motion capture and HDR on-set tracking. We’re making sure that we have a consistent and precise road map of how everything applies in our pipeline.

“With ZEUS, we’ve come up with new ways that motion capture pipelines can work. In the future we’d like to be able to provide our clients with a way not only to be on set and see what the virtual set looks like, while the director is working — but what if the director could be on set with the virtual set, with the actor in the motion capture suit, and see the actual CG character, all in context, in real-time, on stage? Multiple characters! What if we had background characters that were all creatures, and foreground characters that were people, interacting? Quite honestly, given the technology of Lightcraft and our ability to do strong depth-of-field, we could do CG characters close-to-final on stage. I think that’s where we’d like the ZEUS pipeline to go in the future.

“Similar pipelines have been done for other productions. But in my experience, a lot of times they are one-off pipelines. ZEUS is not a pipeline just for one show; it’s a pipeline for our studio.

“It’s cost effective, and we think we can get the price point to meet the needs of all our clients, including clients with smaller budgets, like webisodes. The idea of doing an Avatar-like production for a webisode is a stretch; but if we build our pipeline in such a way that we can support it, we can find new clients and provide them with a better product.

“Our main goal with ZEUS was to find ways to make that kind of pipeline economical, to make it grow and mature. We’ve treated every single component in the pipeline as a dependency that can be interchanged if it doesn’t meet our needs, and we’re willing to do so until we get the results that we need.”

For more info: Lightcraft Technology; InterSense Inc.; Shotgun Software; AJA Video Systems; IDYE’s coverage of V.

The End of Rendering: Zoic Studios’ Aaron Sternlicht on Realtime Engines in VFX Production

Originally published on IDesignYourEyes on 1/6/2010.

Zoic created this Killzone 2 commercial spot entirely within the Killzone 2 engine.

The technology available to produce computer graphics is approaching a new horizon, and video games are part of the equation.

Creators in 3D animation and visual effects are used to lengthy, hardware-intensive render times for the highest quality product. But increasingly, productions are turning to realtime rendering engines, inspired by the video games industry, to aid in on-set production and to create previz animations. Soon, even the final product will be rendered in realtime.

Aaron Sternlicht, Zoic Studios’ Executive Producer of Games, has been producing video game trailers, commercials, and cinematics since the turn of the millennium. He has charted the growth of realtime engines in 3D animation production, and is now part of Zoic’s effort to incorporate realtime into television VFX production, using the studio’s new ZEUS pipeline (read about ZEUS here).

Sternlicht explains how realtime engines are currently used at Zoic, and discusses the future of the technology.

“The majority of what we do for in-engine realtime rendering is for in-game cinematics and commercials. We can take a large amount of the heavy-lifting in CG production, and put it into a game engine. It allows for quick prototyping, and allows us to make rapid changes on-the-fly. We found that changing cameras, scenes, set-ups, even lighting can be a fraction of the workload that it is in traditional CG.

“Right now, you do give up some levels of quality, but when you’re doing something that’s stylized, cel-shaded, cartoonish, or that doesn’t need to be on a photo-realistic level, it’s a great tool and a cost effective one.

We’re going to be able to radically alter the cost structures of producing CG.

“Where we’re heading though, from a production standpoint, is being able to create a seamless production workflow, where you build the virtual set ahead of time; go to your greenscreen and motion capture shoot; and have realtime rendering of your characters, with lighting, within the virtual environment, shot by a professional DP, right there on-set. You can then send shots straight from the set to Editorial, and figure out exactly what you need to focus on for additional production — which can create incredible efficiencies.

“In relation to ZEUS, right now with [ABC’s sci-fi series] V, we’re able to composite greenscreen actors in realtime onto CG back plates that are coming straight out of the camera source. We’re getting all the camera and tracking data and compositing real-time, right there. Now if you combine that with CG characters that can be realtime, in-engine rendered, you then can have live action actors on greenscreen and CG characters fully lit, interacting and rendered all in realtime.
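
The on-set system here is Lightcraft’s hardware pipeline, not a Python script, but as a toy illustration of the underlying idea (key the greenscreen, then place the actor over the CG backplate frame by frame) a sketch with OpenCV might look like this. The HSV green range, thresholds and file paths are placeholder values.

```python
# Toy illustration of per-frame greenscreen compositing with OpenCV.
# This is not Lightcraft's system; thresholds, paths and the HSV green
# range are placeholder values for a rough key.
import cv2
import numpy as np

def composite_frame(fg_bgr, bg_bgr):
    """Key green out of the foreground and place it over the CG backplate."""
    hsv = cv2.cvtColor(fg_bgr, cv2.COLOR_BGR2HSV)
    green_lo = np.array([40, 60, 60])    # rough green range (H, S, V)
    green_hi = np.array([85, 255, 255])
    screen = cv2.inRange(hsv, green_lo, green_hi)   # 255 where greenscreen
    actor = cv2.bitwise_not(screen)                 # 255 where actor
    fg = cv2.bitwise_and(fg_bgr, fg_bgr, mask=actor)
    bg = cv2.bitwise_and(bg_bgr, bg_bgr, mask=screen)
    return cv2.add(fg, bg)

fg = cv2.imread("frames/greenscreen_0001.png")
bg = cv2.imread("frames/cg_backplate_0001.png")
cv2.imwrite("frames/comp_0001.png", composite_frame(fg, bg))
```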

“People have been talking about realtime VFX for the last 15 years, but now it’s something you’re seeing actually happening. With V we have a really good opportunity. We’re providing realtime solutions in ways that haven’t been done before.

“Now there’s been a threshold to producing full CG episodic television. There has been a lot of interest in finding a solution to generate stylized and high quality CG that can be produced inexpensively, or at least efficiently. A process that allows someone to kick out 22 minutes of scripted full CG footage within a few weeks of production is very difficult to do right now, within budgetary realities.

“But with in-engine realtime productions, we can get a majority of our footage while we’re actually shooting the performance capture. This is where it gets really exciting, opening an entire new production workflow, and where I see the future of full CG productions.”

What game-based engines have Zoic used for realtime rendering?

“We’ve done several productions using the Unreal 3 engine. We’ve done productions with the Killzone 2 engine as well. We’re testing out different proprietary systems, including StudioGPU’s MachStudio Pro, which is being created specifically with this type of work in mind.

“If you’re doing a car spot, you can come in here and say ‘okay, I want to see the new Dodge driving through the salt flats.’ We get your car model, transfer that to an engine, in an environment that’s lit and realtime rendered, within a day. We even hand you a camera, that a professional DP can actually shoot with on-site here, and you can produce final-quality footage within a couple of days. It’s pretty cool.”

How has the rise of realtime engines in professional production been influenced by the rise of amateur Machinima?

“I’ve been doing game trailers since 2000. I’ve been working with studios to design toolsets for in-game capture since then as well. What happened was, you had a mixture of the very apt and adept gamers who could go in and break code, or would use say the Unreal 2 engine, to create their own content. Very cool, very exciting.

“Concurrently, you had companies like Electronic Arts, and Epic, and other game studios and publishers increasing the value of their product by creating tool sets to let you capture and produce quality game play — marketing cameras that are spline-based, where you can adjust lighting and cameras on-the-fly. This provided a foundation of toolsets and production flow that has evolved into today’s in-engine solutions.”

It’s truly remarkable how the quality level is going up in realtime engines, and where it’s going to be in the future.

How has this affected traditional producers of high-end software?

“It hasn’t really yet. There’s still a gap in quality. We can’t get the quality of a mental ray or RenderMan render out of a game engine right now.

“But the process is not just about realtime rendering; it’s also about realtime workflow. For example, if we’re doing an Unreal 3 production, we may not be rendering in realtime. We’ll be using the engine to render, but instead of 30 or 60 frames a second, we may render one frame every 25 seconds, because we’re using all the CPU power to render out that high-quality image. That said, the workflow is fully realtime, where we’re able to adjust lighting, shading, camera animation, tessellation, displacement maps — all realtime, in-engine, even though the final product may be rendering out at a non-realtime rate.

“Some of these engines, like Studio GPU, are rendering out passes. We actually get a frame-buffered pass system out of an engine, so we can do secondary composites.

“With the rise of GPU technology, it’s truly remarkable how the quality level is going up in realtime engines, and where it’s going to be in the future. Artists, rather than waiting on renders to figure out how their dynamic lighting is working, or how their subsurface scattering is working, will dial that in, in realtime, make adjustments, and never actually have to render to review. It’s really remarkable.”

So how many years until the new kids in VFX production don’t even know what “render time” means?

“I think we’re talking about the next five years. Obviously there will be issues of how far we can push this and push that; and we’re always going to come up with something that will add one more layer to the complexity of any given scene. That said, yes, we’re going to be able to radically alter the cost structures of producing CG, and allow it to be much more artist-driven. I think in the next five years… it’s all going to change.”

Read Zoic Studios’ ZEUS: A VFX Pipeline for the 21st Century.

Zoic Presents: The Creatures of ‘Fringe’ – Part 2

Originally published on I Design Your Eyes on 12/24/09.

The Lionzard

This is the second part of a two-part interview with Zoic Studios senior compositor Johnathan R. Banta about creatures designed for the Fox sci-fi drama Fringe. Be sure to read part one.

The Lionzard (from episode 1:16, “Unleashed”)

In this first-season episode, anarchists opposed to animal testing ransack a research laboratory, but get more than they bargained for when they unleash a ferocious transgenic creature. Later, Walter faces off against the creature in the sewers.

Banta says, “It was a lion-lizard combination, a chimera of a bunch of different creatures created in a lab. This also went through the ZBrush pipeline. There were no maquettes done for this particular one.

“This was a full-digital creature; luckily it did not interact too tightly with any of the actors. It was rigged up and had a muscle system that allowed for secondary dynamics. The textures and displacement maps were painted locally. There was some post lighting to add extra slime, with everything done inside the composite.

“It was actually very straightforward in its approach. The challenge, of course, was getting it to be lit properly and integrated into the shot. Compositing was a heavy challenge, as there was a lot of haze on the set, and a lot of lens flares – not direct flares, but gradients from different lights and so forth. We did our best to match the color space of the original photography. I think it was very effective.

“Another challenge was the bits of slime; it had to have slobber coming off of it. So we actually shot some practical elements; we did some digital cloth elements, a combination of things.”

The hand emerging from the monitor

The Hand (from episode 1:12, “The No-Brainer”)

A seventeen-year-old is working at his computer and chatting on the phone, when a mysterious computer program executes. Strange images flash before his eyes, and the teen is drawn in, mesmerized. Something protrudes from the middle of the screen and impossibly takes the form of a hand. The unearthly appendage reaches forward without warning and grasps his face.

Banta explains: “This boy spends a little too much time on the computer, and a hand reaches out of the computer, grabs his face, and begins to jostle him around and melt his brain. Which is not unlike my experience as a youth.

“We made a series of maquettes and we photographed them, just different positions of the hand coming out; and we composited them into a couple of shots. At the same time the animation was being worked on in CG, so we could start previsualizing it and then composite it.

“A cloth simulation was used for the screen. The hand was coming out, and we would create several different morph targets based on that cloth simulation. There was a bone rig in there, so we could animate it grabbing the kid’s head. That’s some very effective work, especially when projecting the textures on. The side view of the hand coming out of the monitor is one of my favorite shots.

“What they had on set was a monitor made of plastic, and a greenscreen fabric with a slot in it [where the screen would be] – and they had some poor guy in a greenscreen suit shove his hand through and grab the kid on the head, and the kid wiggled around.

“So we had to paint back and remove the actor, whenever he was touching the kid; otherwise we would use a clean plate. But whenever he was touching the young actor, we would remove that hand and replace it.

“They were also flashing an interactive light on the young actor that was not accurate to what we were rendering. When the hand got close it would actually light up his face, because the hand was illuminated with television images. So we came up with a way of match-moving his animation, and using that to relight his performance. We had to match his animation for the hand to interact with him, but we also used that match move to relight his performance.”

The tentacle parasite

The Tentacle Parasite (from episode 2:09, “Snakehead”)

A wet, shivering man frantically combs the streets of Boston’s Chinatown. Gaining refuge, he suffers incredible stomach pains. His rescuer puts on heavy gloves and uses shears to cut his shirt away. The man’s abdomen is distended and wriggling as something crawls around inside him. A squid-like parasite crawls out of the man’s mouth, and his rescuer retrieves it.

“Recently we did yet another thing coming out of a poor guy’s mouth,” Banta says. “This time it wasn’t just a nice little potato-shaped slug — it was long and tentacled, had sharp bits, and just looked pretty nasty to have shoved down your throat.”

But there was an additional challenge on this effect. “You were seeing the creature moving underneath the actor’s skin; the actor’s shirt was off, and he was wiggling around on the ground as he probably would if this were happening, like a dead fish. He was shifting all over the place, his skin was moving all over the place, and we had to actually take full control of that.

“So we did a match move. We went to our performance transfer system, which essentially takes tracking information from the original plate and assigns it to the match move. There are no specific camera set-ups; it’s just whatever they give us, and we grab every bit of information from the plate that we can, and use that to modify the 3D performances. These were then projected onto animation that we used to distend the belly and so forth, and up into the throat.

“The creature had 18 tentacles. Ray Harryhausen, when he did an octopus, decided to take two of the tentacles off, because then he wouldn’t have to animate them and it would take less time. We didn’t have that luxury. There was no way to procedurally animate these things, and it had to interact with the guy’s face. So we had the exact same challenge we had with the slug coming out of the mouth: we had to take this actor and pull his face apart as well, and make his lips go wider. But this actor was moving a lot more, so the performance transfer and animation tracking was more challenging.

“But I’m very pleased with the results. We used fabric simulations for the different bits of slime again.”

Razor butterflies

Razor Butterflies (from episode 1:09, “The Dreamscape”)

A young executive arrives late to give a presentation. After he has finished and the boardroom empties, he collects his things, and spots a butterfly. It alights on his finger — and unexpectedly cuts him. The insect flutters by his neck — and cuts him again. After attacking a few more times, the creature disappears into an AC vent. The man peers into the vent just as a swarm of butterflies pours out. They surround him, cutting him all over his body — he runs in a mad panic, crashing through a plate glass window and falling to his death.

Banta says, “We tracked every camera in the scene and laid it out into one common environment, so we could reuse any lighting in any point in the scene. That gave us the ability to put the flock of razor-winged butterflies into the appropriate spot.

“A big challenge on its own was volume — controlling and dictating the flocking behavior, so the swarm would follow the actor, intersect with him in the appropriate parts and not intersect in others, and eventually chase him through the window where he would fall to his horrible demise.
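
The actual swarm was driven by production animation tools, but the flocking idea Banta describes (a swarm that keeps loose cohesion and separation while being pulled toward a moving target) can be sketched with a classic boids-style step. All of the weights below are invented for illustration.

```python
# Minimal boids-style flocking step: cohesion, separation, and a pull
# toward a moving target (the actor). All weights are invented for
# illustration; production flocking used far more control than this.
import numpy as np

def flock_step(positions, velocities, target, dt=1.0 / 24):
    """Advance the swarm one frame. positions/velocities: (N, 3) arrays."""
    center = positions.mean(axis=0)
    cohesion = center - positions          # drift toward the flock center
    to_target = target - positions         # chase the actor

    # Separation: push away from the nearest neighbour.
    diffs = positions[:, None, :] - positions[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    np.fill_diagonal(dists, np.inf)        # ignore self-distance
    nearest = dists.argmin(axis=1)
    separation = positions - positions[nearest]

    accel = 0.5 * cohesion + 1.5 * to_target + 2.0 * separation
    velocities = velocities + accel * dt
    # Clamp speed so the butterflies stay fluttery rather than ballistic.
    speed = np.linalg.norm(velocities, axis=1, keepdims=True)
    velocities = velocities / np.maximum(speed, 1e-6) * np.minimum(speed, 4.0)
    return positions + velocities * dt, velocities

rng = np.random.default_rng(0)
pos = rng.uniform(-1, 1, size=(200, 3))
vel = np.zeros_like(pos)
for frame in range(48):                        # two seconds at 24 fps
    actor = np.array([frame * 0.1, 0.0, 0.0])  # the actor running away
    pos, vel = flock_step(pos, vel, actor)
```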

“There was one close-up of a butterfly resting on his finger — it flew into frame and landed, it was brilliant – that was pretty straightforward in its execution. More often than not the hard part was controlling the sheer number of flocking butterflies, especially given our standard turnaround time.”

Banta is thrilled to be creating otherworldly monsters for JJ Abrams’ Fringe. “I like doing these creatures; I hope we get to do more!”

Read Part 1