Zoic Studios’ ZEUS: A VFX Pipeline for the 21st Century

Originally published by IDesignYourEyes on 1/7/2010.

Actors Christopher Shyer and Morena Baccarin on the greenscreen set of ABC’s V; the virtual set is overlaid.

Visual effects professionals refer to the chain of processes and technologies used to produce an effects shot as a “pipeline,” a term borrowed both from traditional manufacturing and from computer architecture.

In the past year, Zoic Studios has developed a unique pipeline product called ZEUS. The showiest of ZEUS’ capabilities is to allow filmmakers on a greenscreen set to view the real-time rendered virtual set during shooting; but ZEUS does far more than that.

Zoic Studios pipeline supervisor Mike Romey explains that the pipeline that would become ZEUS was originally developed for the ABC science fiction series V. “We realized working on the pilot that we needed to create a huge number of virtual sets. [Read this for a discussion of the program’s VFX and its virtual sets.] That led us to try to find different components we could assemble and bind together, that could give us a pipeline that would let us successfully manage the volume of virtual set work we were doing for V. And, while ZEUS is a pipeline that was built to support virtual sets for V, it also fulfills the needs of our studio at large, for every aspect of production.

“One of its components is the Lightcraft virtual set tracking system, which itself is a pipeline of different components. These include InterSense motion tracking, various specialized NVIDIA graphics cards for I/O, and custom inertial sensors that supply rotational data for the camera.

“Out of the box, we liked the Lightcraft product the most. We proceeded to build a pipeline around it that could support it.

“Our studio uses a program called Shotgun, a general-purpose database system geared for project shot management, and we were able to tailor it to support the virtual set tracking technology. By coming up with custom tools, we were able to take the on-set data, use Shotgun as a means to manage it, then lean on Shotgun to retrieve the data for custom tools throughout our pipeline. When an artist needed to set up or lay out a scene, we built tools to query Shotgun for the current plate, the current composite that was done on set, the current asset, and the current tracking data; and align them all to the timecode based on editorial selects. Shotgun was where the data was all stored, but we used Autodesk Maya as the conduit for the 3D data – we were then able to make custom tools that transport all the layout scenes from Maya to The Foundry’s Nuke compositing software.”
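
To give a concrete sense of the glue code Romey describes, here is a minimal, hypothetical sketch of how a layout tool might pull a shot’s on-set data out of Shotgun using the shotgun_api3 Python module. The server URL, script credentials, and custom field names (sg_plate, sg_onset_comp, sg_tracking_data, sg_timecode_in) are illustrative assumptions, not Zoic’s actual schema.

```python
# Hypothetical sketch: query Shotgun for the assets a layout tool needs.
# Field names such as sg_plate and sg_tracking_data are illustrative only;
# a real studio schema will differ.
import shotgun_api3

sg = shotgun_api3.Shotgun(
    "https://studio.shotgunstudio.com",  # assumed server URL
    script_name="zeus_layout",           # assumed script user
    api_key="REPLACE_ME",                # assumed API key
)

def fetch_shot_context(shot_code):
    """Return the current plate, on-set composite, and tracking data for a shot."""
    shot = sg.find_one(
        "Shot",
        [["code", "is", shot_code]],
        ["code", "sg_plate", "sg_onset_comp", "sg_tracking_data", "sg_timecode_in"],
    )
    if shot is None:
        raise ValueError("No Shotgun entry for shot %s" % shot_code)
    return shot

if __name__ == "__main__":
    ctx = fetch_shot_context("V_101_0420")  # made-up shot code
    print(ctx["sg_plate"], ctx["sg_timecode_in"])
```

From a record like this, a layout tool can load the plate, camera, and tracking data together in Maya and hand a matching scene over to Nuke.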

By offloading a lot of the 3D production onto 2D, we were able to cut the cost-per-shot.

Romey explains the rationale behind creating 3D scenes in Nuke. “When you look at these episodic shows, there’s a large volume of shots that are close-up, and a smaller percentage of establishing shots; so we could use Nuke’s compositing application to actually do our 3D rendering. In Maya we would be rendering through a traditional raytrace pipeline, but in Nuke we could render through a scanline pipeline, which didn’t have the same overhead. Also, this gave the compositing team immediate access to the tools they need to composite the shot faster, and it let them be responsible for a lot of the close-up shots. Then our 3D team would be responsible for the establishing shots, which we knew demanded a level of quality the scanline render couldn’t deliver.
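
Nuke exposes this kind of 3D setup through its Python API, so a layout scene handed over from Maya can be reassembled as nodes. The sketch below is a bare-bones illustration rather than Zoic’s actual tool: a camera, a piece of set geometry, and a ScanlineRender node feeding a Write; the file paths are placeholders.

```python
# Minimal sketch of a Nuke 3D scene rendered with the scanline renderer.
# Run inside Nuke's Script Editor; paths are placeholders.
import nuke

cam = nuke.nodes.Camera2(name="shotCam")                    # tracked camera from set
geo = nuke.nodes.ReadGeo2(file="/path/to/virtual_set.obj")  # virtual set geometry
scene = nuke.nodes.Scene()
scene.setInput(0, geo)

render = nuke.nodes.ScanlineRender(name="setRender")
render.setInput(1, scene)                                   # obj/scn input
render.setInput(2, cam)                                     # cam input

write = nuke.nodes.Write(file="/path/to/renders/set.%04d.exr")
write.setInput(0, render)
```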

“By offloading a lot of the 3D production onto 2D, we were able to cut the cost-per-shot, because we didn’t have to provide the 3D support necessary. That’s how the ZEUS pipeline evolved, with that premise – how do we meet our client’s costs and exceed their visual expectations, without breaking the bank? Throughout the ZEUS pipeline, with everything that we did, we tried to find methodologies that would shave off time, increase quality, and return a better product to the client.

“One of the avenues we R&Ded to cut costs was the I/O time. We found that we were doing many shots that required multiple plates. A new component we looked at was a product that had just been released, called Ki Pro from AJA.

“When I heard about this product, I immediately contacted AJA and explained our pipeline. We have a lot of on-set data – the tracking data being acquired, the greenscreen plate, a composite, and potentially the key as well. The problem was that when we went back to production, the I/O time associated with managing all the different plates became astronomical.

“Instead of running a Panasonic D5 deck to record the footage, we could use the Ki Pro, which is essentially a tapeless deck, on-set to record directly to Apple ProRes codecs. The units were cost effective – they were about $4,000 per unit – so we could set up multiple units on stage, and trigger them to record, sync and build plates that all were the exact same length, which directly corresponded to our tracking data.”

We found methodologies that would shave off time, increase quality, and return a better product to the client.

Previously, the timecode would be lost when Editorial made their selects, and would have to be reestablished. “That became a very problematic process that required human intervention, with a lot of room for human error. By introducing multiple Ki Pros into the pipeline, we could record each plate, take it back home, make sure the layout was working, and then wait for the editorial select.” The timecode from the set was preserved.

“The ZEUS pipeline is really about a relationship of image sequence to timecode. Any time that relationship is broken, or becomes more convoluted or complicated to reestablish, it introduces more human error. By removing human error from the process, we’re able to control our costs. We can offer this pipeline to clients who need the Apple ProRes 422 codec, and at the end of the day we can take the line item of I/O time and costs, and dramatically reduce it.”
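
That relationship can be reduced to simple arithmetic: if every plate and its tracking data share a start timecode, an editorial select converts back to plate frames without guesswork. A self-contained illustration of that conversion (the frame rate and timecode values are made up):

```python
# Illustrative timecode arithmetic: map an editorial select back to the
# matching frame range of an on-set plate and its tracking data.
def tc_to_frames(tc, fps=24):
    """Convert an 'HH:MM:SS:FF' non-drop-frame timecode to an absolute frame count."""
    hh, mm, ss, ff = (int(x) for x in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def select_to_plate_range(plate_start_tc, select_in_tc, select_out_tc, fps=24):
    """Return (first_frame, last_frame) of the plate covered by an editorial select."""
    start = tc_to_frames(plate_start_tc, fps)
    return (tc_to_frames(select_in_tc, fps) - start,
            tc_to_frames(select_out_tc, fps) - start)

# Made-up example: plate starts at 14:22:10:00, select runs 14:22:12:05 to 14:22:15:17.
print(select_to_plate_range("14:22:10:00", "14:22:12:05", "14:22:15:17"))  # (53, 137)
```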

Another important component is Python, the general-purpose high-level programming language. “Our pipeline is growing faster than we can train people to use it. The reason we were able to build the ZEUS pipeline the way we have, and build it out within a month’s time, is because we opted to use tools like Python. It has given us the ability to quickly and iteratively develop tools that respond proactively to production.

“One case in point – when we first started working with the tracking data for V, we quickly realized it didn’t meet our needs. We were using open source formats such as COLLADA, which are XML scene files that stored the timecode. We needed custom tools to trim, refine and ingest the COLLADA data into our Shotgun database, into the Maya cameras, into the Nuke preferences and Nuke scenes. Python gave us the ability to do that. It’s the glue that binds our studio.
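
COLLADA is plain XML, so inspecting or trimming a tracking file needs nothing more exotic than Python’s standard library. Below is a rough sketch of walking the animation library in a COLLADA 1.4 document; the exact structure of an on-set tracking export is an assumption here.

```python
# Illustrative COLLADA parsing: list the float arrays stored under the
# document's animation library. A real on-set export may be laid out differently.
import xml.etree.ElementTree as ET

NS = {"c": "http://www.collada.org/2005/11/COLLADASchema"}

def animation_samples(collada_path):
    """Yield (source_id, list_of_floats) for every float_array in library_animations."""
    root = ET.parse(collada_path).getroot()
    for anim in root.findall(".//c:library_animations/c:animation", NS):
        for arr in anim.findall(".//c:float_array", NS):
            values = [float(v) for v in arr.text.split()]
            yield arr.get("id"), values

if __name__ == "__main__":
    for source_id, values in animation_samples("shot_0420_tracking.dae"):  # made-up file
        print(source_id, len(values), "samples")
```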

“While most components in our pipeline are interchangeable, I would argue that Python is the one component that is irreplaceable. The ability to iteratively make changes on the fly during an episode could not have been developed and deployed using other tools. It would not have been as successful, and I think it would have taken a larger development team. We don’t have a year to do production, like Avatar – we have weeks. And we don’t have a team of developers, we have one or two.

While most components in our pipeline are interchangeable, Python is the one component that is irreplaceable.

“We’re kind of new to the pipeline game. We’ve only been doing a large amount of pipeline development for two years. What we’ve done is taken some rigid steps to carve out our pipeline in such a way that when we build a tool, it can be shared across the studio.”

Romey expects great things from ZEUS in the future. “We’re currently working on an entire episodic season using ZEUS. We’re working out the kinks. From time to time there are little issues and hiccups, but that’s traditional for developing and growing a pipeline. What we’ve found is that our studio is tackling more advanced technical topics – we’re doing things like motion capture and HDR on-set tracking. We’re making sure that we have a consistent and precise road map of how everything applies in our pipeline.

“With ZEUS, we’ve come up with new ways that motion capture pipelines can work. In the future we’d like to be able to provide our clients with a way not only to be on set and see what the virtual set looks like, while the director is working — but what if the director could be on set with the virtual set, with the actor in the motion capture suit, and see the actual CG character, all in context, in real-time, on stage? Multiple characters! What if we had background characters that were all creatures, and foreground characters that were people, interacting? Quite honestly, given the technology of Lightcraft and our ability to do strong depth-of-field, we could do CG characters close-to-final on stage. I think that’s where we’d like the ZEUS pipeline to go in the future.

“Similar pipelines have been done for other productions. But in my experience, a lot of times they are one-off pipelines. ZEUS is not a pipeline just for one show; it’s a pipeline for our studio.

“It’s cost effective, and we think we can get the price point to meet the needs of all our clients, including clients with smaller budgets, like webisodes. The idea of doing an Avatar-like production for a webisode is a stretch; but if we build our pipeline in such a way that we can support it, we can find new clients, and provide them with a better product.

“Our main goal with ZEUS was to find ways to make that kind of pipeline economical, to make it grow and mature. We’ve treated every single component in the pipeline as a dependency that can be interchanged if it doesn’t meet our needs, and we’re willing to do so until we get the results that we need.”

For more info: Lightcraft Technology; InterSense Inc.; Shotgun Software; AJA Video Systems; IDYE’s coverage of V.

The End of Rendering: Zoic Studios’ Aaron Sternlicht on Realtime Engines in VFX Production

Originally published on IDesignYourEyes on 1/6/2010.

Zoic created this Killzone 2 commercial spot entirely within the Killzone 2 engine.

The technology available for producing computer graphics is approaching a new horizon, and video games are part of the equation.

Creators in 3D animation and visual effects are used to lengthy, hardware-intensive render times for the highest quality product. But increasingly, productions are turning to realtime rendering engines, inspired by the video games industry, to aid in on-set production and to create previz animations. Soon, even the final product will be rendered in realtime.

Aaron Sternlicht, Zoic Studios’ Executive Producer of Games, has been producing video game trailers, commercials, and cinematics since the turn of the millennium. He has charted the growth of realtime engines in 3D animation production, and is now part of Zoic’s effort to incorporate realtime into television VFX production, using the studio’s new ZEUS pipeline (read about ZEUS here).

Sternlicht explains how realtime engines are currently used at Zoic, and discusses the future of the technology.

“The majority of what we do for in-engine realtime rendering is for in-game cinematics and commercials. We can take a large amount of the heavy-lifting in CG production, and put it into a game engine. It allows for quick prototyping, and allows us to make rapid changes on-the-fly. We found that changing cameras, scenes, set-ups, even lighting can be a fraction of the workload that it is in traditional CG.

“Right now, you do give up some levels of quality, but when you’re doing something that’s stylized, cel-shaded, cartoonish, or that doesn’t need to be on a photo-realistic level, it’s a great tool and a cost effective one.

We’re going to be able to radically alter the cost structures of producing CG.

“Where we’re heading though, from a production standpoint, is being able to create a seamless production workflow, where you build the virtual set ahead of time; go to your greenscreen and motion capture shoot; and have realtime rendering of your characters, with lighting, within the virtual environment, shot by a professional DP, right there on-set. You can then send shots straight from the set to Editorial, and figure out exactly what you need to focus on for additional production — which can create incredible efficiencies.

“In relation to ZEUS, right now with [ABC’s sci-fi series] V, we’re able to composite greenscreen actors in realtime onto CG back plates that are coming straight out of the camera source. We’re getting all the camera and tracking data and compositing real-time, right there. Now if you combine that with CG characters that can be realtime, in-engine rendered, you then can have live action actors on greenscreen and CG characters fully lit, interacting and rendered all in realtime.

“People have been talking about realtime VFX for the last 15 years, but now it’s something you’re seeing actually happening. With V we have a really good opportunity. We’re providing realtime solutions in ways that haven’t been done before.

“Now there’s been a threshold to producing full CG episodic television. There has been a lot of interest in finding a solution to generate stylized and high quality CG that can be produced inexpensively, or at least efficiently. A process that allows someone to kick out 22 minutes of scripted full CG footage within a few weeks of production is very difficult to do right now, within budgetary realities.

“But with in-engine realtime productions, we can get a majority of our footage while we’re actually shooting the performance capture. This is where it gets really exciting, opening an entire new production workflow, and where I see the future of full CG productions.”

What game-based engines has Zoic used for realtime rendering?

“We’ve done several productions using the Unreal 3 engine. We’ve done productions with the Killzone 2 engine as well. We’re testing out different proprietary systems, including StudioGPU’s MachStudio Pro, which is being created specifically with this type of work in mind.

“If you’re doing a car spot, you can come in here and say ‘okay, I want to see the new Dodge driving through the salt flats.’ We get your car model, transfer that to an engine, in an environment that’s lit and realtime rendered, within a day. We can even hand you a camera that a professional DP can actually shoot with on-site here, and you can produce final-quality footage within a couple of days. It’s pretty cool.”

How has the rise of realtime engines in professional production been influenced by the rise of amateur Machinima?

“I’ve been doing game trailers since 2000. I’ve been working with studios to design toolsets for in-game capture since then as well. What happened was, you had a mixture of the very apt and adept gamers who could go in and break code, or would use, say, the Unreal 2 engine, to create their own content. Very cool, very exciting.

“Concurrently, you had companies like Electronic Arts, and Epic, and other game studios and publishers increasing the value of their product by creating tool sets to let you capture and produce quality game play — marketing cameras that are spline-based, where you can adjust lighting and cameras on-the-fly. This provided a foundation of toolsets and production flow that has evolved into today’s in-engine solutions.”

It’s truly remarkable how the quality level is going up in realtime engines, and where it’s going to be in the future.

How has this affected traditional producers of high-end software?

“It hasn’t really yet. There’s still a gap in quality. We can’t get the quality of a mental ray or RenderMan render out of a game engine right now.

“But the process is not just about realtime rendering; it’s also about realtime workflow. For example, if we’re doing an Unreal 3 production, we may not be rendering in realtime. We’ll be using the engine to render, but instead of 30 or 60 frames a second, we may render one frame every 25 seconds, because we’re using all the CPU power to render out that high-quality image. That said, the workflow is fully realtime, where we’re able to adjust lighting, shading, camera animation, tessellation, displacement maps — all realtime, in-engine, even though the final product may be rendering out at a non-realtime rate.

“Some of these engines, like StudioGPU’s, are rendering out passes. We actually get a frame-buffered pass system out of the engine, so we can do secondary composites.
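
As a rough illustration of what such a secondary composite can look like downstream, the sketch below recombines two passes from an engine render in Nuke; the pass names, file paths, and merge operation are assumptions for the example, not a specific engine’s output.

```python
# Illustrative Nuke comp: recombine separate passes delivered by a realtime engine.
import nuke

beauty = nuke.nodes.Read(file="/path/to/engine/beauty.%04d.exr")
spec   = nuke.nodes.Read(file="/path/to/engine/specular.%04d.exr")

# Add the specular pass back over the beauty pass.
merge = nuke.nodes.Merge2(operation="plus")
merge.setInput(0, beauty)  # B input
merge.setInput(1, spec)    # A input

write = nuke.nodes.Write(file="/path/to/comp/recombined.%04d.exr")
write.setInput(0, merge)
```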

“With the rise of GPU technology, it’s truly remarkable how the quality level is going up in realtime engines, and where it’s going to be in the future. Artists, rather than waiting on renders to figure out how their dynamic lighting is working, or how their subsurface scattering is working, will dial that in, in realtime, make adjustments, and never actually have to render to review. It’s really remarkable.”

So how many years until the new kids in VFX production don’t even know what “render time” means?

“I think we’re talking about the next five years. Obviously there will be issues of how far we can push this and push that; and we’re always going to come up with something that will add one more layer to the complexity of any given scene. That said, yes, we’re going to be able to radically alter the cost structures of producing CG, and very much allow it to be much more artist-driven. I think in the next five years… It’s all going to change.”

Read Zoic Studios’ ZEUS: A VFX Pipeline for the 21st Century.

Why Are Firefly/Serenity Fans So Devoted… Even After All These Years?

Originally published on I Design Your Eyes on 12/1/09.

A model of Serenity.

Last month, the Los Angeles Airport Marriott hosted Creation Entertainment’s Salute to Firefly & Serenity, a small but well-attended fan convention featuring appearances by series actors Jewel Staite, Adam Baldwin, Morena Baccarin, and Alan Tudyk – the latter two also appearing on ABC’s V.

Of course Firefly is the science-fiction dramatic series broadcast on the Fox Network in 2002-2003, created by Joss Whedon of Buffy the Vampire Slayer and Angel fame. Canceled after only 11 episodes aired, the show has since engendered a major Hollywood motion picture (2005’s Serenity), a novel, a role-playing game, two comics series, soundtracks, a slew of merchandise & collectibles, and countless hand-knitted orange “cunning hats.”

I stopped by to get an idea of what’s going on with Firefly flans*, and to find out the answer to the question, Why are people still so devoted to a show that had only 14 episodes (and a movie), after nearly a decade?

Here are some answers from convention-goers, from commenters on fireflyfans.net, and from Zoic Studios co-founder Loni Peristere.

The Browncoats, a Firefly-themed band from St. Louis, Missouri.

Some credited the show’s realism, like Co-Pilot Gary Miller of The Browncoats, a Firefly-themed band from St. Louis. “[It’s] because Firefly feels so real. It’s a sci-fi show without aliens. It’s about real people and real-life types of situations — in the future. Not to mention the dialogue, the acting, and the story are all brilliant.”

For me, it was all about the writing. The dialogue, and the way the characters were developed through dialogue, were just brilliant. I especially loved the dialogue for River Tam (Summer Glau of Terminator: The Sarah Connor Chronicles), the ship’s ultra-violent fugitive waif — she rarely spoke, but when she did, it was always a bizarre window into her disordered mind. And usually either disturbing or hilarious.

On fireflyfans.net, hughff says: “I agree that the writing is the key. Too frequently today, television and especially film concentrate on the visual image. However, great films/shows recognize that it’s a synthesis of both visual images and dialogue.

“There was never any doubt from the very start that Firefly had the dialogue right. More than what it told us about the characters per se, I liked what it showed about their interrelationships. The verbal exchanges between Mal and Inara; the way Jayne treated Kaylee like a little sister, the way that Mal’s trust and respect for Simon grew incrementally — these were important to the flavor of the show.

“The show didn’t avoid complexity — these were real people living in a messy (i.e. real) world (alright, worlds) and as such, things were never simple.

“Finally, and Zoic can take more than a little credit for this, the show did have some great visual images: the Reaver ship sliding past in absolute silence; Crow disappearing through the air intake; Serenity rising up the cliff after the bar fight. The off-center and shaky ‘hand held’ camera work, even in the CGI, began a trend that has become everyday (Bourne Ultimatum, Battlestar Galactica) but broke new ground for me. When I first saw the first episode I thought, ‘How could they be so amateur?’ But by the end I was hooked into the vision and never let it go.”

Firefly-themed collectibles on sale in the dealer’s room.

One of the most interesting answers came from Dwight Bragdon, Board Member of the California Browncoats, a San Diego-based non-profit that promotes Firefly and Serenity fandom through charity. Since 2007 they have raised over $100,000 for charities like Equality Now and St. Jude Children’s Research Hospital. “We are still in love with Firefly ten years later because of the type of people the show attracts. We’re smart, funny and caring, and we took our energy and enthusiasm for the ‘Verse and turned it into a community of giving….

“We can also see how much the cast and crew cared about the ‘Verse too… They lead by example too with their charity. [Actor] Nathan [Fillion] co-founded Kids Need to Read with author P.J. Haarsma; [actor] Adam Baldwin shows great support to the Marine Corps – Law Enforcement Foundation; Joss [Whedon] is a great supporter of Equality Now; and the list goes on.

“These guys and girls are people that I am proud to call friends, proud to call family and I wouldn’t trade them for the world.”

For Beth Nelson, Chairman of the Austin Browncoats, another charitable non-profit based in Texas, the message of Firefly is hope. “People want to root for the underdog, because for many of us, we’re the underdogs right now. Firefly gives us that hope and inspiration. Firefly and Serenity tell the story of people who might have been forgotten, left behind, taken for granted — but if they work together, they can accomplish anything…

“So much of it has to do with how well the characters were developed and how sincere and believable the dialogue was – which is something Joss is known for… We’re all flawed; we can all identify with characters who… sometimes pick the wrong path, even with the best intentions.

“In the end, though, I think we all love what Firefly has become. Firefly went from being this amazing space western to so much more. Outside of the ‘Verse itself, the fans have become a family, a movement that got together to do more than just love a television show or a movie. Numerous fans are working towards charitable goals – ending violence and discrimination or making sure every kid has the wealth of knowledge literature can bring them.”

The dealer’s room.

Loni Peristere was directly involved in the production of Firefly and Serenity, as visual effects supervisor. He created the Firefly-class spaceship Serenity, along with Whedon and production designer Carey Meyer. “When Joss first told me about the new show,” Peristere said, “he told me to read The Killer Angels,” the 1974 historical novel by Michael Shaara, which tells the story of the Battle of Gettysburg from the Confederate perspective. The novel inspired Whedon to create Firefly.

“Firefly is about not fitting in, about finding a place for yourself in a world where you don’t fit, finding a family and making a living,” Peristere explained. “There are very few shows out there where the stars are outcasts, who join together as a family, which as Joss says is what ‘makes them mighty.’ None of the characters fit in – Nathan is a Browncoat [stand-in for Confederate]; Morena [Baccarin’s character] is a whore; there’s the fugitive; the tomboy; the interracial couple; the weary shepherd; the mercenary who’s incapable of doing anything else. They would all be loners, if they didn’t band together.

“How Zoic was part of that, is we made the viewer a ‘welcome voyeur.’ The camera followed the emotional beats. By using a handheld camera on-set and a ‘handheld’ camera effect for the CG exteriors, we put the viewer in the emotional center of the story. The viewer is a voyeuristic participant – another outcast, a part of the crew.”

Peristere also feels a special kinship with the Firefly cast and crew. “We knew it was important. We fell in love with it because it was a great story to tell. The show was made by creative people we loved and respected for their bravery, because they embraced the outcast. All the creative people I respect the most come from the cast and crew of Firefly. It was a moment that’s impossible to recapture.”

One last reason the flans and Browncoats stay devoted – because Firefly died too soon. From Jaydepps on fireflyfans.net: “Another reason it is still relevant is because of how abruptly it was cut [off], and it never received closure. We’ve been thirsting for more. A good TV series goes for a decent amount of seasons until the story is filled in, mostly. Then the series leaves TV… Firefly was never given the chance to do this.”

More info: Creation Entertainment; the discussion on fireflyfans.net; The Browncoats website and on MySpace; California Browncoats; Austin Browncoats.

If you want to know why they call us “flans,” just read this aloud: “Firefly fan.”

Zoic Brings Visitors to Earth for ABC’s ‘V’

Originally published on I Design Your Eyes on 10/2/09.

A Visitor mothership hovers over Manhattan.

Tomorrow evening (11/3/09), ABC will broadcast the premiere episode of its highly anticipated new sci-fi series V, which updates and re-imagines the original 1983 miniseries of the same name. The visual effects for the new V were created by Culver City, California’s Zoic Studios, known for providing VFX for a number of well-loved science fiction franchises.

Scott Peters, creator of The 4400, brings fans a modern take on the classic V that pays loving homage to its 80s inspiration. Written by Peters and directed by Yves Simoneau, the pilot episode stars Elizabeth Mitchell (Lost), Morris Chestnut (Kung Fu Panda 2), Joel Gretsch (The 4400, Taken); and Firefly alumni Morena Baccarin and Alan Tudyk.

The remake hews closely to the story of the original: mile-wide alien motherships appear above the major cities of the Earth. The aliens call themselves “The Visitors,” and appear to be identical to humans. They claim to come in peace, seeking to trade advanced technology for resources. But the Visitors are not what they seem, and hide sinister intentions. While much of humanity welcomes the Visitors, a resistance movement begins to form.

Four episodes will air this month; the show will return from hiatus after the 2010 Olympics.

Visual effects and digital production

Zoic is handling all of the visual effects for V, under the oversight of creative director and VFX supervisor Andrew Orloff (FlashForward, Fringe, CSI) and visual effects producer Karen Czukerberg (Eleventh Hour). Work on the pilot was split between Zoic’s Vancouver studio, which handled greenscreen and virtual sets, and the Los Angeles studio, where the motherships and other effects were created.

Zoic began work in February 2009 on the pilot, which featured about 240 effects shots, 125 of which involved live actors shot on greenscreen in Vancouver where the series is filmed. Another three episodes now in post-production have some 400 effects shots overall, half of which involve digital compositing of actors on greenscreen.

A more detailed view of a Visitor mothership.

Orloff worked in collaboration with the show’s creators – Peters, Simoneau, and executive producers Steve Pearlman and Jace Hall – to design the motherships. The enormous, saucer-shaped Visitor mothership is one of the original V’s iconic images (along with a certain hamster), and visually represents the Visitors’ technological superiority and their domination over humanity. In addition, Orloff says, the creators were dedicated to realism and internal consistency and logic in the design of the alien technology and culture.

Orloff created the mothership on his laptop, working through numerous iterations with input from Peters and Simoneau. He wanted a design that was “freaky and menacing,” and would be emotionally impactful when it made its first momentous appearance onscreen.

The underside of a Visitor mothership begins its transformation. Buildings in Vancouver were supplemented with 3D models of real Manhattan skyscrapers from Zoic’s library.

Because the mothership itself is enormous, the 3D model used to represent it is huge and highly detailed. Zoic CG supervisor Chris Zapara (Terminator: The Sarah Connor Chronicles, Pathfinder) modeled the “transformation” effect, in which the ventral surface of the ship changes, causing the frightened humans below to fear an imminent attack. In fact, the ship is deploying an enormous video screen, displaying the greeting message of Visitor leader Anna (Baccarin). After many rounds of pre-visualizations, a design was chosen with large, movable panels and a grid of smaller panels arranged in a snakeskin pattern. The mothership was created in NewTek’s Lightwave 3D.

The “snakeskin” panels underneath the mothership flip over to reveal a video projection surface.

Digital artist Steve Graves (Fringe, Sarah Connor Chronicles) was responsible for filling in the copious detail that gives the mothership the impression of immense scale. After the pilot was picked up by ABC, the dorsal surface was remodeled to add photorealism. The model initially was detailed only from the angles at which it was shown in the pilot, due to the many hours of work necessary. As shots were created for the second through fourth episodes, Graves created detail from new angles, and now the mothership model is complete.

Our first view of the alien mothership, reflected in the glass of a skyscraper.

The mothership design was not the only way the Visitors’ arrival was made to seem momentous and frightening. As businessman Ryan Nichols (Morris Chestnut) looks to the skies for an explanation of various alarming occurrences, he first sees the mothership reflected in the glass windows of a skyscraper. Although a relatively simple effect (Zoic took shots of real buildings in Vancouver, skinned them with glass textures, and then put the reflected image on the glass), the effect on the viewer is chilling.

Visitor leader Anna (Baccarin, seated left) is interviewed by Chad Decker (Scott Wolf, seated right) on board the Manhattan mothership. The “set” was created virtually, with the actors shot on a greenscreen stage.

Because the motherships are enormous, it only makes sense that they would feature enormous interior spaces. These sets would be too large to build, so half the effects shots on V involve actors filmed on a greenscreen stage with tracking markers. These virtual sets, based on Google SketchUp files from V’s production designers (Ian Thomas (Fringe, The 4400) for the pilot; Stephen Geaghan (Journey to the Center of the Earth, The 4400) for later episodes), were created at Zoic’s Vancouver studio in Autodesk Maya and rendered in mental images’ mental ray.

The ship interiors were created before the related greenscreen shots were filmed. For the episodes shot after the pilot, Zoic provided the production with its new, cutting-edge proprietary ZEUS system, which allows filmmakers to see actors on a real-time rendered virtual set, right on the greenscreen stage. The technology is of immeasurable aid to the director of photography, crew, and especially the actors, who can see themselves interacting with the virtual set and can adjust their performances accordingly. ZEUS incorporates Lightcraft Technology’s pre-visualization system.

After actors are filmed on the Vancouver greenscreen set and the show creators are happy with the pre-visualized scenes in ZEUS, the data is sent south to Zoic’s Los Angeles studio, where the scenes are laid out in 3D. Then the data goes back up to Zoic in Vancouver, where the virtual set backgrounds are rendered in HD.

An alien mothership inserted into a stock shot of London.

A mothership composited into a stock shot of Rio de Janeiro, with matched lighting and atmospheric effects.

Other alien technology was created for the series, including shuttlecraft and a “seek & destroy” weapon used to target a resistance meeting.

A Visitor shuttle docks with a mothership.

The alien shuttle and the shuttle docking bays were created in Los Angeles by visual effects artist Michael Cliett (Fringe, Serenity), digital compositor Chris Irving and freelance artist James Ford.

The “Atrium,” a city in the interior of a Visitor mothership.

The “Atrium,” a massive interior space inside the mothership, was created for Zoic by David R. Morton (The Curious Case of Benjamin Button, Serenity). The complex 3D model served essentially as a matte painting. It was incorporated into a complex composited shot, with actors on the greenscreen stage inserted into virtual sets of a corridor and balcony by the Vancouver studio; the camera pulls out to reveal the Atrium, which was created in LA. Extras in Visitor uniforms were shot on greenscreen and composited into the Atrium itself.

An F-16 fighter, its electronics disrupted by a Visitor mothership, crashes onto a city street.

An F-16 fighter crash, featured in the first few minutes of the pilot, was done by the Los Angeles studio. The airplane, automobiles, taxis, and Manhattan buildings in the background, and of course the explosion, smoke and particles, are all digital. All the components came from Zoic’s library. The actor was shot on a Vancouver street.

FBI Agent Erica Evans (Mitchell) examines a wounded Visitor and makes an alarming discovery.

A scene involving an injured Visitor, which gives the viewer one of the first clues to the aliens’ true nature, was shot entirely with practical effects (including the blinking eye). But Zoic used CG to enhance the wound, merge human skin with reptile skin, and add veins and other subcutaneous effects.

Visitor leader Anna looks out over her new dominion.

According to Czukerberg, one of the more difficult shots to pull off was the final scene in the pilot. It involves the alien leader, Anna (actress Morena Baccarin on the greenscreen stage), in an observation lounge on the mothership (virtual set); the camera pulls out (practical camera move) past the mothership windows to reveal the entire ship hovering over Manhattan (CG mothership over an original shot of the real Manhattan created for this production). The shot required cooperation between the LA and BC studios, and took a great deal of time and effort – “it was crazy,” Czukerberg said, but she adds that everyone involved is tremendously satisfied with the finished product.

Zoic Studios looks forward to doing more work when V returns next year, and helping the series become a ratings and critical success. “Rarely do you get an opportunity to redefine a classic series,” Orloff said. “Everyone at Zoic put their heart and soul into this show, and it shows on the screen.”

For more information: V on ABC; the first nine minutes of the pilot on Hulu; original series fan site.