The End of Rendering: Zoic Studios’ Aaron Sternlicht on Realtime Engines in VFX Production

Originally published on IDesignYourEyes on 1/6/2010.

Zoic created this Killzone 2 commercial spot entirely within the Killzone 2 engine.

The technology available to produce computer graphics is approaching a new horizon, and video games are part of the equation.

Creators in 3D animation and visual effects are used to lengthy, hardware-intensive render times for the highest quality product. But increasingly, productions are turning to realtime rendering engines, inspired by the video games industry, to aid in on-set production and to create previz animations. Soon, even the final product will be rendered in realtime.

Aaron Sternlicht, Zoic Studios’ Executive Producer of Games, has been producing video game trailers, commercials, and cinematics since the turn of the millennium. He has charted the growth of realtime engines in 3D animation production, and is now part of Zoic’s effort to incorporate realtime into television VFX production, using the studio’s new ZEUS pipeline (read about ZEUS here).

Sternlicht explains how realtime engines are currently used at Zoic, and discusses the future of the technology.

“The majority of what we do for in-engine realtime rendering is for in-game cinematics and commercials. We can take a large amount of the heavy lifting in CG production and put it into a game engine. It allows for quick prototyping, and allows us to make rapid changes on the fly. We found that changing cameras, scenes, set-ups, even lighting can be a fraction of the workload that it is in traditional CG.

“Right now, you do give up some levels of quality, but when you’re doing something that’s stylized, cel-shaded, cartoonish, or that doesn’t need to be on a photo-realistic level, it’s a great tool and a cost effective one.

We’re going to be able to radically alter the cost structures of producing CG.

“Where we’re heading though, from a production standpoint, is being able to create a seamless production workflow, where you build the virtual set ahead of time; go to your greenscreen and motion capture shoot; and have realtime rendering of your characters, with lighting, within the virtual environment, shot by a professional DP, right there on-set. You can then send shots straight from the set to Editorial, and figure out exactly what you need to focus on for additional production — which can create incredible efficiencies.

“In relation to ZEUS, right now with [ABC’s sci-fi series] V, we’re able to composite greenscreen actors in realtime onto CG back plates that are coming straight out of the camera source. We’re getting all the camera and tracking data and compositing in realtime, right there. Now if you combine that with CG characters that can be realtime, in-engine rendered, you then can have live action actors on greenscreen and CG characters fully lit, interacting and rendered all in realtime.

“People have been talking about realtime VFX for the last 15 years, but now it’s something you’re seeing actually happening. With V we have a really good opportunity. We’re providing realtime solutions in ways that haven’t been done before.

“Now there’s a threshold to producing full CG episodic television. There has been a lot of interest in finding a solution to generate stylized, high-quality CG that can be produced inexpensively, or at least efficiently. A process that allows someone to kick out 22 minutes of scripted full CG footage within a few weeks of production is very difficult right now, within budgetary realities.

“But with in-engine realtime productions, we can get a majority of our footage while we’re actually shooting the performance capture. This is where it gets really exciting, opening an entire new production workflow, and where I see the future of full CG productions.”

What game-based engines have Zoic used for realtime rendering?

“We’ve done several productions using the Unreal 3 engine. We’ve done productions with the Killzone 2 engine as well. We’re testing out different proprietary systems, including StudioGPU’s MachStudio Pro, which is being created specifically with this type of work in mind.

“If you’re doing a car spot, you can come in here and say ‘okay, I want to see the new Dodge driving through the salt flats.’ We get your car model and transfer it to an engine, in an environment that’s lit and realtime rendered, within a day. We even hand you a camera that a professional DP can actually shoot with on-site here, and you can produce final-quality footage within a couple of days. It’s pretty cool.”

How has the rise of realtime engines in professional production been influenced by the rise of amateur Machinima?

“I’ve been doing game trailers since 2000. I’ve been working with studios to design toolsets for in-game capture since then as well. What happened was, you had a mixture of the very apt and adept gamers who could go in and break code, or would use, say, the Unreal 2 engine to create their own content. Very cool, very exciting.

“Concurrently, you had companies like Electronic Arts, and Epic, and other game studios and publishers increasing the value of their product by creating tool sets to let you capture and produce quality game play — marketing cameras that are spline-based, where you can adjust lighting and cameras on-the-fly. This provided a foundation of toolsets and production flow that has evolved into today’s in-engine solutions.”

It’s truly remarkable how the quality level is going up in realtime engines, and where it’s going to be in the future.

How has this affected traditional producers of high-end software?

“It hasn’t really yet. There’s still a gap in quality. We can’t get the quality of a mental ray or RenderMan render out of a game engine right now.

“But the process is not just about realtime rendering, but also realtime workflow. For example, if we’re doing an Unreal 3 production, we may not be rendering in realtime. We’ll be using the engine to render, but instead of 30 or 60 frames a second, we may render one frame every 25 seconds, because we’re using all the CPU power to render out that high-quality image. That said, the workflow is fully realtime, where we’re able to adjust lighting, shading, camera animation, tessellation, displacement maps — all realtime, in-engine, even though the final product may be rendering out at a non-realtime rate.

“Some of these engines, like StudioGPU’s MachStudio Pro, are rendering out passes. We actually get a frame-buffered pass system out of an engine, so we can do secondary composites.

“With the rise of GPU technology, it’s truly remarkable how the quality level is going up in realtime engines, and where it’s going to be in the future. Artists, rather than waiting on renders to figure out how their dynamic lighting is working, or how their subsurface scattering is working, will dial that in, in realtime, make adjustments, and never actually have to render to review. It’s really remarkable.”

So how many years until the new kids in VFX production don’t even know what “render time” means?

“I think we’re talking about the next five years. Obviously there will be issues of how far we can push this and push that; and we’re always going to come up with something that will add one more layer to the complexity of any given scene. That said, yes, we’re going to be able to radically alter the cost structures of producing CG, and allow it to be much more artist-driven. I think in the next five years… It’s all going to change.”

Read Zoic Studios’ ZEUS: A VFX Pipeline for the 21st Century.

Zoic Presents: The Creatures of ‘Fringe’ – Part 2

Originally published on I Design Your Eyes on 12/24/09.


This is the second part of a two-part interview with Zoic Studios senior compositor Johnathan R. Banta, about creatures designed for the Fox sci-fi drama Fringe. Be sure to read part one.

The Lionzard (from episode 1:16, “Unleashed”)

In this first-season episode, anarchists opposed to animal testing ransack a research laboratory, but get more than they bargained for when they unleash a ferocious transgenic creature. Later, Walter faces off against the creature in the sewers.

Banta says, “It was a lion-lizard combination, a chimera of a bunch of different creatures created in a lab. This also went through the ZBrush pipeline. There were no maquettes done for this particular one.

“This was a full-digital creature; luckily it did not interact too tightly with any of the actors. It was rigged up and had a muscle system that allowed for secondary dynamics. The textures and displacement maps were painted locally. There was some post lighting to add extra slime, with everything done inside the composite.

“It was actually very straightforward in its approach. The challenge of course was getting it lit properly and integrated in the shot. Compositing was a heavy challenge, as there was a lot of haze on the set, a lot of lens flares – not direct flares, but gradients from different lights and so forth. We did our best to match the color space of the original photography. I think it was very effective.

“Another challenge was the bits of slime; it had to have slobber coming off of it. So we actually shot some practical elements; we did some digital cloth elements, a combination of things.”


The Hand (from episode 1:12, “The No-Brainer”)

A seventeen-year-old is working at his computer and chatting on the phone, when a mysterious computer program executes. Strange images flash before his eyes, and the teen is drawn in, mesmerized. Something protrudes from the middle of the screen and impossibly takes the form of a hand. The unearthly appendage reaches forward without warning and grasps his face.

Banta explains: “This boy spends a little too much time on the computer, and a hand reaches out of the computer, grabs his face, and begins to jostle him around and melt his brain. Which is not unlike my experience as a youth.

“We made a series of maquettes and we photographed them, just different positions of the hand coming out; and we composited them into a couple of shots. At the same time the animation was being worked on in CG, so we could start previsualizing it and then composite it.

“A cloth simulation was used for the screen. The hand was coming out, and we would create several different morph targets based on that cloth simulation. There was a bone rig in there, so we could animate it grabbing the kid’s head. That’s some very effective work, especially when projecting the textures on. The side view of the hand coming out of the monitor is one of my favorite shots.

“What they had on set was a monitor made of plastic, and a greenscreen fabric with a slot in it [where the screen would be] – and they had some poor guy in a greenscreen suit shove his hand through and grab the kid on the head, and the kid wiggled around.

“So we had to paint back and remove the actor, whenever he was touching the kid; otherwise we would use a clean plate. But whenever he was touching the young actor, we would remove that hand and replace it.

“They were also flashing an interactive light on the young actor that was not accurate to what we were rendering. When the hand got close it would actually light up his face, because the hand was illuminated with television images. So we had to match-move his animation for the hand to interact with him, but we also used that match move to relight his performance.”


The Tentacle Parasite (from episode 2:09, “Snakehead”)

A wet, shivering man frantically combs the streets of Boston’s Chinatown. Finding refuge, he suffers incredible stomach pains. His rescuer puts on heavy gloves and uses shears to cut his shirt away. The man’s abdomen is distended and wriggling as something crawls around inside him. A squid-like parasite crawls out of the man’s mouth, and his rescuer retrieves it.

“Recently we just did yet another thing coming out of a poor guy’s mouth,” Banta says. “This time it wasn’t just a nice little potato-shaped slug — it was long and tentacled, had sharp bits, and just looked pretty nasty to have shoved down your throat.”

But there was an additional challenge on this effect. “You were seeing the creature moving underneath the actor’s skin; the actor’s shirt was off, and he was wiggling around on the ground as he probably would if this were happening, like a dead fish. He was shifting all over the place, his skin was moving all over the place, and we had to actually take full control of that.

“So we did a match move. We went to our performance transfer system, which essentially takes tracking information from the original plate and assigns it to the match move. There are no specific camera set-ups; it’s just whatever they give us, and we grab every bit of information from the plate that we can, and use that to modify the 3D performances. These were then projected onto animation that we used to distend the belly and so forth, and up into the throat.

“The creature had 18 tentacles. Ray Harryhausen, when he did an octopus, decided to take two of the tentacles off because then he wouldn’t have to animate them; it would take less time. We didn’t have that luxury. There was no way to procedurally animate these things, and they had to interact with the guy’s face. So we had the exact same challenge we had with the slug coming out of the mouth: we had to take this actor and pull his face apart as well, and make his lips go wider. But this actor was moving a lot more, so the performance transfer and animation tracking was more challenging.

“But I’m very pleased with the results. We used fabric simulations for the different bits of slime again.”


Razor Butterflies (from episode 1:09, “The Dreamscape”)

A young executive arrives late to give a presentation. After he has finished and the boardroom empties, he collects his things, and spots a butterfly. It alights on his finger — and unexpectedly cuts him. The insect flutters by his neck — and cuts him again. After attacking a few more times, the creature disappears into an AC vent. The man peers into the vent just as a swarm of butterflies pours out. They surround him, cutting him all over his body — he runs in a mad panic, crashing through a plate glass window and falling to his death.

Banta says, “We tracked every camera in the scene and laid it out into one common environment, so we could reuse any lighting at any point in the scene. That gave us the ability to put the flock of razor-winged butterflies into the appropriate spot.

“A big challenge on its own was volume — controlling and dictating the flocking behavior, so the swarm would follow the actor, intersect with him in the appropriate parts and not intersect in others, and eventually chase him through the window where he would fall to his horrible demise.

“There was one close-up of a butterfly resting on his finger — it flew into frame and landed, it was brilliant – that was pretty straightforward in its execution. More often than not the hard part was controlling the sheer number of flocking butterflies, especially given our standard turnaround time.”
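The flocking control Banta describes is typically built from boids-style steering rules: cohesion toward the flock center, separation between close neighbors, and a follow force toward a moving target such as the tracked actor. A minimal sketch in Python with NumPy; the function, weights, and parameter names here are illustrative, not Zoic’s actual crowd solver:

```python
import numpy as np

def flock_step(pos, vel, target, dt=0.1,
               cohesion=0.5, separation=1.5, follow=1.0, sep_radius=0.5):
    """One integration step of a simple boids-style flock trailing a
    moving target (e.g. an actor's 3D track). pos, vel: (N, 3) arrays;
    target: (3,) array. Returns updated (pos, vel)."""
    center = pos.mean(axis=0)
    acc = cohesion * (center - pos)        # steer toward the flock center
    acc += follow * (target - pos)         # steer toward the chased target
    # push apart neighbors that are closer than sep_radius
    diff = pos[:, None, :] - pos[None, :, :]
    dist = np.linalg.norm(diff, axis=-1) + 1e-9
    close = dist < sep_radius
    np.fill_diagonal(close, False)         # a boid does not repel itself
    acc += separation * (diff * close[..., None] / dist[..., None] ** 2).sum(axis=1)
    vel = vel + dt * acc
    pos = pos + dt * vel
    return pos, vel
```

Tuning the weights trades off how tightly the swarm clusters against how aggressively it chases the target, which is essentially the control problem Banta describes.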

Banta is thrilled to be creating otherworldly monsters for JJ Abrams’ Fringe. “I like doing these creatures; I hope we get to do more!”

Read Part 1

Zoic Presents: The Creatures of ‘Fringe’ – Part 1

Originally published on I Design Your Eyes on 12/22/09.


Part 2 of this post is now available.

Now in its second season, the Fox Network’s science fiction drama Fringe tells the story of three paranormal investigators for the FBI’s “Fringe Division” in Boston. Created by veteran television producer and feature film director JJ Abrams (Felicity, Alias, Lost; Star Trek), the cult favorite features a variety of bizarre and otherworldly creatures, many created with the help of Zoic Studios.

Zoic senior compositor Johnathan R. Banta sat down with IDYE to discuss the creation of some of these monsters. His previous credits include Quarantine, The X Files: I Want to Believe, John Adams and V.

The Heartbug (from episode 1:07, “In Which We Meet Mr. Jones”)

In this episode, a strange, other-worldly parasite mysteriously attaches itself to the internal organs of an FBI agent. The creature wraps itself around the man’s heart, and surgery must be performed to attempt to remove it.

Banta says, “We received artwork from production, done by a very good illustrator; and I set about making a maquette of the creature for two reasons. One, because it would help us understand what the form was — it was hard to figure it out from all the drawings, because in the multiple views we didn’t quite see how it meshed together at first. And secondly, it was fun. I just wanted to sculpt something and this seemed to be a prime opportunity for it.

“A couple of people did versions of it, one in [Luxology] modo, one in [Pixologic] ZBrush, just to kind of play around — they weren’t actually anything we used. The final model was made by [Zoic artist] Mike Kirylo.”

A great deal of work was done to allow the creature to move along with the beating heart. Scans of an actual beating human heart, provided by Zygote Media as a morph sequence, were used. “Mike had to figure out how to attach this creature to the heart,” Banta says, “and as it pulsated he would have a ‘softness’ in-between each of the hard shell [segments]. So there’s the hard carapace of the creature, and the soft squishy connective bits. Mike said he was able to find a way to make the bones between the different sections scale as the heart was beating. That way it stayed connected without being stretched.”
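The scaling trick Banta relays, keeping the soft connective bones stretching while the hard carapace segments stay rigid, amounts to driving each connector bone’s scale from the current distance between its two anchor points on the beating heart. A tiny sketch in Python; the function and names are illustrative, not the actual rig:

```python
import numpy as np

def connector_scale(anchor_a, anchor_b, rest_length):
    """Per-frame scale for a 'soft' connector bone bridging two hard
    carapace segments: 1.0 at rest, greater than 1.0 when the beating
    heart pushes the anchors apart, so the shell stays connected
    without the hard segments themselves stretching."""
    length = np.linalg.norm(np.asarray(anchor_b) - np.asarray(anchor_a))
    return length / rest_length
```

Sampled every frame against the heart morph sequence, this keeps the carapace attached while isolating all the stretch in the squishy connective bits.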

Everything we see inside the man’s chest is CG. “They had a prop on set that was over the top of an actor. Oddly enough, it was not in the place where the heart would actually be accessed. So for a wide shot we actually had to cut the actor down by a third of his original height, so that the hole would be in the appropriate spot to get to his heart. But for the close-ups it didn’t really matter. It was a piece of foam rubber with green paint inside of it, and we keyed that out and continued it into the cavity; and put in CG guts and an odd-shaped little bug.”

The Virus Slug (from episode 1:11, “Bound”)

In a lecture hall at Boston College, a biology professor gives a lecture about pathogens. In mid-sentence, he begins to choke and falls over. While his teaching assistant watches in horror, the professor’s throat becomes enlarged, and what looks like a massive slug crawls out of his open mouth. As the slimy creature slithers across the floor, students flee the hall in a panic.

Banta explains: “It’s a super-sized cold virus – a giant squishy slug with little cilia across its surface. This thing pulled itself out of his mouth, flopped onto the floor and squished away as quick as it could. It’s quite disgusting, and was played for dinnertime theater.

“It was a fairly simple model – a slug with a couple of things sticking out of it. But it had to maintain its volume and look like it was a rubbery object moving around, so there was a lot of finessing in the animation. We didn’t use any form of volume-preserving algorithms — other than Mike Kirylo — so it was all based on a really good animator.

“But the [professor’s] face was the interesting portion of it. This slug is rather large, and begins to distend his throat and pull his face into contorted positions that it wasn’t in originally, as the actor just basically laid there and flopped his head over to the side.

“We had to do an exact match move of the actor. We used our performance transfer system; projected the footage frame-for-frame onto our digital actor; and then we had the ability to push him around anywhere we needed to. Add a little bit of clever compositing, and next thing you know there’s a creature coming out of this man’s mouth.

“His movements were not tracked on stage — no tracking markers on him. They were tracked in post and match moved. Basically, we used every bit of detail that was available on his skin. Unfortunately, most actors don’t have very bad complexions.

“That’s something we’ve been doing a lot of, actually — digital makeup [for Fringe]. That all plays into what we’re doing with the creatures, because most of the time they are interacting directly with humans. They’re not just in the room walking around; they are becoming, or coming out of, or in some way touching people, for the most part.”


Porcuman (from episode 1:13, “The Transformation”)

In an airliner bathroom, a man shudders in pain as a hideous transformation begins. His teeth start falling out — then he screams in agony as giant quills pierce through the back of his shirt. The passengers on the plane react when the bathroom door splinters and a hideous, inhuman beast bursts into the cabin.

Banta: “This man on an airplane should learn not to experiment on himself; as a result he turns into a giant porcupine creature which brings the airplane down.

“It was in very few shots. It was originally modeled in ZBrush and Maya; we imported the model, and it was rigged by our animation department and put through its paces. We ran the standard passes that you would expect – diffuse, specular, ambient occlusion, fill passes, indirect lighting, those kinds of things – so that we could integrate it in the composite.

“A lot of times we’ll do what is called ‘RGB lighting,’ where every three lights will be either a prime red, a prime green, or a prime blue; and that way we have a lighting matte in every single render that we can use to do some tweaks in the composite. Also, since we’re getting normals rendered from our passes, we can use a plug-in from RE:Vision Effects to re-light the object. Whatever lighting passes that the CG department was not able to get to can be generated at the end.”
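The “RGB lighting” setup Banta describes effectively packs three per-light mattes into every render: because each light is pure red, green, or blue, the channels can be regraded independently in the composite. A minimal sketch of the recombination, assuming float images; the function and parameter names are mine, not Zoic’s pipeline:

```python
import numpy as np

def regrade_rgb_lighting(light_pass, gains):
    """Recombine an RGB lighting pass with per-light gains.

    light_pass: HxWx3 float array; each channel holds the grayscale
    contribution matte of one light (that light rendered pure red,
    green, or blue).
    gains: three floats, one intensity multiplier per light.
    Returns an HxW image of the tweaked combined lighting, which a
    compositor would then use to grade the beauty render.
    """
    r, g, b = light_pass[..., 0], light_pass[..., 1], light_pass[..., 2]
    return gains[0] * r + gains[1] * g + gains[2] * b
```

A compositor would dial the gains interactively and use the recombined result to grade the beauty pass, which is exactly the kind of tweak Banta says avoids going back for a re-render.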

Banta notes that because of the nature of the effect, very little of the transformation involves practical, on-set elements. “This is all post at this point. They shoot it as if the creature were there — they just shoot it very naturally.

“Now that [Fringe has] a make-up crew that is known for doing creature work, there is a lot more practical stuff being done. But we have to exactly, precisely match with the practical elements when we do the CG. There are things that practical does so much better than we can do, and vice versa. It’s an all-in equation for me, because whatever works best, works best. There’s something about having a light bouncing off of a card onto a person on set holding this thing, which just gives it a sense of reality that we have to try to recreate.

“Porcuman was a combination of digital makeup with practical elements. It was a close interaction. During the transformation scene, we have a medium shot of the back, and then cut to a tight close-up of the shirt ripping as these giant porcupine spines come through it. They had an inflatable balloon on the back of the actor for the shirt; so we tracked that inflatable balloon; used our performance transfer to get that onto the back of the creature; and then animated spines coming out, and composited that underneath his shirt, which had a greenscreen on it.

“We had to do some warping of the cloth to get it to line up to the actual geometry of the creature. Then for the close-up of the shirt, instead of using the photography directly, we went with a cloth simulation of the shirt, and animated the spines. But we took sections of the torn cloth from the actual photography, and used those to sell that the tear is ripping a piece of fabric. This is a good example where something done practically pays off in spades, because we could just grab that tearing fabric and place it on each of the individual spines, and save ourselves a lot of simulation time.”

Read Part 2!

Syd Dutton: Matte Painting from Traditional to Digital

Originally published on I Design Your Eyes on 12/4/09.


It’s a cliché to call an artist “legendary,” but sometimes the word fits. Syd Dutton has been a leading matte painter for film and television for over three decades. His credits include Dune, The Running Man, Total Recall, the Addams Family films, Batman Forever, Star Trek: First Contact and Nemesis, U-571, The Fast and the Furious, The Time Machine, The Bourne Identity, and Serenity.

The Emmy Award winner co-founded Illusion Arts in 1985, which over 26 years created thousands of shots and matte paintings for more than 200 feature films. When Illusion Arts shut its doors earlier this year, Dutton and Zoic embraced the opportunity to collaborate, and Dutton became part of the team.

When I sat down to interview Dutton over coffee, it was with the intention of putting together some kind of grand post about the history of matte painting. But it’s far more interesting to let Syd speak for himself.


When I looked over your credits, I saw you worked on one of my favorite movies, Real Genius (1985).

That’s one of my favorites too. We did a practical matte for the B-1 bomber. [Val Kilmer and Gabe Jarret sneak onto a B-1 bomber on a military base.] In those days it was still paint on glass, and the line for what was supposed to be the underbelly of the bomber had to be really sharp. But we were shooting at night. In order to black out the film, to do an original negative – you know what original negative work is; we needed two exposures – and to get a real sharp line, the matte had to be 50 feet out and 40 feet long. We spent several hours making it, putting cardboard where the belly had to go, making sure people would be underneath that line all the time. It was pretty fun.

But that was the coldest night of my entire career. I’ve been to some cold places, like Prague, but on that Van Nuys tarmac, that was the coldest ever.


You also worked on David Lynch’s original Dune (1984). Can you tell me about the matte painting of the Harkonnen city on Giedi Prime?

Basically this was just a big painting. The people who are moving around were shot in a parking lot at Estudios Churubusco in Mexico City. Just one smoke and fire element was used over and over again. And then a car on a cable goes through the shot. The car was created by model maker Lynn Ledgerwood (The Bourne Identity), and measured about 8″ long.

The difficult thing about that was my partner and Al wanted to keep [the film] as much as possible from being duplicated… we wanted to keep the painting on original film. So [we were] shooting the painting and making a very crude motion control frame-by-frame move; taking the film into an optical printer, trying to match the move through the optical printer; and then we put in the people and the smoke and the cable car. We had to do a lot of adjustments, and we found that it had to be so exact that if we waited to shoot in the afternoon, the concrete floor had expanded. We had to shoot at a certain time in the morning, before the expansion occurred. So it was complicated, but we seemed to have lots of time in those days, and it was a fun painting to work on.

David Lynch would come by when I was painting it, and he would say “I like it, I want it dirtier.” He was always a nice guy, really a gentleman.

In your experience, what is the difference between working with traditional mattes versus digital?

There was a wonderful thing about doing original negative matte shots. You had to prepare the shot, and then you had to be committed to a matte line.

You had a whole bunch of test footage, and when the painting was completed you had to re-expose the same film, and hope that light bulbs didn’t burn out when you were shooting, or that the glass didn’t break. But it had a completeness to it, and so when you finished a matte shot, and when it came out like the ones in Real Genius — I thought they came out pretty well — there’s a great sense of completeness.

You made a long matte, you worked out the problems, you’ve been cold, you’ve endured that process, and you’ve gone through the photochemical process of developing the pieces of film, and working the matte line until it has disappeared. And finally you take a deep breath and expose the two or three good takes that the director likes. And you put the worst one through first to make sure everything is working well, show it to the director, and then put the hero take through. Of course nobody had seen any footage unless they were shooting a B-cam, which they never did; and so it was kind of like the Catholic Church, where the director had to trust you that in two months or so you would have a finished product they would approve and like.


Did you ever experience any disasters in that realm?

The only near disaster was when Tony Perkins — we were the first shot up on Psycho III (1986), that he was directing, and he was a real nice guy — and the first shot was this girl leaving a nunnery, and there was a piece of string that she had to follow so she would be on the [right side of the matte line].

And we had everything ready, the camera blocked off, and Tony Perkins came on the set. He came over to the camera, looked through the lens and said “that’s perfect,” and then put all his body weight on the camera to lift himself up. And we had said “we can roll in a few seconds!” I didn’t want to say “oh, you just [expletive deleted] up the shot!” We said, “oh, we need just a few more minutes of adjustment.” So we lied – we had to reset the matte, readjust the camera. If we had told him he had just screwed up the first shot of his movie, it would have been really bad luck. But that’s another shot that turned out well, I liked that shot a lot.

I can’t remember who the production designer was, I think it was Henry Bumstead [it was]. Everyone should know who Henry Bumstead was. He just died a while ago; he worked until 90, and died when he was 91 [in 2006]. He was Clint Eastwood’s favorite production designer, and in his 80s he designed Unforgiven — beautiful production design. Henry always made everything easy. Of course he had worked on Vertigo (1958).

According to IMDb, your first movie was Family Plot with Alfred Hitchcock?

Well, that was uncredited. I was hired by Albert Whitlock to work on The Hindenburg (1975) as a gopher, primarily, but then I came up with some ideas of my own, and Al liked them; so after Hindenburg Al made me his assistant. And Family Plot was again Henry Bumstead. Al really didn’t want to do the matte shot because he felt that it was – Hitchcock just wanted to show [actress] Karen Black what a matte shot was. It was a police station in San Francisco, a pretty easy matte shot; adding a second story, putting some what I would call “intelligent nonsense” in the background. So I painted that.

You had a fine art background?

Yeah. I went to Berkeley. Had a master’s degree. Had a wonderful time. Everything I learned, except for a sense of color, was totally useless when it came to matte painting. But it was still good to have that background. The best thing about going to Berkeley in those days was everyone wanted to be in San Francisco in the ‘60s. So I met people like Mark Rothko, pretty famous painters.

So what was it like to transition to digital, to have to train?

Oh, for me it was really hard. Rob Stromberg (2012, Avatar) was working for me at the time, and he embraced it really fast. I was just sort of afraid of it. I got used to it – it took me a while.

The people I know who were able to make the transition faster were people who like to draw things out. Bob Scifo (Fantastic Four: Rise of the Silver Surfer, The Abyss) for example is a wonderful matte painter. He has worked here for Zoic a couple of times. He came from the school where you drew everything out, and then painted it in. But he still got this incredible emotional result.

The way I learned to paint was the way Al Whitlock painted and Peter Ellenshaw (20,000 Leagues Under the Sea, The Black Hole) painted — you just started painting. Sometimes you didn’t know what you were going to paint, exactly; you knew what the subject matter was going to be — it might be a castle — but you just push paint around, and you start seeing things materialize – oh! I can see it now – and you let it dry, and try to bring all of it out of the fog. And that was a wonderful way to paint.

And in Photoshop, at least in the beginning, I couldn’t paint that way. I couldn’t make a big mess – it just stayed a big mess, I couldn’t refine it. The only way I could discover things and make a big mess was with Corel Painter; you can blend colors together and have accidents happen. And then at that point I usually finish the work in Photoshop.

When painting matte backgrounds now, you’re painting a painting, but there’s also the approach where you’re creating a 3D environment and making a 2D image from that.

Yeah. And there’s also projection – projecting a 2D painting onto objects. That’s another way to get camera movement. There’s no such thing anymore as a locked-down shot — that’s what matte paintings used to be. You would do everything in the world to make sure the camera didn’t move. And now people consider it a locked-off shot if they just hold the camera steady.

In the early days, you got to go out on location, sometimes to some really adventurous environments – a rock in the middle of some bay in Mexico; on a hillside in Europe somewhere. It was very physical, so you had that physical part. That part is now gone. Now I have to exercise to stay in shape, rather than just work. It was kind of dangerous, really – I didn’t think about it at the time.

There are no circumstances where they want you to go out and see the original location?

Not anymore. The visual effects supervisor will go to the locations, take photographs. He becomes the point man for every other department.

Does that feel like less involvement on your part?

Well, that’s the trade-off. The trade-off is that we can do now what we used to dream about doing. Which was, wouldn’t it be great if we could paint a grand, futuristic city and loop through it? Wouldn’t it be great if we could have a huge crowd running towards us in that shot? Rotoscope in a thousand people or something?

Things that we used to dream about, we can do now, but the trade-off is we don’t get to be as involved in the production as we once were. I talked to [Zoic co-founder] Loni [Peristere] about that. I said I feel bad for some of the kids here, that they’ll never be on the stage. It’s fun to be on location. He said the trade-off was they have all the tools to make their own movies. So, everything has a trade-off.

More info: Syd Dutton on IMDb.

Zoic Stops Time, Creates Historic ‘Frozen Moment’ Sequence for ‘CSI’ Premiere

Originally published on I Design Your Eyes on 10/6/09.

On September 24th, CBS broadcast the premiere episode of the 10th season of its venerable crime procedural drama, CSI: Crime Scene Investigation. The episode cold opened with a lengthy “frozen moment” sequence, nearly two and a half minutes long, showing us a single moment in a robbery attempt involving the main characters. The sequence, which made broadcast television history, was created by Culver City, California’s Zoic Studios.

The camera starts in the morgue, flying through a water spray over a number of corpses on gurneys. The environment is in total disarray, with bodies falling out of the coolers, and smoke and debris floating in midair. We travel past a coroner screaming into a phone and around a corner, to find Doc Robbins (Robert David Hall, Starship Troopers) in mid-leap as he whacks one of the robbers in the head, sending the man’s weapon flying. The camera swoops through floating medical instruments past the first tableau and up into the ceiling. One floor up, we find the same chaos in the Lab, with the CSIs and lab techs frozen in mid-motion.

The camera continues past a bookcase tipping over, with falling curios, books and antiques suspended in shattered glass. Panning right and heading into the DNA Lab, the camera flies past one of the lab techs with a bullet exploding out of her shoulder, as she crashes through plate glass while suspended three feet off the ground. Wiping past her into the Lab proper, the camera finds Dr. Raymond Langston (Laurence Fishburne, The Matrix, Pee-wee’s Playhouse) kicking a second robber Morpheus-style through plate glass, while several rounds of ammunition leave trails of disturbed air in their wakes.

Flying smoothly past Catherine Willows (Marg Helgenberger, Species, Species II) and over an exploding lab experiment, the camera continues down the hall past David Hodges (Wallace Langham, Weird Science) and Wendy Simms (Liz Vassey, The Tick), who hang suspended horizontally in midair as they leap to avoid gunfire, and into the muzzle flash of the gun of another robber.

We transition from the muzzle flash to the glare of a flashlight held by yet another robber, and the camera trucks backwards out of a van with bullet holes and impact sparks all around. Pulling back the camera passes a robber firing at our last two CSIs, Nicholas Stokes (George Eads, ER) and Sara Grissom (Jorja Fox, ER, The West Wing), who fire back attempting to stop the theft of a body. The sequence finishes with the camera panning around to reveal Nick and Sara’s faces.

Naren Shankar, CSI’s executive producer, was impressed by a short film released in April 2009 to promote Philips Cinema 21:9 LCD televisions. The short, entitled Carousel, directed by Adam Berg and produced by London’s Stink Digital, was a two-minute, 19-second frozen moment sequence of police battling bank robbers dressed as scary clowns.

At CSI’s season nine wrap party, Shankar approached Zoic visual effects supervisor Rik Shorten, and asked if a similar scene could be created for the show. Shorten replied, “You write it, I’ll shoot it.”

The sequence was created as the cold open for the show’s 10th season premiere episode, “Family Affair.” Zoic had produced frozen moment shots for CSI before, but never a sequence of such complexity and length (it clocks in at two minutes, 17 seconds). The sequence was three script pages long, and required three full days of shooting on the main first unit stages, involving the primary cast members. Add to that a prep day, and an additional half-day to shoot the van tableau. The Philips spot had much greater resources – but CSI had an entire season of television to shoot. Shorten says that the producers provided Zoic with all the time, resources and support that could possibly be spared.

The main three-day shoot employed four motion control setups. A great deal of expense and effort went toward keeping the actors comfortable, and minimizing the time that talent would spend holding still while suspended in harness rigs.

The first portion to be shot, on days one and two, was the sequence in the DNA Lab. It was the largest and most complex set piece. Shorten and the Zoic team wrote and mapped out the shots on the prep day, giving them time to experiment on the day of the shoot, with blocking, track placement, lenses, the placement of practical elements and extras, etc. What the team learned on the first day was instrumental in making sure the rest of the sequence could be completed in the remaining days allowed.

The morgue tableau and hallway sequence were shot on day three. The hallway tableau featured Doc Robbins attacking a robber (called an “MIB” on set). Actor Hall was propped up on apple boxes and suspended by wires, while the camera moved slowly past on a track. While this one tableau sequence makes up about 20 seconds of the final product, the camera move took about three minutes to shoot.

For each shot, Shorten and his people wrote and planned out the shot with stand-ins; consulted with and got approval from the episode’s director, executive producer Kenneth Fink; ran a test shot on video; and then brought in the actors to shoot the real footage.

Some actors, like Helgenberger and Fishburne, only had to spend 10 to 15 minutes rigged up for their sequences. Others spent as long as half an hour held up by wires, stunt harnesses, boxes, greenscreen stands and articulated pads. Shorten says these rigs are never comfortable, and of course it’s not easy to hold perfectly still for minutes at a time. But everything possible was done to keep the actors in the rigs for as brief a time as possible.

The final tableau, of Nick and Sara firing on the van, was shot separately, taking half a day. This scene was the most difficult and time-consuming to produce, as it was shot without motion control – no track, just a crane shot, and no clean plates. The paint-out and stilling of the actors for this sequence was an incredible amount of work. In fact, dozens of still photos of actors Eads and Fox were taken, then blended and morphed together to create motionless 3D elements of the two actors. These elements were then composited into the shot.
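As an illustration only – the article doesn’t describe Zoic’s actual compositing math – one common way to derive a single motionless plate from many aligned stills is a per-pixel median stack, which votes out transient motion while keeping whatever stays constant frame to frame. A minimal NumPy sketch, assuming the stills are already aligned:

```python
import numpy as np

def median_stack(frames):
    """Blend a set of aligned still frames into one motionless plate.

    Pixels that stay constant across frames (the held-still subject)
    survive; transient differences (breathing, sway, sensor noise)
    are rejected by the per-pixel median.
    """
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
    return np.median(stack, axis=0)

# Tiny demonstration: three 2x2 "frames" where one pixel flickers once.
frames = [
    np.array([[10, 20], [30, 40]]),
    np.array([[10, 20], [30, 99]]),  # transient flicker at (1, 1)
    np.array([[10, 20], [30, 40]]),
]
plate = median_stack(frames)
# The median rejects the single-frame flicker, keeping the stable value 40.
```

The same idea scales to full-resolution photographs; in a real pipeline the stills would first be registered to a common camera position before stacking.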

Shorten says that he is immensely proud of the work that he and Zoic did to create this unique and amazing sequence. “This could not have been accomplished without the incredible talents of every department on the show. Our production crew really came through, exceeding my expectations. Our excellent team of artists here at Zoic gave up their summer to create this fantastic sequence.

“I’m so grateful to everyone for their contributions. Most importantly, the show and the network are thrilled with the sequence, and the fan websites are still discussing the premiere two months later – that’s the best compliment we could get!”

More info: CSI on CBS.com; Zoic Studios website; CSI on Hulu.

Zoic Brings Visitors to Earth for ABC’s ‘V’

Originally published on I Design Your Eyes on 10/2/09.

A Visitor mothership hovers over Manhattan.

Tomorrow evening (11/3/09), ABC will broadcast the premiere episode of its highly anticipated new sci-fi series V, which updates and re-imagines the original 1983 miniseries of the same name. The visual effects for the new V were created by Culver City, California’s Zoic Studios, known for providing VFX for a number of well-loved science fiction franchises.

Scott Peters, creator of The 4400, brings fans a modern take on the classic V that pays loving homage to its 80s inspiration. Written by Peters and directed by Yves Simoneau, the pilot episode stars Elizabeth Mitchell (Lost), Morris Chestnut (Kung Fu Panda 2), Joel Gretsch (The 4400, Taken); and Firefly alumni Morena Baccarin and Alan Tudyk.

The remake hews closely to the story of the original: mile-wide alien motherships appear above the major cities of the Earth. The aliens call themselves “The Visitors,” and appear to be identical to humans. They claim to come in peace, seeking to trade advanced technology for resources. But the Visitors are not what they seem, and hide sinister intentions. While much of humanity welcomes the Visitors, a resistance movement begins to form.

Four episodes will air this month; the show will return from hiatus after the 2010 Olympics.

Visual effects and digital production

Zoic is handling all of the visual effects for V, under the oversight of creative director and VFX supervisor Andrew Orloff (FlashForward, Fringe, CSI) and visual effects producer Karen Czukerberg (Eleventh Hour). Work on the pilot was split between Zoic’s Vancouver studio, which handled greenscreen and virtual sets, and the Los Angeles studio, where the motherships and other effects were created.

Zoic began work in February 2009 on the pilot, which featured about 240 effects shots, 125 of which involved live actors shot on greenscreen in Vancouver, where the series is filmed. Another three episodes now in post-production have some 400 effects shots overall, half of which involve digital compositing of actors on greenscreen.

A more detailed view of a Visitor mothership.

Orloff worked in collaboration with the show’s creators – Peters, Simoneau, and executive producers Steve Pearlman and Jace Hall – to design the motherships. The enormous, saucer-shaped Visitor mothership is one of the original V’s iconic images (along with a certain hamster), and visually represents the Visitors’ technological superiority and their domination over humanity. In addition, Orloff says, the creators were dedicated to realism and internal consistency and logic in the design of the alien technology and culture.

Orloff created the mothership on his laptop, working through numerous iterations with input from Peters and Simoneau. He wanted a design that was “freaky and menacing,” and would be emotionally impactful when it made its first momentous appearance onscreen.

The underside of a Visitor mothership begins its transformation. Buildings in Vancouver were supplemented with 3D models of real Manhattan skyscrapers from Zoic’s library.

Because the mothership itself is enormous, the 3D model used to represent it is huge and highly detailed. Zoic CG supervisor Chris Zapara (Terminator: The Sarah Connor Chronicles, Pathfinder) modeled the “transformation” effect, in which the ventral surface of the ship changes, causing the frightened humans below to fear an imminent attack. In fact, the ship is deploying an enormous video screen, displaying the greeting message of Visitor leader Anna (Baccarin). After many rounds of pre-visualizations, a design was chosen with large, movable panels and a grid of smaller panels arranged in a snakeskin pattern. The mothership was created in NewTek’s Lightwave 3D.

The “snakeskin” panels underneath the mothership flip over to reveal a video projection surface.

Digital artist Steve Graves (Fringe, Sarah Connor Chronicles) was responsible for filling in the copious detail that gives the mothership the impression of immense scale. After the pilot was picked up by ABC, the dorsal surface was remodeled to add photorealism. The model initially was detailed only from the angles at which it was shown in the pilot, due to the many hours of work necessary. As shots were created for the second through fourth episodes, Graves created detail from new angles, and now the mothership model is complete.

Our first view of the alien mothership, reflected in the glass of a skyscraper.

The mothership design was not the only way the Visitors’ arrival was made to seem momentous and frightening. As businessman Ryan Nichols (Morris Chestnut) looks to the skies for an explanation of various alarming occurrences, he first sees the mothership reflected in the glass windows of a skyscraper. Although a relatively simple effect (Zoic took shots of real buildings in Vancouver, skinned them with glass textures, and then put the reflected image on the glass), the effect on the viewer is chilling.

Visitor leader Anna (Baccarin, seated left) is interviewed by Chad Decker (Scott Wolf, seated right) on board the Manhattan mothership. The “set” was created virtually, with the actors shot on a greenscreen stage.

Because the motherships are enormous, it only makes sense that they would feature enormous interior spaces. These sets would be too large to build, so half the effects shots on V involve actors filmed on a greenscreen stage with tracking markers. These virtual sets, based on Google SketchUp files from V‘s production designers (Ian Thomas (Fringe, The 4400) for the pilot; Stephen Geaghan (Journey to the Center of the Earth, The 4400) for later episodes), were created at Zoic’s Vancouver studio in Autodesk Maya and rendered in mental images’ mental ray.

The ship interiors were created before the related greenscreen shots were filmed. For the episodes shot after the pilot, Zoic provided the production with its new, cutting-edge proprietary Zeus system, which allows filmmakers to see actors on a real-time rendered virtual set, right on the greenscreen stage. The technology is of immeasurable aid to the director of photography, crew, and especially the actors, who can see themselves interacting with the virtual set and can adjust their performances accordingly. Zeus incorporates Lightcraft Technology’s pre-visualization system.

After actors are filmed on the Vancouver greenscreen set and the show creators are happy with the pre-visualized scenes in Zeus, the data is sent south to Zoic’s Los Angeles studio, where the scenes are laid out in 3D. Then the data goes back up to Zoic in Vancouver, where the virtual set backgrounds are rendered in HD.

An alien mothership inserted into a stock shot of London.

A mothership composited into a stock shot of Rio de Janeiro, with matched lighting and atmospheric effects.

Other alien technology was created for the series, including shuttlecraft and a “seek & destroy” weapon used to target a resistance meeting.

A Visitor shuttle docks with a mothership.

The alien shuttle and the shuttle docking bays were created in Los Angeles by visual effects artist Michael Cliett (Fringe, Serenity), digital compositor Chris Irving and freelance artist James Ford.

The “Atrium,” a city in the interior of a Visitor mothership.

The “Atrium,” a massive interior space inside the mothership, was created for Zoic by David R. Morton (The Curious Case of Benjamin Button, Serenity). The complex 3D model served essentially as a matte painting. It was incorporated into a complex composited shot, with actors on the greenscreen stage inserted into virtual sets of a corridor and balcony by the Vancouver studio; the camera pulls out to reveal the Atrium, which was created in LA. Extras in Visitor uniforms were shot on greenscreen and composited into the Atrium itself.

An F-16 fighter, its electronics disrupted by a Visitor mothership, crashes onto a city street.

An F-16 fighter crash, featured in the first few minutes of the pilot, was done by the Los Angeles studio. The airplane, automobiles, taxis, and Manhattan buildings in the background, and of course the explosion, smoke and particles, are all digital. All the components came from Zoic’s library. The actor was shot on a Vancouver street.

FBI Agent Erica Evans (Mitchell) examines a wounded Visitor and makes an alarming discovery.

A scene involving an injured Visitor, which gives the viewer one of the first clues to the aliens’ true nature, was shot entirely with practical effects (including the blinking eye). But Zoic used CG to enhance the wound, merge human skin with reptile skin, and add veins and other subcutaneous effects.

Visitor leader Anna looks out over her new dominion.

According to Czukerberg, one of the more difficult shots to pull off was the final scene in the pilot. It involves the alien leader, Anna (actress Morena Baccarin on the greenscreen stage), in an observation lounge on the mothership (virtual set); the camera pulls out (practical camera move) past the mothership windows to reveal the entire ship hovering over Manhattan (CG mothership over an original shot of the real Manhattan created for this production). The shot required cooperation between the LA and BC studios, and took a great deal of time and effort – “it was crazy,” Czukerberg said, but she adds that everyone involved is tremendously satisfied with the finished product.

Zoic Studios looks forward to doing more work when V returns next year, and helping the series become a ratings and critical success. “Rarely do you get an opportunity to redefine a classic series,” Orloff said. “Everyone at Zoic put their heart and soul into this show, and it shows on the screen.”

For more information: V on ABC; the first nine minutes of the pilot on Hulu; original series fan site.