‘FlashForward’ Flashback: Zoic Studios’ Steve Meyer on the Award-Nominated VFX for the Pilot

Posted by Erik Even in I Design Your Eyes on April 1, 2010

Property of ABC; screencap from the Zoic Television Reel.

Based on the science fiction novel by Robert J. Sawyer, ABC’s FlashForward tells the story of the aftermath of a bizarre global event. For 137 seconds, every person on Earth (except perhaps one) loses consciousness, and experiences visions of their own future.

The pilot episode depicts the immediate aftermath of the worldwide blackout – millions of deaths from traffic collisions, crashed aircraft, and other accidents. Star Joseph Fiennes, portraying FBI agent Mark Benford, survives an auto wreck and looks out over a chaotic Los Angeles cityscape. Culver City, California’s Zoic Studios was tapped to create the disastrous tableau; the company’s work on the episode was nominated for two VES Awards: Outstanding Supporting Visual Effects in a Broadcast Program, and Outstanding Created Environment in a Broadcast Program or Commercial.

Zoic VFX Supervisor Steve Meyer discusses the creation of the complex scene, which required a tremendous amount of rotoscoping and motion tracking. The amount of roto was necessary, Meyer says, “because they shot in downtown LA, looking from the 4th St. overpass, over the southbound and northbound 110 Freeway. So naturally you can’t stop all that traffic, or get a greenscreen up.

“There’s a big, swooping hero shot following Joseph Fiennes as he jumps up on a car and looks down, and it’s a huge vista. Then we reverse it and look northbound. Everything in the foreground on that overpass we had to roto out. We had a team of seven or eight people going for weeks, just rotoing that and a bunch of other shots.

“We ended up having to remove all the traffic on the freeway; then added in overturned cars, cars burning, flames, smoke, helicopters crashing, just debris everywhere. We had to build a 3D matte painting in that environment. There’s a lot of detail – you can look at the shot over and over, and always see something new.”

The production brought on an experienced feature film matte painter, Roger Kupelian (2012, Alice in Wonderland) to create a “road map” of the shot. “Kupelian took a still of the freeway overpass shots, and he just dialed it in with Kevin Blank, the VFX supervisor – we want smoke here, we want the helicopter to hit here, we want fire and destruction here, we want this tree burning. They gave us a template – this is what we need it to look like. Kupelian sent us the files, and sometimes we used his elements.

“There are about 45 shots we ended up doing for the pilot, and the majority of them were for that overpass sequence. For most of the shots, nothing was locked off – every shot had some sort of roto, because there was no greenscreen. We had to roto everything to build the shots, and then try to match the smoke, fire, debris, people and other elements from shot to shot.

“We were working with different formats — stock footage, film footage, RED camera footage — trying to mix all these different formats to create one environment.”

All the scenes were tracked in Andersson Technologies’ SynthEyes camera tracking software, from which the team was able to build a 3D environment. The artists used this 3D information in Adobe After Effects to rebuild the plate with clean pieces of freeway, overpasses, signs, etc.
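What a camera solve like SynthEyes produces is, at its core, a per-frame camera position, rotation, and focal length; once you have that, placing a clean patch of freeway in the 3D scene and projecting it back into the plate is a standard pinhole-camera calculation. A minimal sketch of that projection step (the camera values below are invented for illustration, not taken from the production):

```python
import numpy as np

def project(point_world, cam_pos, cam_rot, focal_px, center):
    """Project a 3D world point through a solved camera into pixel coords.

    cam_rot is a 3x3 world-to-camera rotation matrix, focal_px is the
    focal length expressed in pixels, and center is the image center.
    """
    p_cam = cam_rot @ (point_world - cam_pos)       # into camera space
    if p_cam[2] <= 0:
        raise ValueError("point is behind the camera")
    x = center[0] + focal_px * p_cam[0] / p_cam[2]  # perspective divide
    y = center[1] + focal_px * p_cam[1] / p_cam[2]
    return x, y

# Illustrative values: a camera at the origin looking down +Z,
# projecting a point 10 units ahead and 2 units to the right.
px, py = project(np.array([2.0, 0.0, 10.0]),
                 cam_pos=np.array([0.0, 0.0, 0.0]),
                 cam_rot=np.eye(3),
                 focal_px=1000.0,
                 center=(960.0, 540.0))
print(px, py)  # the point lands 200 px right of center: 1160.0 540.0
```

Running this same calculation with the solved camera for every frame is what lets a static clean-plate element "stick" to the moving freeway in the shot.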

“The smoke was a combination of digital photographs, Google images, CG smoke, and moving elements that we had in our vault. Some of the smoke was a dust cloud that we slowed down. One of the smoke passes was a photo of a brush fire I took up by my house with my iPhone. I took the image, gave it to one of my compositors, and said ‘this will look good off in the distance.’ It’s so far off you don’t see it moving, so it fit in fine.

“They shot lots of people on greenscreen, and they all needed to have the right camera lens perspective. They’re way off in the distance; you have to get up close to an HD screen to see them. But we didn’t want any nuances to be overlooked. We don’t want to shoot a person head-on when the camera is going to be looking down at them.

“Another complication with the overpass sequence was that it was shot on a bright day, so we had a lot of technical problems. If you look at someone up against a bright sky, the sun wraps around them a bit, like a halo – and we were trying to put a dark smoke cloud behind them. It just doesn’t work right, and we had a lot of technical hurdles to work through when we ran into those kinds of problems. Every shot had to be 3D tracked. We took that 3D track into our environment, and we placed things in our 3D world shot by shot by shot.

“We also had a CG tanker in there that blew up. They actually had a real tanker with a big hole in it, and they threw in six gallons of gasoline and lit it and boom! It was huge. We had to put the shell of the tanker on there before the practical explosion; and then we just blew it up in Autodesk Maya and added CG debris, camera shake, heat ripple and dynamic smoke trails; plus glass shattering on the buildings and other background effects.”

Property of ABC; screencap from the Zoic Television Reel.

Wreckage from the tanker explosion strikes an overturned car and knocks it off the overpass onto the freeway below. Zoic created the car in CG. “They shot everybody running up to the guardrail and looking over,” Meyer explains. “We had to remove the railing and put in our own CG railing, so when the car goes down it takes it with it. So we had the complicated roto of recreating the people’s bodies that were behind the railing. We had to rebuild lots of people’s legs and waists. We put in the smoke and stuff that dynamically reacts to the car, so when it gets sucked down it creates a vortex and pulls the smoke down. Also, there’s an orange cart right nearby. We try to get every detail right, so when the car goes down we have a couple of oranges that roll away with it.

Property of ABC; screencap from the Zoic Television Reel.

“Then we had the falling LAPD helicopter. We took a panoramic image of a building, so we’re working on one frame and can do a pan-and-tilt in post. The helicopter has already crashed into the building, and we needed to have the smoke barreling out through most of the sequence, with the rotor blades still spinning — then we get to a certain point, and there’s an explosion that pushes the helicopter out. It tumbles and it’s scraping the building, tearing it apart and opening it up.

“Roger Kupelian labored intensely on a matte painting of the inside of the building, with what would be exposed – wires, beams, pipes, office equipment. As the CG helicopter was falling down the face of the building and opening it up, our compositor just revealed it with little mattes. At the same time we threw in sparks, debris, dust and smoke. It ended up pretty good. It was tough making an animation that made everyone happy, but in the end it looked great on the big screen at the viewing.

“Some of the shots were fun, because you can really push the envelope — let’s see what happens when we do this or when we do that. It took a lot of planning and careful choreography between our 2D and 3D teams to keep the action and look continuous. Our teams worked tirelessly to create a seamless product, because anything out of place would be glaring.

“This isn’t ‘sci-fi’ with spaceships and aliens,” Meyer says, “which allows a bit of imagination – but rather real-life vehicles, smoke, fire, people and buildings that have to look real.”

More info: FlashForward on ABC.com; the latest Zoic Studios Television Reel on ZoicStudios.com; Steven Meyer on IMDb.

From ‘2001’ to ‘CSI’: Zoic Studios’ Rik Shorten on Motion Control for VFX

Originally published on I Design Your Eyes on 3/12/10.


In cinematography, motion control is the use of computerized automation to allow precise control of, and repetition of, camera movements. It is often used to facilitate visual effects photography.

I spoke with Rik Shorten, visual effects supervisor at Culver City, California’s Zoic Studios, about his use of motion control and how the technology has changed since it was introduced over three decades ago. Shorten produces motion-controlled effects for CBS’ visually groundbreaking forensic drama, CSI: Crime Scene Investigation. He recently took home a VES Award for Outstanding Supporting Visual Effects in a Broadcast Program for his work on the “frozen moment” sequence in CSI’s tenth-season opener.

“I didn’t work on the original 2001: A Space Odyssey,” Shorten says, “or back in the Star Wars days in the ’70s when computer-controlled cameras were first developed. But fundamentally, the way the technology works hasn’t changed. The rig I use almost weekly on CSI is the original rig from [David Lynch’s 1984 science-fiction film] Dune. The Kuper controller, the head that controls the rig, runs on MS-DOS. It’s a really old-school programming language that they used for these original systems, one that hasn’t really changed because it hasn’t had to. It’s a coordinate-based system, XYZ, and we can write the moves we need. The way we do it today is the same way they would have programmed it 20 years ago. So from that perspective, the technology hasn’t advanced.

The way the technology works hasn’t changed. The rig I use on CSI is the original rig from 1984’s Dune

“What has changed is that the rigs have gotten smaller, lighter and quieter. They now have the ability to run silently — they used to be so loud that you couldn’t record dialogue with them. Today we have smaller rigs that can fit through doorways – they’re made out of carbon fiber pieces now, they’re not the behemoths they used to be — and you can have actors delivering dialogue in the same scene as a motion-control move.

“So how we use them hasn’t changed, but slowly but surely they’ve made progress. Some weeks I use the old rigs versus the new ones, because that’s all I need, and the moves are simple enough. We have a system that’s 30 years old that we use side-by-side with a system built three or four years ago. And we interchange them based on the needs of the shot.

“As far as CSI, there are two ways we use motion control. We do stand-alone in-camera shots with the motion-control rig, where we’re flying over a prosthetic, or we’re traveling around a prop, and we need to get a macro shot; we use them a lot for macro photography. We have a couple of different snorkel lenses we use on the systems; one’s an endoscopic lens, and one is a probe lens, and they were both designed for medical photography. They’re both barrel lenses, about 12” long. The endoscopic lens is a fixed lens, sort of a wide-angle lens; it’s a tiny skinny little lens you can stick through a donut hole. If you see any shots where the camera goes in-between something where it seems like it shouldn’t go, that’s the lens we use.

“The probe lens is a little bit bigger, but it has multiple lens sizes, so we can go as wide as a 9mm or 12mm lens, for super-wide shots; right up to your prime lenses, your 22s, 25s, 30s, whatever it is. We use that in the same way, for getting into tight spots, and for getting macro, because the close-focus on these lenses is only about six inches. That’s a lot tighter than a normal lens can get.

“We use these cameras to get that sort of fantastical camera move that a Steadicam or a dolly couldn’t do. So when it’s got to rotate on three axes and fly in, that’s when we’ll program something in motion control. It’s like a 3D rendered camera, but we’re actually shooting it in real life. It frees us up to do more aggressive, creative moves.

Rik Shorten with Director of Photography David Drzewiecki (center), and an unidentified crew member and actress.

“The other way we use motion control is for multiple passes — like in the old days where they did three or four passes of the starship Enterprise with different lighting setups, and combined them all later. We don’t do much of that these days, at least in television; I’m sure they still do it in features. We use it for multiple layering. We’ll do the same scene with different elements in three or four passes, all broken apart with the same repeatable move; then we’ll put them all back together so we can affect the different elements in different ways.

“We do ghost shots every week. We’ll have a production plate without a foreground actor in it, just a background. We’ll track that plate here at Zoic. The data is then converted to XYZ coordinate data — ASCII files that MS-DOS can read off old-school 3½″ floppies — so the Kuper controller on the motion control rig can mimic the camera move from the track plate that we shot in first unit. When I put that data in, and I have my background plate and my video setup, I run them together and they’ll run at the same time. The camera will mimic what the first unit camera did.
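The hand-off Shorten describes is essentially a per-frame table of camera channels written out as plain text. The actual Kuper file format isn’t documented here, so the column layout below is purely a hypothetical sketch of the idea — one ASCII line of XYZ position plus pan and tilt angles per frame:

```python
def write_move_file(path, frames):
    """Write a tracked camera move as plain ASCII, one frame per line.

    `frames` is a list of dicts with x, y, z (position) and pan, tilt
    (angles in degrees). This column layout is invented for
    illustration; a real Kuper controller defines its own format.
    """
    with open(path, "w") as f:
        f.write("# frame  x  y  z  pan  tilt\n")
        for i, fr in enumerate(frames, start=1):
            f.write("%d %.4f %.4f %.4f %.3f %.3f\n"
                    % (i, fr["x"], fr["y"], fr["z"], fr["pan"], fr["tilt"]))

# Two frames of a simple push-in: camera six feet high, tilted down
# five degrees, starting 22 feet from the subject.
move = [
    {"x": 0.0, "y": 6.0, "z": 22.0, "pan": 0.0, "tilt": -5.0},
    {"x": 0.0, "y": 6.0, "z": 21.5, "pan": 0.0, "tilt": -5.0},
]
write_move_file("move.txt", move)
lines = open("move.txt").read().splitlines()
print(len(lines))  # header plus one line per frame -> 3
```

The appeal of a format like this is exactly what Shorten points to: it is simple enough that a thirty-year-old MS-DOS controller and a modern tracking pipeline can both read and write it.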

“Let’s say a guy is firing a gun in front of the greenscreen, and he’s supposed to be a ghost image superimposed into the scene. I’ll shoot him on greenscreen, with that tracked camera move; and then when I come back here to Zoic, I’ve got a motion control pass on the greenscreen, and I’ve got my first unit plate, and the two line up perfectly. That’s how we get all the stylized transition pieces, and all those layers that CSI uses to great effect, because we have the capacity to translate and then to reshoot at a later date using the motion control system.

Recreating a scene on the greenscreen that was shot in the field is always a challenge…

“The first time I saw this used was in [1996’s] Multiplicity with Michael Keaton – that’s when I saw this tech first exploding, having the same person in the same scene, over and over. There were a lot of production cheats used for years, with locked plates and simple split-screen; but this technology allows you to travel 360 degrees around somebody, and go into the scene and come back out of the scene; and people can cross and interact and do other things, that they could not do without this system. If you’re using live action elements for these high-concept shots, then motion control is the only way to do it.”

Shorten says that precision is an important issue, just as it was with traditional locked-plate shots. “Sometimes we don’t have the exact lens, we’re off by a few mils. Say they used a 50mm lens and I only have a 45mm, sometimes there’s a little eye matching that needs to happen. To say it’s plug-and-play is disingenuous. You need to understand the limitations of the system.
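The lens-substitution problem he mentions has a simple geometric core: in a pinhole approximation, the subject’s image size scales as focal length over distance, so swapping a 50mm for a 45mm means moving the camera closer by the same ratio to hold framing. A quick back-of-the-envelope check (the distances are assumed for illustration):

```python
def matched_distance(orig_focal_mm, new_focal_mm, orig_distance):
    """Camera-to-subject distance that preserves subject framing
    after a lens swap.

    Image size ~ focal / distance, so to hold framing:
        new_distance = orig_distance * new_focal / orig_focal
    (Pinhole approximation; perspective still shifts slightly,
    which is why eye matching is needed anyway.)
    """
    return orig_distance * new_focal_mm / orig_focal_mm

# Plate shot on a 50mm at 22 ft, but only a 45mm available on stage:
d = matched_distance(50, 45, 22.0)
print(round(d, 1))  # 19.8 -> move about 2.2 ft closer
```

The pinhole math only gets you into the neighborhood — because perspective changes with distance even when framing matches, the final line-up still comes down to the A-over-B eye match Shorten describes next.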

“Recreating a scene on the greenscreen that was shot in the field is always a challenge. You need to expect there’s going to be some compensation; you’re going to have to do a little eye matching, playing shots back and doing an A-over-B in our video assist, and then adjusting your frame rates and composition, adjusting the speed of the moves. A lot of times, even with the track data, we’ll have to make some on-the-fly compensations to get things to sit in there correctly.

“We do surveys on location, as far as distances to camera and understanding where the actors are supposed to be in the greenscreen instance. How far away from the camera is the actor supposed to be in the scene, that’s where we start. When we transfer the data, we have a general idea that the camera’s six feet high, it’s five degrees tilted up, and 22 feet from our subject. But when you get it in the studio and do the A-over-B, you might realize that you need to be zero instead of five degrees, or you need to be four feet closer, or you need to change your lens a little bit. The elements have to line up visually, not just by the numbers, so they’re actually going to work when you look at the images together.”

Shorten says that some problems with matching can be fixed digitally. “There is a lot we can work with digitally. It’s not very often we will shoot something in motion control, come back here and have to throw it out completely. Usually it’s salvageable, even if we’re off for some reason.”

Shorten’s greatest challenge is in helping the television production community become comfortable with motion control technology. “There is still a fear of using this technology, even though it’s been around for years, because it’s still considered to be the domain of feature films, and commercials and music videos that have more time and money than most productions believe they have.

Rik Shorten on the stage.

“There is an education process, that we’ve been quietly working on for a long time. ‘Motion control’ really is a four-letter word for a lot of production managers, who say ‘I don’t have the time, I don’t know the technology, I don’t know how to use it, I don’t know why I need it, and you’re going to kill my one-liner if I have to take five hours to set up a motion control shot. We just won’t do it.’ We run up against this all the time.

“And this is even on shows like CSI, which is comfortable with the technology. Every ninth day we have a motion control day, even if they are simple in-camera things. They understand it, but they will bounce shots. When I suggest taking my motion control off my second unit day, and putting it on set – as soon as I’m doing it with main unit actors, in the middle of the day when there’s 150 crew around, suddenly even shows that are comfortable with the technology get very nervous.

Definitely there’s a lot of apprehension, but it’s such a great technology…

“The hope is that with these smaller, quieter rigs, and with pre-viz and set surveys so that when we show up on set we know exactly where our rig is going to go, we can get in, get set up very quickly, and start rolling video takes to show a director within a couple of hours. We can have our pre-viz and our moves written, if we do our surveying correctly, so that we’re not starting from scratch. We don’t need a week to do a motion control move.

“Definitely there’s a lot of apprehension, but it’s such a great technology. These shots can’t be accomplished any other way, without costing too much. If you don’t shoot it this way, if you try to back into it later, you need all kinds of digital fixes and compromises. You spend money somewhere. Getting it in-camera, and doing as much as you can physically — for a lot of set-ups motion control is head-and-shoulders above any other technique, from a financial standpoint and for the way it’s going to come out for your show.

“It’s about trying to build that trust and that faith with productions. We’re not suggesting motion control because we want to noodle around with computer-controlled cameras; it’s because it really is the best way to achieve your shot, and get the elements we need to make something really dynamic for your show.”

More info: Zoic Studios Wins Big at 2010 VES Awards; Zoic Stops Time, Creates Historic ‘Frozen Moment’ Sequence for CBS’ ‘CSI’ Premiere.