‘FlashForward’ Flashback: Zoic Studios’ Steve Meyer on the Award-Nominated VFX for the Pilot

Posted by Erik Even in I Design Your Eyes on April 1, 2010

Property of ABC; screencap from the Zoic Television Reel.

Based on the science fiction novel by Robert J. Sawyer, ABC’s FlashForward tells the story of the aftermath of a bizarre global event. For 137 seconds, every person on Earth (except perhaps one) loses consciousness, and experiences visions of their own future.

The pilot episode presents the immediate aftermath of the worldwide blackout: millions of deaths from traffic collisions, crashed aircraft, and other accidents. Star Joseph Fiennes, portraying FBI agent Mark Benford, survives an auto wreck and looks out over a chaotic Los Angeles cityscape. Culver City, California’s Zoic Studios was tapped to create the disastrous tableau; the company’s work on the episode was nominated for two VES Awards, for Outstanding Supporting Visual Effects in a Broadcast Program and for Outstanding Created Environment in a Broadcast Program or Commercial.

From ‘2001’ to ‘CSI’: Zoic Studios’ Rik Shorten on Motion Control for VFX

Originally published on I Design Your Eyes on 3/12/10.


In cinematography, motion control is the use of computerized automation to allow precise control of, and repetition of, camera movements. It is often used to facilitate visual effects photography.

I spoke with Rik Shorten, visual effects supervisor at Culver City, California’s Zoic Studios, about his use of motion control and how the technology has changed since it was introduced over three decades ago. Shorten produces motion-controlled effects for CBS’ visually groundbreaking forensic drama, CSI: Crime Scene Investigation. He recently took home a VES Award for Outstanding Supporting Visual Effects in a Broadcast Program for his work on the “frozen moment” sequence in CSI’s tenth-season opener.

“I didn’t work on the original 2001: A Space Odyssey,” Shorten says, “or back in the Star Wars days in the 70s when computer-controlled cameras were first developed. But fundamentally, the way the technology works hasn’t changed. The rig I use almost weekly on CSI is the original rig from [David Lynch’s 1984 science-fiction film] Dune. The Kuper controller, the head that controls the rig, runs on MS-DOS. It’s a really old-school programming language that they used for these original systems, that hasn’t really changed because it hasn’t had to. It’s a coordinate-based system, XYZ, and we can write the moves we need. The way we do it today is the same way they would have programmed it 20 years ago. So from that perspective, the technology hasn’t increased.

The way the technology works hasn’t changed. The rig I use on CSI is the original rig from 1984’s Dune
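To make the coordinate-based approach concrete, here is a minimal sketch of the idea Shorten describes: a camera move written as keyframed XYZ positions (plus pan and tilt) and expanded into one pose per frame. The axis names, units, and linear interpolation are assumptions for illustration; the Kuper controller’s actual move language and file layout are not documented in this article.

```python
# Hypothetical sketch of a coordinate-based camera move: keyframed
# positions (x, y, z) and rotations (pan, tilt) are interpolated into
# one pose per frame, which a motion control rig could then step through.
# This is not the Kuper controller's actual move format.

def lerp(a, b, t):
    """Linear interpolation between a and b for 0 <= t <= 1."""
    return a + (b - a) * t

def interpolate_move(keyframes):
    """Expand sparse keyframes into one pose per frame.

    keyframes: list of (frame_number, pose_dict) pairs sorted by frame,
               where pose_dict maps axis name -> value.
    """
    poses = []
    for (f0, p0), (f1, p1) in zip(keyframes, keyframes[1:]):
        for f in range(f0, f1):
            t = (f - f0) / (f1 - f0)
            poses.append({axis: lerp(p0[axis], p1[axis], t) for axis in p0})
    poses.append(dict(keyframes[-1][1]))  # include the final keyframe itself
    return poses

if __name__ == "__main__":
    # A 48-frame (two seconds at 24 fps) push-in with a slight tilt change.
    move = [
        (0,  {"x": 0.0, "y": 1.8, "z": 6.0, "pan": 0.0, "tilt": -5.0}),
        (48, {"x": 0.0, "y": 1.5, "z": 2.0, "pan": 0.0, "tilt":  0.0}),
    ]
    for frame, pose in enumerate(interpolate_move(move)):
        print(frame, {axis: round(value, 3) for axis, value in pose.items()})
```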

“What has changed is that the rigs have gotten smaller, lighter and quieter. They now have the ability to run silently — they used to be so loud that you couldn’t record dialogue with them. Today we have smaller rigs that can fit through doorways – they’re made out of carbon fiber pieces now, they’re not the behemoths they used to be — and you can have actors delivering dialogue in the same scene as a motion-control move.

“So how we use them hasn’t changed, though slowly but surely they’ve made progress. Some weeks I use the old rigs versus the new ones, because that’s all I need, and the moves are simple enough. We have a system that’s 30 years old that we use side-by-side with a system built three or four years ago. And we interchange them based on the needs of the shot.

“As far as CSI, there are two ways we use motion control. We do stand-alone in-camera shots with the motion-control rig, where we’re flying over a prosthetic, or we’re traveling around a prop, and we need to get a macro shot; we use them a lot for macro photography. We have a couple of different snorkel lenses we use on the systems; one’s an endoscopic lens, and one is a probe lens, and they were both designed for medical photography. They’re both barrel lenses, about 12” long. The endoscopic lens is a fixed lens, sort of a wide-angle lens; it’s a tiny skinny little lens you can stick through a donut hole. If you see any shots where the camera goes in-between something where it seems like it shouldn’t go, that’s the lens we use.

“The probe lens is a little bit bigger, but it has multiple lens sizes, so we can go as wide as a 9mm or 12mm lens, for super-wide shots; right up to your prime lenses, your 22s, 25s, 30s, whatever it is. We use that in the same way, for getting into tight spots, and for getting macro, because the close-focus on these lenses is only about six inches. That’s a lot tighter than a normal lens can get.
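For a sense of what those focal lengths mean in practice, here is a quick back-of-the-envelope field-of-view calculation. The gate width is an assumption (roughly Super 35 in size); the article does not say which camera or format the show uses.

```python
# Back-of-the-envelope horizontal field of view for the focal lengths
# mentioned above, assuming a gate roughly Super 35 in width (~24.9 mm).
import math

def horizontal_fov_degrees(focal_length_mm, gate_width_mm=24.9):
    """Horizontal angle of view for a simple spherical (non-anamorphic) lens."""
    return math.degrees(2 * math.atan(gate_width_mm / (2 * focal_length_mm)))

for f in (9, 12, 22, 25, 30):
    print(f"{f:>2} mm lens -> roughly {horizontal_fov_degrees(f):.0f} degrees across")
```

Under that assumption a 9mm lens covers on the order of 108 degrees horizontally, versus roughly 45 degrees at 30mm, which is why the probe lens’s shortest focal lengths read as dramatically wide in those tight macro moves.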

“We use these cameras to get that sort of fantastical camera move that a Steadicam or a dolly couldn’t do. So when it’s got to rotate on three axes and fly in, that’s when we’ll program something in motion control. It’s like a 3D rendered camera, but we’re actually shooting it in real life. It frees us up to do more aggressive, creative moves.

Rik Shorten with Director of Photography David Drzewiecki (center); with unidentified crew member and actress.

“The other way we use motion control is for multiple passes — like in the old days where they did three or four passes of the starship Enterprise with different lighting setups, and combined them all later. We don’t do much of that these days, at least in television; I’m sure they still do it in features. We use it for multiple layering. We’ll do the same scene with different elements in three or four passes, all broken apart with the same repeatable move; then we’ll put them all back together so we can affect the different elements in different ways.

“We do ghost shots every week. We’ll have a production plate without a foreground actor in it, just a background. We’ll track that plate here at Zoic. The data is then converted to XYZ coordinate data — ASCII files that MS-DOS can read off old-school 3½" floppies — so the Kuper controller on the motion control rig can mimic the camera move from the tracked plate that we shot in first unit. When I put that data in, and I have my background plate and my video setup, I run them together and they’ll run at the same time. The camera will mimic what the first unit camera did.
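As a rough illustration of that conversion step, the sketch below writes per-frame camera data from a 3D track out as plain ASCII coordinate data, one line per frame. The column layout, axis names, and units are assumptions; the actual file format the Kuper controller reads is not described in this article.

```python
# Sketch of the conversion step: per-frame camera data from a 3D track is
# written out as plain ASCII coordinate data, one line per frame. The
# column layout, axis names, and units are assumptions.
import csv

def export_move_ascii(tracked_frames, path):
    """Write one whitespace-delimited line per frame: frame x y z pan tilt roll."""
    axes = ("x", "y", "z", "pan", "tilt", "roll")
    with open(path, "w", newline="") as fh:
        writer = csv.writer(fh, delimiter=" ")
        writer.writerow(["frame", *axes])
        for frame, cam in enumerate(tracked_frames):
            writer.writerow([frame, *(f"{cam[a]:.4f}" for a in axes)])

if __name__ == "__main__":
    # Two made-up frames standing in for data exported by a camera tracker.
    track = [
        {"x": 0.0, "y": 1.7, "z": 6.7, "pan": 0.0, "tilt": 5.0, "roll": 0.0},
        {"x": 0.1, "y": 1.7, "z": 6.6, "pan": 0.2, "tilt": 5.0, "roll": 0.0},
    ]
    export_move_ascii(track, "camera_move.txt")
```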

“Let’s say a guy is firing a gun in front of the greenscreen, and he’s supposed to be a ghost image superimposed into the scene. I’ll shoot him on greenscreen, with that tracked camera move; and then when I come back here to Zoic, I’ve got a motion control pass on the greenscreen, and I’ve got my first unit plate, and the two line up perfectly. That’s how we get all the stylized transition pieces, and all those layers that CSI uses to great effect, because we have the capacity to translate and then to reshoot at a later date using the motion control system.

Recreating a scene on the greenscreen that was shot in the field is always a challenge…

“The first time I saw this used was in [1996’s] Multiplicity with Michael Keaton – that’s when I saw this tech first exploding, having the same person in the same scene, over and over. There were a lot of production cheats used for years, with locked plates and simple split-screen; but this technology allows you to travel 360 degrees around somebody, and go into the scene and come back out of the scene; and people can cross and interact and do other things that they could not do without this system. If you’re using live action elements for these high-concept shots, then motion control is the only way to do it.”

Shorten says that precision is an important issue, just as it was with traditional locked-plate shots. “Sometimes we don’t have the exact lens, we’re off by a few mils. Say they used a 50mm lens and I only have a 45mm, sometimes there’s a little eye matching that needs to happen. To say it’s plug-and-play is disingenuous. You need to understand the limitations of the system.

“Recreating a scene on the greenscreen that was shot in the field is always a challenge. You need to expect there’s going to be some compensation; you’re going to have to do a little eye matching, playing shots back and doing an A-over-B in our video assist, and then adjusting your frame rates and composition, adjusting the speed of the moves. A lot of times, even with the track data, we’ll have to make some on-the-fly compensations to get things to sit in there correctly.

“We do surveys on location, as far as distances to camera and understanding where the actors are supposed to be in the greenscreen instance. How far away from the camera is the actor supposed to be in the scene, that’s where we start. When we transfer the data, we have a general idea that the camera’s six feet high, it’s five degrees tilted up, and 22 feet from our subject. But when you get it in the studio and do the A-over-B, you might realize that you need to be zero instead of five degrees, or you need to be four feet closer, or you need to change your lens a little bit. The elements have to line up visually, not just by the numbers, so they’re actually going to work when you look at the images together.”
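Those survey numbers lend themselves to a small worked example. The sketch below projects a subject into the frame from the camera height, tilt, and distance Shorten quotes, and shows how a few degrees of tilt or a few feet of distance shift the framing. The lens focal length, gate size, and actor height are assumptions added purely for illustration.

```python
# Worked example of the surveyed setup quoted above: camera six feet high,
# tilted up five degrees, 22 feet from the subject. Focal length, gate
# height, and actor height are assumptions for illustration only.
import math

def vertical_frame_position(cam_height_ft, tilt_up_deg, distance_ft,
                            point_height_ft, focal_mm=50.0, gate_height_mm=18.7):
    """Where a point lands vertically in frame: 0 = center, +1/-1 = top/bottom edge."""
    # Angle from the camera to the point, measured relative to the tilted optical axis.
    angle_to_point = math.atan2(point_height_ft - cam_height_ft, distance_ft)
    angle_from_axis = angle_to_point - math.radians(tilt_up_deg)
    # Pinhole projection onto the gate, normalized to half the gate height.
    return focal_mm * math.tan(angle_from_axis) / (gate_height_mm / 2)

head = 5.8  # assumed actor head height in feet
print(vertical_frame_position(6.0, 5.0, 22.0, head))   # surveyed setup
print(vertical_frame_position(6.0, 0.0, 22.0, head))   # "zero instead of five degrees"
print(vertical_frame_position(6.0, 5.0, 18.0, head))   # "four feet closer"
```

Even in this toy version, dropping the tilt from five degrees to zero moves the actor’s head from roughly halfway down the lower half of frame to near center, which is exactly the kind of discrepancy the on-set A-over-B eye match is there to catch.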

Shorten says that some problems with matching can be fixed digitally. “There is a lot we can work with digitally. It’s not very often we will shoot something in motion control, come back here and have to throw it out completely. Usually it’s salvageable, even if we’re off for some reason.”

Shorten’s greatest challenge is helping the television production community become comfortable with motion control technology. “There is still a fear of using this technology, even though it’s been around for years, because it’s still considered to be the domain of feature films, and commercials and music videos that have more time and money than most productions believe they have.

Rik Shorten on the stage.

“There is an education process that we’ve been quietly working on for a long time. ‘Motion control’ really is a four-letter word for a lot of production managers, who say ‘I don’t have the time, I don’t know the technology, I don’t know how to use it, I don’t know why I need it, and you’re going to kill my one-liner if I have to take five hours to set up a motion control shot. We just won’t do it.’ We run up against this all the time.

“And this is even on shows like CSI, which is comfortable with the technology. Every ninth day we have a motion control day, even if they are simple in-camera things. They understand it, but they will bounce shots. When I suggest taking my motion control off my second unit day, and putting it on set – as soon as I’m doing it with main unit actors, in the middle of the day when there’s 150 crew around, suddenly even shows that are comfortable with the technology get very nervous.

Definitely there’s a lot of apprehension, but it’s such a great technology…

“The hope is that with these smaller and quieter rigs, and with pre-viz and set surveys done so that when we show up on set we know exactly where our rig is going to go, we can get in, get set up very quickly, and start rolling video takes to show a director within a couple of hours. We can have our pre-viz and our moves written, if we do our surveying correctly, so that we’re not starting from scratch. We don’t need a week to do a motion control move.

“Definitely there’s a lot of apprehension, but it’s such a great technology. These shots can’t be accomplished any other way without costing too much. If you don’t shoot it this way, if you try to back into it later, you need all kinds of digital fixes and compromises. You spend money somewhere. Getting it in-camera, and doing as much as you can physically — for a lot of set-ups motion control is head-and-shoulders above any other technique, from a financial standpoint and for the way it’s going to come out for your show.

“It’s about trying to build that trust and that faith with productions. We’re not suggesting motion control because we want to noodle around with computer-controlled cameras; it’s because it really is the best way to achieve your shot, and get the elements we need to make something really dynamic for your show.”

More info: Zoic Studios Wins Big at 2010 VES Awards; Zoic Stops Time, Creates Historic ‘Frozen Moment’ Sequence for CBS’ ‘CSI’ Premiere.