Ten Famous Science Fiction Properties That Would Make Great VFX Movies — Part 4 ‘The Airtight Garage’


This is a series of posts discussing ten existing science fiction properties (from literature, animation, games and comics) that could serve as the basis for ground-breaking live-action VFX films and television shows. This time: Jean “Moebius” Giraud’s 1976 graphic novel
The Airtight Garage.

For an explanation of the choices for this list, see the first entry.

Number 7 of 10: The Airtight Garage (US title, comic, 1976), aka Le Garage Hermétique de Jerry Cornelius, Le Garage Hermétique de Lewis Carnelian

In the Before Time, in the Long Long Ago, in the late 1970s and 1980s, some movie execs decided it might be a good idea to make a few big-budget, effects-heavy comic book movies. So we had two classic films based on DC Comics characters. The first was Richard Donner’s 1978 Superman, a hammy cheese-fest that nonetheless managed to charm the audience, largely via Gene Hackman’s movie-saving charisma and Christopher Reeve’s unshakable determination to play a ridiculous character as seriously as possible. On the other hand, the producers spent a famously outsized share of the budget to hire Marlon Brando for what amounted to a cameo; and Margot Kidder gave a performance as Lois Lane that should have tipped off any competent psychiatrist that she was suffering from bipolar disorder and needed help.

The other was Tim Burton’s 1989 Batman, the first superhero film ever to capture the comic book fanboy’s love for the source material (in this case the uncredited Batman: The Dark Knight Returns by Frank Miller (1986), but that’s a fanboy rant for another blog post). Burton, following Miller’s lead, showed mainstream audiences that comic books can be dark, intellectual, weird, artistic and funny. And Jack Nicholson was a thespian ruminant, chewing the scenery and then chewing it again.

Ten Famous Science Fiction Properties That Would Make Great VFX Movies — Part 3 ‘Appleseed’

This is a series of posts discussing ten existing science fiction properties (from literature, animation, games and comics) that could serve as the basis for ground-breaking live-action VFX films and television shows. This time: Shirow Masamune’s manga and anime franchise Appleseed.

For an explanation of the choices for this list, see the first entry.

Number 8 of 10: Appleseed (manga: 1985-89; anime: 1988, 2004, 2007)

If there’s one thing modern CG can render with absolute realism, it’s hardware. From modern consumer automobiles, commercial aircraft and military vehicles to futuristic robots, mecha and spacecraft, VFX artists have mastered the art of heavy gear, from 1984’s The Last Starfighter to last year’s Avatar.

But the military hardware, vehicles and spacecraft in modern VFX movies and television shows and video games do not show as much creative variety as one might expect, given the nearly boundless flexibility of CG. Spacecraft usually look much like the USS Sulaco from 1986’s Aliens, which itself isn’t terribly original. The “APUs” in Avatar are nearly identical to the battlemechs from the BattleTech franchise, themselves inspired by anime mecha. And any time you see a BFG (Big “Effin’” Gun) or any other large military prop in a sci-fi film, TV show or video game, it seems to come from the same prop house or 3D model library as all the others.

This isn’t necessarily because production designers and VFX artists are lazy or unoriginal – there are creative and production concerns. If a giant futuristic space blaster looks exactly like what the audience expects a giant futuristic space blaster to look like, a filmmaker need not waste time explaining what it is. The same goes for spaceships – film-goers unfamiliar with sci-fi (are there any of those left?) might be confused by the giant, spherical spaceship at the end of the 2008 remake of The Day the Earth Stood Still (they were already confused by the plot); but will instantly recognize the alien ship in 2009’s District 9, given its resemblance to the bastard love child of the giant saucers from Close Encounters of the Third Kind (1977) and Independence Day (1996).

From ‘2001’ to ‘CSI’: Zoic Studios’ Rik Shorten on Motion Control for VFX

Originally published on I Design Your Eyes on 3/12/10.


In cinematography, motion control is the use of computerized automation to allow precise control of, and repetition of, camera movements. It is often used to facilitate visual effects photography.

I spoke with Rik Shorten, visual effects supervisor at Culver City, California’s Zoic Studios, about his use of motion control and how the technology has changed since it was introduced over three decades ago. Shorten produces motion-controlled effects for CBS’ visually groundbreaking forensic drama CSI: Crime Scene Investigation. He recently took home a VES Award for Outstanding Supporting Visual Effects in a Broadcast Program for his work on the “frozen moment” sequence in CSI’s tenth-season opener.

“I didn’t work on the original 2001: A Space Odyssey,” Shorten says, “or back in the Star Wars days in the 70s when computer-controlled cameras were first developed. But fundamentally, the way the technology works hasn’t changed. The rig I use almost weekly on CSI is the original rig from [David Lynch’s 1984 science-fiction film] Dune. The Kuper controller, the head that controls the rig, runs on MS-DOS. It’s a really old-school programming language that they used for these original systems, that hasn’t really changed because it hasn’t had to. It’s a coordinate-based system, XYZ, and we can write the moves we need. The way we do it today is the same way they would have programmed it 20 years ago. So from that perspective, the technology hasn’t increased.

The way the technology works hasn’t changed. The rig I use on CSI is the original rig from 1984’s Dune

“What has changed is that the rigs have gotten smaller, lighter and quieter. They now have the ability to run silently — they used to be so loud that you couldn’t record dialogue with them. Today we have smaller rigs that can fit through doorways – they’re made out of carbon fiber pieces now, they’re not the behemoths they used to be — and you can have actors delivering dialogue in the same scene as a motion-control move.

“So how we use them hasn’t changed, but slowly but surely they’ve made progress. Some weeks I use the old rigs versus the new ones, because that’s all I need, and the moves are simple enough. We have a system that’s 30 years old that we use side-by-side with a system built three or four years ago. And we interchange them based on the needs of the shot.

“As far as CSI, there are two ways we use motion control. We do stand-alone in-camera shots with the motion-control rig, where we’re flying over a prosthetic, or we’re traveling around a prop, and we need to get a macro shot; we use them a lot for macro photography. We have a couple of different snorkel lenses we use on the systems; one’s an endoscopic lens, and one is a probe lens, and they were both designed for medical photography. They’re both barrel lenses, about 12” long. The endoscopic lens is a fixed lens, sort of a wide-angle lens; it’s a tiny skinny little lens you can stick through a donut hole. If you see any shots where the camera goes in-between something where it seems like it shouldn’t go, that’s the lens we use.

“The probe lens is a little bit bigger, but it has multiple lens sizes, so we can go as wide as a 9mm or 12mm lens, for super-wide shots; right up to your prime lenses, your 22s, 25s, 30s, whatever it is. We use that in the same way, for getting into tight spots, and for getting macro, because the close-focus on these lenses is only about six inches. That’s a lot tighter than a normal lens can get.

“We use these cameras to get that sort of fantastical camera move that a Steadicam or a dolly couldn’t do. So when it’s got to rotate on three axes and fly in, that’s when we’ll program something in motion control. It’s like a 3D rendered camera, but we’re actually shooting it in real life. It frees us up to do more aggressive, creative moves.

Rik Shorten with Director of Photography David Drzewiecki (center); with unidentified crew member and actress.

“The other way we use motion control is for multiple passes — like in the old days where they did three or four passes of the starship Enterprise with different lighting setups, and combined them all later. We don’t do much of that these days, at least in television; I’m sure they still do it in features. We use it for multiple layering. We’ll do the same scene with different elements in three or four passes, all broken apart with the same repeatable move; then we’ll put them all back together so we can affect the different elements in different ways.

“We do ghost shots every week. We’ll have a production plate without a foreground actor in it, just a background. We’ll track that plate here at Zoic. The data is then converted to XYZ coordinate data — ASCII files that MS-DOS can read off old-school 3½“ floppies — so the Kuper controller on the motion control rig can mimic the camera move from the track plate that we shot in first unit. When I put that data in, and I have my background plate and my video setup, I run them together and they’ll run at the same time. The camera will mimic what the first unit camera did.
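The pipeline Shorten describes — a first-unit camera track converted to frame-by-frame XYZ coordinate data in a plain ASCII file the motion-control controller can read — can be sketched in miniature. To be clear, the column layout below is purely hypothetical; the article does not document the actual Kuper file format.

```python
# Hypothetical sketch of exporting a tracked camera move as frame-by-frame
# ASCII coordinate data, in the spirit of the workflow Shorten describes.
# The column layout is an assumption, NOT the real Kuper controller format.

def export_moco_ascii(frames):
    """Format per-frame camera data as fixed-width ASCII lines.

    `frames` is a list of (x, y, z, pan, tilt, roll) tuples, one per film
    frame, in whatever units the rig is calibrated to.
    """
    lines = ["FRAME       X       Y       Z     PAN    TILT    ROLL"]
    for i, (x, y, z, pan, tilt, roll) in enumerate(frames, start=1):
        lines.append(
            f"{i:5d} {x:7.2f} {y:7.2f} {z:7.2f} "
            f"{pan:7.2f} {tilt:7.2f} {roll:7.2f}"
        )
    return "\n".join(lines)

# A two-frame move: the camera dollies in half a unit while tilting up.
track = [(0.0, 6.0, 22.0, 0.0, 5.0, 0.0),
         (0.0, 6.0, 21.5, 0.0, 5.2, 0.0)]
print(export_moco_ascii(track))
```

The point of the exercise is the one Shorten makes: the data is just coordinates per frame, which is why a 30-year-old MS-DOS controller can still drive the rig from a modern 3D camera track.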

“Let’s say a guy is firing a gun in front of the greenscreen, and he’s supposed to be a ghost image superimposed into the scene. I’ll shoot him on greenscreen, with that tracked camera move; and then when I come back here to Zoic, I’ve got a motion control pass on the greenscreen, and I’ve got my first unit plate, and the two line up perfectly. That’s how we get all the stylized transition pieces, and all those layers that CSI uses to great effect, because we have the capacity to translate and then to reshoot at a later date using the motion control system.

Recreating a scene on the greenscreen that was shot in the field is always a challenge…

“The first time I saw this used was in [1996’s] Multiplicity with Michael Keaton – that’s when I saw this tech first exploding, having the same person in the same scene, over and over. There were a lot of production cheats used for years, with locked plates and simple split-screen; but this technology allows you to travel 360 degrees around somebody, and go into the scene and come back out of the scene; and people can cross and interact and do other things, that they could not do without this system. If you’re using live action elements for these high-concept shots, then motion control is the only way to do it.”

Shorten says that precision is an important issue, just as it was with traditional locked-plate shots. “Sometimes we don’t have the exact lens, we’re off by a few mils. Say they used a 50mm lens and I only have a 45mm, sometimes there’s a little eye matching that needs to happen. To say it’s plug-and-play is disingenuous. You need to understand the limitations of the system.

“Recreating a scene on the greenscreen that was shot in the field is always a challenge. You need to expect there’s going to be some compensation; you’re going to have to do a little eye matching, playing shots back and doing an A-over-B in our video assist, and then adjusting your frame rates and composition, adjusting the speed of the moves. A lot of times, even with the track data, we’ll have to make some on-the-fly compensations to get things to sit in there correctly.

“We do surveys on location, as far as distances to camera and understanding where the actors are supposed to be in the greenscreen instance. How far away from the camera is the actor supposed to be in the scene, that’s where we start. When we transfer the data, we have a general idea that the camera’s six feet high, it’s five degrees tilted up, and 22 feet from our subject. But when you get it in the studio and do the A-over-B, you might realize that you need to be zero instead of five degrees, or you need to be four feet closer, or you need to change your lens a little bit. The elements have to line up visually, not just by the numbers, so they’re actually going to work when you look at the images together.”

Shorten says that some problems with matching can be fixed digitally. “There is a lot we can work with digitally. It’s not very often we will shoot something in motion control, come back here and have to throw it out completely. Usually it’s salvageable, even if we’re off for some reason.”

Shorten’s greatest challenge is in helping the television production community become comfortable with motion control technology. “There is still a fear of using this technology, even though it’s been around for years, because it’s still considered to be the domain of feature films, and commercials and music videos that have more time and money than most productions believe they have.

Rik Shorten on the stage.

“There is an education process, that we’ve been quietly working on for a long time. ‘Motion control’ really is a four-letter word for a lot of production managers, who say ‘I don’t have the time, I don’t know the technology, I don’t know how to use it, I don’t know why I need it, and you’re going to kill my one-liner if I have to take five hours to set up a motion control shot. We just won’t do it.’ We run up against this all the time.

“And this is even on shows like CSI, which is comfortable with the technology. Every ninth day we have a motion control day, even if they are simple in-camera things. They understand it, but they will bounce shots. When I suggest taking my motion control off my second unit day, and putting it on set – as soon as I’m doing it with main unit actors, in the middle of the day when there’s 150 crew around, suddenly even shows that are comfortable with the technology get very nervous.

Definitely there’s a lot of apprehension, but it’s such a great technology…

“The hope is that with these smaller and quieter rigs, with the idea that we can do pre-viz and set surveys so that when we show up on set we know exactly where our rig is going to go, we can get in and get set up very quickly, and start rolling video takes to show a director within a couple of hours. We can have our pre-viz and our moves written, if we do our surveying correctly, so that we’re not starting from scratch. We don’t need a week to do a motion control move.

“Definitely there’s a lot of apprehension, but it’s such a great technology. These shots can’t be accomplished any other way, without costing too much. If you don’t shoot it this way, if you try to back into it later, you need all kinds of digital fixes and compromises. You spend money somewhere. Getting it in-camera, and doing as much as you can physically — for a lot of set-ups motion control is head-and-shoulders above any other technique, from a financial standpoint and for the way it’s going to come out for your show.

“It’s about trying to build that trust and that faith with productions. We’re not suggesting motion control because we want to noodle around with computer-controlled cameras; it’s because it really is the best way to achieve your shot, and get the elements we need to make something really dynamic for your show.”

More info: Zoic Studios Wins Big at 2010 VES Awards; Zoic Stops Time, Creates Historic ‘Frozen Moment’ Sequence for CBS’ ‘CSI’ Premiere.

Perfect “Harmony”: Zoic Creates VFX for Daytona 500 Coca-Cola NASCAR Spot

Originally published on I Design Your Eyes on 2/17/2010.


Eleven top NASCAR drivers are having a bad day, grumbling into their car radio mics. But once in the crew pit, each driver is offered a cold, refreshing bottle of Coca-Cola. Back on the track, the drivers are so exhilarated they begin singing “I’d Like to Buy the World a Coke,” as bewildered fans listen in over headphones.

The 60-second commercial, which also has two 30-second versions, premiered this last Sunday, Valentine’s Day, during the broadcast of the Daytona 500 on ESPN. It hearkens back to the 1971 commercial “Hilltop,” probably the most famous Coke commercial in history, which introduced the song. The new spot, entitled “Harmony,” features NASCAR drivers Greg Biffle, Clint Bowyer, Jeff Burton, Denny Hamlin, Kevin Harvick, Bobby Labonte, Joey Logano, Ryan Newman, David Ragan, Elliott Sadler and Tony Stewart.

See the “Harmony” spot here, at the end of a feature about the making of the commercial; the spot begins at 4:10.

The commercial does not appear to be effects-heavy, but appearances can be deceiving. It was assembled from a number of separate elements, including CG cars and digitally-altered stock footage. The VFX were created by Culver City, California’s Zoic Studios, which produces effects for commercials, feature films and episodic television, such as ABC’s V, FOX’s Fringe and CBS’ CSI: Crime Scene Investigation.

“The agency went to the NASCAR archives and pulled stock footage,” says Erik Press, Zoic’s executive producer of commercials, “and they cut together what they envisioned as a race.

“Then they filled it in with close-ups of the actual drivers, which were shot on the racetrack in Charlotte, North Carolina. Those were inserted in the edit. [Commercial creative director] Les Ekker shot back plates for footage outside of the vehicles. Our task was to take stock footage, interiors of drivers, and plates of driving shots, and mix them all together and make them appear as one entire race.”


“Mostly the work consisted of taking their ‘hero’ celebrity drivers, and generating driving plates,” explains Neil Ingram, a Zoic producer.

“They wanted us to make these moments inside the car feel like ‘found’ footage, like you’re tapping into the live feed while they’re driving. Part of a NASCAR race is that you can rent headphones and listen to the realtime exchanges of the drivers and the crews. The spectators that we cut away to are listening to the radios, and they’re bewildered by the fact that these drivers are all singing together.

“First we had to make the interior driving shots look realistic. Then we had to work on a degradation look, to make the shots match the practical realtime images that are actually from the cars; there are some of those shots in the spot.

“We had some CG augmentation on shots, and then ran it through compression. The cameras they use in the cars are ICONIX — they shoot back realtime images to a broadcast tower. They’re true HD cameras, but they get compressed with MPEG-2 compression. So we did some experimentation with different levels of MPEG and JPEG damage, to match the look. But these are celebrity drivers and these are product shots, so we had to find a balance between not getting too much degradation, but making them still feel ‘found.’”
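The “damage level” tuning Ingram describes can be illustrated with a toy. Real MPEG-2 artifacts come from block-based DCT quantization, but even a crude per-pixel luma quantizer shows how a single parameter trades image fidelity for a “found footage” feel. This is an illustrative sketch only, not Zoic’s actual process.

```python
# Toy stand-in for codec-damage matching: snapping 0-255 luma values to
# coarser steps produces banding, loosely analogous to how heavier MPEG/JPEG
# quantization degrades an image. Illustrative only, not a real codec.

def quantize_luma(pixels, step):
    """Snap 0-255 luma values to multiples of `step` (larger = more damage)."""
    return [min(255, (p // step) * step) for p in pixels]

row = [12, 13, 14, 120, 121, 122, 250, 251, 252]
print(quantize_luma(row, 1))   # step 1: values pass through untouched
print(quantize_luma(row, 32))  # step 32: heavy banding, detail collapses
```

Dialing `step` up and down mirrors the balancing act Ingram mentions: enough degradation to sell the live-feed look, not so much that the celebrity drivers and product shots fall apart.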

“It was a fun job,” says Zoic co-founder Chris Jones, who was creative director for the VFX. “It has all the good elements for a visual effects spot: full-CG cars; full-CG dynamics; full-CG tracks; a lot of clean-up and footage matching; a lot of greenscreen; live-action plates; stock footage integration – it runs the whole range of VFX. It came together well – it’s a really satisfying piece. I’m pleased with it.”

Press says the production was a very positive experience for everyone involved. “It is really sort of an iconic Coca-Cola spot, with ‘I’d Like to Buy the World a Coke.’ They haven’t brought that theme back for some time.

“It was a really smooth production, it went really well. The agency was very happy. It was smooth for them as well — we were always right behind them, providing for them. A really positive experience.”

The spot was directed by Mike Long for Epoch Films; and edited by Matthew Hilbert of Joint Editorial House, Portland.


More info: “Coca-Cola Harmony – Behind The Scenes With The New Ad” on the Coca-Cola Conversations blog; Coca-Cola “Harmony” on Youtube; Coca-Cola “Hilltop” on Youtube.

Syd Dutton: Matte Painting from Traditional to Digital

Originally published on I Design Your Eyes on 12/4/09.


It’s a cliché to call an artist “legendary,” but sometimes the word fits. Syd Dutton has been a leading matte painter for film and television for over three decades. His credits include Dune, The Running Man, Total Recall, the Addams Family films, Batman Forever, Star Trek: First Contact and Nemesis, U-571, The Fast and the Furious, The Time Machine, The Bourne Identity, and Serenity.

The Emmy Award winner co-founded Illusion Arts in 1985; over the following decades the company created thousands of shots and matte paintings for more than 200 feature films. When Illusion Arts shut its doors earlier this year, Dutton and Zoic embraced the opportunity to collaborate, and Dutton became part of the team.

When I sat down to interview Dutton over coffee, it was with the intention of putting together some kind of grand post about the history of matte painting. But it’s far more interesting to let Syd speak for himself.


When I looked over your credits, I saw you worked on one of my favorite movies, Real Genius (1985).

That’s one of my favorites too. We did a practical matte for the B-1 bomber. [Val Kilmer and Gabe Jarret sneak onto a B-1 bomber on a military base.] In those days it was still paint on glass, and the line for what was supposed to be the underbelly of the bomber had to be really sharp. But we were shooting at night. In order to black out the film, to do an original negative — you know what original negative work is, we needed two exposures — to get a real sharp line the matte had to be 50 feet out and 40 feet long. We spent several hours making it, putting cardboard where the belly had to go, making sure people would be underneath that line all the time. It was pretty fun.

But that was the coldest night of my entire career. I’ve been to some cold places, like Prague, but on that Van Nuys tarmac, that was the coldest ever.


You also worked on David Lynch’s original Dune (1984). Can you tell me about the matte painting of the Harkonnen city on Giedi Prime?

Basically this was just a big painting. The people who are moving around were shot in a parking lot at Estudios Churubusco in Mexico City. Just one smoke and fire element was used over and over again. And then a car on a cable goes through the shot. The car was created by model maker Lynn Ledgerwood (The Bourne Identity), and measured about 8″ long.

The difficult thing about that was my partner and Al wanted to keep [film] as much as possible from being duplicated… we wanted to keep the painting on original film. So [we were] shooting the painting and making a very crude motion control frame-by-frame move; taking the film into an optical printer, trying to match the move through the optical printer; and then we put the people in and the smoke and the cable car. So we had to do a lot of adjustments, and we found that it had to be so exact, if we waited to shoot in the afternoon, the concrete floor had expanded. We had to shoot at a certain time in the morning, before the expansion occurred. So it was complicated, but we seemed to have lots of time in those days, and it was a fun painting to work on.

David Lynch would come by when I was painting it, and he would say “I like it, I want it dirtier.” He was always a nice guy, really a gentleman.

In your experience, what is the difference between working with traditional mattes versus digital?

There was a wonderful thing about doing original negative matte shots. You had to prepare the shot, and then you had to be committed to a matte line.

You had a whole bunch of test footage, and when the painting was completed you had to re-expose the same film, and hope that light bulbs didn’t burn out when you were shooting, or that the glass didn’t break. But it had a completeness to it, and so when you finished a matte shot, and when it came out like the ones in Real Genius — I thought they came out pretty well — there’s a great sense of completeness.

You made a long matte, you worked out the problems, you’ve been cold, you’ve endured that process, and you’ve gone through the photochemical process of developing the pieces of film, and working the matte line until it has disappeared. And finally you take a deep breath and expose the two or three good takes that the director likes. And you put the worst one through first to make sure everything is working well, show it to the director, and then put the hero take through. Of course nobody had seen any footage unless they were shooting a B-cam, which they never did; and so it was kind of like the Catholic Church, where the director had to trust you that in two months or so you would have a finished product they would approve and like.


Did you ever experience any disasters in that realm?

The only near disaster was when Tony Perkins — we were the first shot up on Psycho III (1986), that he was directing, and he was a real nice guy — and the first shot was this girl leaving a nunnery, and there was a piece of string that she had to follow so she would be on the [right side of the matte line].

And we had everything ready, the camera blocked off, and Tony Perkins came on the set. He came over to the camera, he looked through the lens and said “that’s perfect,” and then put all his body weight on the camera to lift himself up. And we said “we can roll in a few seconds!” I didn’t want to say “oh, you just [expletive deleted] up the shot!” We said, “oh, we need just a few more minutes of adjustment.” So we lied – we had to reset the matte, readjust the camera. Telling him he had just screwed up the first shot of his movie would have been really bad luck. But that’s another shot that turned out well, I liked that shot a lot.

I can’t remember who the production designer was, I think it was Henry Bumstead [it was]. Everyone should know who Henry Bumstead was. He just died a while ago; he worked until 90, and died when he was 91 [in 2006]. He was Clint Eastwood’s favorite production designer, and in his 80s he designed Unforgiven — beautiful production design. Henry always made everything easy. Of course he had worked on Vertigo (1958).

According to IMDb, your first movie was Family Plot with Alfred Hitchcock?

Well, that was uncredited. I was hired by Albert Whitlock to work on The Hindenburg (1975) as a gopher, primarily, but then I came up with some ideas of my own, and Al liked them; so after Hindenburg Al made me his assistant. And Family Plot was again Henry Bumstead. Al really didn’t want to do the matte shot because he felt that it was – Hitchcock just wanted to show [actress] Karen Black what a matte shot was. It was a police station in San Francisco, a pretty easy matte shot; adding a second story, putting some what I would call “intelligent nonsense” in the background. So I painted that.

You had a fine art background?

Yeah. I went to Berkeley. Had a master’s degree. Had a wonderful time. Everything I learned, except for a sense of color, was totally useless when it came to matte painting. But it was still good to have that background. The best thing about going to Berkeley in those days was everyone wanted to be in San Francisco in the ‘60s. So I met people like Mark Rothko, pretty famous painters.

So what was it like to transition to digital, to have to train?

Oh, for me it was really hard. Rob Stromberg (2012, Avatar) was working for me at the time, and he embraced it really fast. I was just sort of afraid of it. I got used to it – it took me a while.

The people I know who were able to make the transition faster were people who like to draw things out. Bob Scifo (Fantastic Four: Rise of the Silver Surfer, The Abyss) for example is a wonderful matte painter. He has worked here for Zoic a couple of times. He came from the school where you drew everything out, and then painted it in. But he still got this incredible emotional result.

The way I learned to paint was the way Al Whitlock painted and Peter Ellenshaw (20,000 Leagues Under the Sea, The Black Hole) painted — you just started painting. Sometimes you didn’t know what you were going to paint, exactly; you knew what the subject matter was going to be — it might be a castle — but you just push paint around, and you start seeing things materialize – oh! I can see it now – and you let it dry, and try to bring all of it out of the fog. And that was a wonderful way to paint.

And in Photoshop, at least in the beginning, I couldn’t paint that way. I couldn’t make a big mess – it just stayed a big mess, I couldn’t refine it. The only way I could discover things and make a big mess was with Corel Painter; you can blend colors together and have accidents happen. And then at that point I usually finish the work in Photoshop.

When painting matte backgrounds now, you’re painting a painting, but there’s also the approach where you’re creating a 3D environment and making a 2D image from that.

Yeah. And there’s also projection – projecting a 2D painting onto objects. That’s another way to get camera movement. There’s no such thing anymore as a locked-down shot — that’s what matte paintings used to be. You would do everything in the world to make sure the camera didn’t move. And now people consider it a locked-off shot if they just hold the camera steady.
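The projection technique Dutton mentions rests on ordinary pinhole projection: once the 2D painting is “stuck” onto simple geometry, re-rendering each scene point from a moved virtual camera yields parallax, and hence camera movement. A minimal sketch of the underlying math, with illustrative names:

```python
# Pinhole projection in miniature: where does a 3D scene point (e.g. on the
# geometry carrying a projected matte painting) land in screen space for a
# given camera position? Simplified: the camera looks straight down +Z with
# no rotation; function and variable names are illustrative.

def project(point, cam_pos, focal):
    """Pinhole-project a 3D point (x, y, z) into 2D screen space.

    `cam_pos` is the camera position; returns (u, v) in the same
    units as `focal`.
    """
    x, y, z = (p - c for p, c in zip(point, cam_pos))
    if z <= 0:
        raise ValueError("point is behind the camera")
    return (focal * x / z, focal * y / z)

# The same scene point seen from the original camera and a dollied one:
p = (1.0, 2.0, 10.0)
print(project(p, (0.0, 0.0, 0.0), 35.0))  # original camera position
print(project(p, (0.0, 0.0, 4.0), 35.0))  # camera moved 4 units closer
```

Because near and far points shift by different amounts as the camera moves, the once strictly locked-down matte painting appears to have real depth.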

In the early days, you got to go out on location, sometimes to some really adventurous environments – a rock in the middle of some bay in Mexico; on a hillside in Europe somewhere. It was very physical, so you had that physical part. That part is now gone. Now I have to exercise to stay in shape, rather than just work. It was kind of dangerous, really – I didn’t think about it at the time.

There are no circumstances where they want you to go out and see the original location?

Not anymore. The visual effects supervisor will go to the locations, take photographs. He becomes the point man for every other department.

Does that feel like less involvement on your part?

Well, that’s the trade-off. The trade-off is that we can do now what we used to dream about doing. Which was, wouldn’t it be great if we could paint a grand, futuristic city and loop through it? Wouldn’t it be great if we could have a huge crowd running towards us in that shot? Rotoscope in a thousand people or something?

Things that we used to dream about, we can do now, but the trade-off is we don’t get to be as involved in the production as we once were. I talked to [Zoic co-founder] Loni [Peristere] about that. I said I feel bad for some of the kids here, that they’ll never be on the stage. It’s fun to be on location. He said the trade-off was they have all the tools to make their own movies. So, everything has a trade-off.

More info: Syd Dutton on IMDb.

Zoic Breathes Life Into Cartoon Network’s ‘Ben 10: Alien Swarm’

Originally published on I Design Your Eyes on 11/27/09.

Ben 10: Alien Swarm

This week, Cartoon Network premiered Ben 10: Alien Swarm, its second live-action movie based on the popular animated children’s series Ben 10: Alien Force. Alien Swarm is the sequel to the first live action film, Ben 10: Race Against Time; both were directed by Alex Winter (Freaked, Fever).

Alien Swarm continues the story of ten-year-old Ben Tennyson, an ordinary boy who becomes part of a secret organization called “the Plumbers,” which fights alien threats. He possesses a wristwatch-like device called the Omnitrix, which allows its wearer to take the physical form of various alien species. Ben, now a teenager and played by 23-year-old Ryan Kelley (Smallville), defies the Plumbers to help a mysterious childhood friend find her missing father.

Winter, an experienced director more familiar to fans as an actor from the Bill & Ted films and The Lost Boys, chose effects supervisor Evan Jacobs (Resident Evil: Extinction, Ed Wood) to oversee the movie’s many effects sequences. Jacobs worked with Culver City, California’s Zoic Studios to produce character animation and particle work for a number of key scenes.

Ben as Big Chill, using his freeze breath.

Zoic worked on three main characters – Kevin “Kevin 11” Levin (Nathan Keyes, Mrs. Washington Goes to Smith), an alien-human hybrid who can absorb properties of matter; Ben’s cousin Gwen Tennyson (Galadriel Stineman, Junkyard Dog), another hybrid who manipulates energy; and Big Chill, one of Ben’s alien forms, a creature that breathes ice.

Zoic’s Executive Creative Director, Andrew Orloff (V, Fringe), says that for the production, the filmmakers chose to stay away from motion capture as “too limiting.” With all the jumping, flying and other stunt work required, performers hanging from wires would not have produced as realistic a result as traditional keyframing, in which the animator poses the character by hand at key frames and the software computes the in-between frames. “All the characters were traditionally keyframed and match moved by hand,” Orloff says.
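The core idea behind keyframing can be sketched in a few lines of plain Python: the animator sets values at chosen key frames, and the software interpolates every frame in between. This is a toy illustration, not anything from Zoic's actual pipeline; real animation packages use spline curves rather than the linear interpolation shown here.

```python
def interpolate_keyframes(keys, frame):
    """Linearly interpolate one animated value between keyframes.

    keys: sorted list of (frame_number, value) pairs set by the animator.
    frame: the frame to evaluate; in-between values are computed, not stored.
    """
    if frame <= keys[0][0]:
        return keys[0][1]
    if frame >= keys[-1][0]:
        return keys[-1][1]
    for (f0, v0), (f1, v1) in zip(keys, keys[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)  # 0..1 position between the two keys
            return v0 + t * (v1 - v0)

# Example: a wing rotation keyed at frames 0, 12 and 24
wing = [(0, 0.0), (12, 45.0), (24, 0.0)]
print(interpolate_keyframes(wing, 6))  # 22.5, halfway to the raised pose
```

Keyframing every character by hand this way is slower than motion capture, but it gives animators full control over movement no wire rig could perform.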

Orloff collaborated with Winter and Jacobs to turn Big Chill from the cartoon, a Necrofriggian from the planet Kylmyys, into a realistic, breathing 3D character. Working with a model created by Hollywood, California’s Super 78 Studios, Orloff developed character, motion and flight studies for Big Chill before the filmmakers ever hit the soundstage.

“It was very important to Alex [Winter] that we stay true to the original series, and give it a little something extra for the live action series that’s a real surprise for the viewers, to see their beloved cartoon characters finally brought to life,” Orloff says.

Gwen blasts the alien swarm, as Big Chill hovers nearby.

“Based on the visual choreography of the scenes, we didn’t really do previsualization as pre-development of the character. We talked about the way that [Big Chill] can fly, the maneuvers it could do; and that allowed Alex to have in his mind at the storyboard phase a good idea of what the kind of movement of the character was going to be.

“He’s a seven foot tall flying alien, so to create that realism was definitely a challenge. To take a two-dimensional character and turn it into a three-dimensional character, you have to maintain the integrity of the two-dimensional design, but make it look as if it’s realistically sitting in the environment. So we added a lot of skin detail, we added a lot of muscle detail and sinews; it was tricky to get the lighting of the skin exactly right. We just had to make sure that the skin had that ‘alien’ quality, so it didn’t look like a mannequin or an action figure. We wanted to give a realistic feel to the skin using Maya/mental ray to render that subsurface scattering.”

Much of the footage with Big Chill involved the character flying and fighting inside a warehouse. It wasn’t possible to shoot plates that would track exactly with the as-yet unrendered character, and the filmmakers could only guess how the character would move, and how quickly. So Jacobs provided Zoic with plates covering a variety of different moves, plus some very high-resolution 360° panoramas of the warehouse interior. Zoic then used these materials to produce its own plates, rebuilding the warehouse from the set photos and creating the shots needed to flesh out the sequence. The process was time-consuming and difficult, as much of the blocking and choreography was highly detailed.

In addition to designing the character’s movements and rendering his actions, Zoic created the freeze breath effects for Big Chill. The character’s power required two kinds of effects. First, Zoic used heavy-duty particle and fluid simulations in Maya and mental ray to create the chunks of ice, smoke and liquid nitrogen that blast from Big Chill’s mouth. Then Zoic produced quite a bit of matte painting work to encase objects in ice, icicles and frost. The frosted objects included the chip swarm tornado; the interior of the warehouse; and the villain, Victor Validus (Herbert Siguenza, Mission Hill).

Kevin, having taken on the properties of the metal girder, attacks the alien swarm.

The main antagonist in Alien Swarm is the alien swarm itself, a cloud of thousands of intelligent, flying alien chips that work together to harm the good guys.

The alien swarm was also created in Maya and mental ray. According to Orloff, “there needed to be thousands of chips that swarmed with a random yet directed attack. The idea that the chips were learning, so they would group together – first they try to go at Gwen and the kids, and Gwen blasts them away — then they reconfigure into a buzz saw and try to attack the kids that way — then they configure into a large tornado – you have to give a personality, but an evolving personality, to a swarm of objects.” In predevelopment, Zoic looked at fish schooling and insect swarming behaviors in nature, to give the swarm movement that felt organic without seeming contrived.
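The schooling and swarming behaviors Zoic studied are commonly modeled with a boids-style simulation: each agent steers by a few local rules (cohesion, separation, alignment), and coordinated group motion emerges without any central choreography. The 2D toy below is an assumption-laden sketch of that general technique, not a reconstruction of Zoic's Maya setup; the rule weights are arbitrary.

```python
import random

def step_swarm(positions, velocities, dt=1.0):
    """One update of a toy boids swarm: cohesion, separation, alignment."""
    n = len(positions)
    new_vel = []
    for i in range(n):
        px, py = positions[i]
        vx, vy = velocities[i]
        # Cohesion: steer toward the center of the other boids.
        cx = sum(p[0] for j, p in enumerate(positions) if j != i) / (n - 1)
        cy = sum(p[1] for j, p in enumerate(positions) if j != i) / (n - 1)
        vx += 0.01 * (cx - px)
        vy += 0.01 * (cy - py)
        # Separation: push away from any boid that is too close.
        for j, (qx, qy) in enumerate(positions):
            if j != i and abs(qx - px) + abs(qy - py) < 2.0:
                vx += 0.05 * (px - qx)
                vy += 0.05 * (py - qy)
        # Alignment: drift toward the flock's average heading.
        ax = sum(v[0] for v in velocities) / n
        ay = sum(v[1] for v in velocities) / n
        vx += 0.05 * (ax - vx)
        vy += 0.05 * (ay - vy)
        new_vel.append((vx, vy))
    new_pos = [(p[0] + v[0] * dt, p[1] + v[1] * dt)
               for p, v in zip(positions, new_vel)]
    return new_pos, new_vel

# Ten chips scattered at random drift into coordinated motion over 50 steps.
random.seed(1)
pos = [(random.uniform(0, 20), random.uniform(0, 20)) for _ in range(10)]
vel = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(10)]
for _ in range(50):
    pos, vel = step_swarm(pos, vel)
```

Layering goal-directed targets (Gwen, the kids, the buzz-saw or tornado formations) on top of these local rules is what gives a swarm the “evolving personality” Orloff describes.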

Zoic also produced the effects for Kevin, who absorbs the properties of matter from objects. “Kevin was a big challenge,” Orloff says, “because what we ended up doing was scanning the actor; as he touched something we would put a CG version of the model over the top of him; rotoscope those few frames where the transition occurs; take that model and map it with whatever the material was – a rusty metal beam, a wood desk, a concrete floor. We rotoscoped the CG version over the top until the transformation was done, and then we transitioned from the rotoscoped animation, based on the actor’s performance, to a fully CG character animation.”
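The handoff Orloff describes, a rotoscoped CG overlay crossfading into full CG animation, amounts to ramping a per-frame blend weight between the two rendered passes over the transition window. This is a hypothetical simplification; the frame range and pixel values are placeholders, not figures from the production.

```python
def blend_weight(frame, start, end):
    """Blend weight from the roto-matched pass (0.0) to the full-CG pass (1.0)."""
    if frame <= start:
        return 0.0
    if frame >= end:
        return 1.0
    return (frame - start) / (end - start)

def blend_passes(roto_value, cg_value, frame, start, end):
    """Crossfade one pixel value between the two passes at a given frame."""
    w = blend_weight(frame, start, end)
    return (1 - w) * roto_value + w * cg_value
```

Before `start`, the audience sees the actor's performance under the mapped CG shell; after `end`, the shot is fully keyframed CG, and in between the two are mixed.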

The energy manipulation effects for Gwen were “a ‘two-and-a-half-D’ effect, using 3D particle generators and 3D scene-tracked cameras in Adobe After Effects to create the energy bolts and energy fields that Gwen uses. We wanted to give it a ‘Jack Kirby’ kind of energy feel. So it has a lot of character to it, it looks very organic, and it affects the background objects and produces heat ripple effects.”

Frost effects in the warehouse. All of the frost and ice are VFX.

While Zoic was providing visual effects for the movie, Zoic’s Design Group worked directly with Vincent Aricco and Heather Reilly from Cartoon Network’s On-Air department, developing both the show and promo packaging for Ben 10: Alien Swarm. The package was used to promote the film both on-air and online – as well as in the Comic-Con preview this past summer.

Design Group Creative Director Derich Wittliff worked with Zoic’s internal production team, led by Producer Scott Tinter and Designer Darrin Isono, creating 3D environments and models based on the movie’s 2D logo and other references from the film. Elements were created in Maxon Cinema 4D, Autodesk Maya and Adobe After Effects. The final product was a show open and a modular promo toolkit that allowed Cartoon Network’s in-house team to create custom endpages, IDs, bumpers, and other elements.

Because the Zoic Design Group worked under the same roof as the team that produced effects for Alien Swarm, it had access to the best elements from the show, like the “swarm” effect itself, as soon as they were created. That efficiency let the group deliver finished elements for special uses, like Comic-Con, far in advance of customary production schedules.

Zoic Design Group Executive Producer Miles Dinsmoor says Zoic was excited to have the opportunity to work directly with Cartoon Network, acting as both a visual effects and digital production studio for the main production, and as a creative design shop for the promotional package, exploiting Zoic’s fully integrated media and design department. His goal is to offer Zoic’s in-house design and creative expertise industry-wide, and not just to Zoic’s existing VFX clients.

Orloff says he is proud of the work Zoic did on Ben 10: Alien Swarm, and looks forward to future collaboration with everyone involved – and hopefully, another Ben 10 movie.

More info: Ben 10: Alien Swarm at Cartoon Network; on Amazon.