Ten Famous Science Fiction Properties That Would Make Great VFX Movies — Part 4 ‘The Airtight Garage’


This is a series of posts discussing ten existing science fiction properties (from literature, animation, games and comics) that could serve as the basis for ground-breaking live-action VFX films and television shows. This time: Jean “Moebius” Giraud’s 1976 graphic novel The Airtight Garage.

For an explanation of the choices for this list, see the first entry.

Number 7 of 10: The Airtight Garage (US title, comic, 1976), aka Le Garage Hermétique de Jerry Cornelius, Le Garage Hermétique de Lewis Carnelian

In the Before Time, in the Long Long ago, in the late 1970s and 1980s, some movie execs decided it might be a good idea to make a few big-budget effects-heavy comic book movies. So we had two classic films based on DC Comics characters. The first was Richard Donner’s 1978 Superman, a hammy cheese-fest that nonetheless managed to charm the audience, largely via Gene Hackman’s movie-saving charisma and Christopher Reeve’s unshakable determination to play a ridiculous character as seriously as possible. On the other hand, the producers spent literally one-third of the $60 million budget to hire Marlon Brando for a cameo; and Margot Kidder gave a performance as Lois Lane that should have tipped off any competent psychiatrist that she was suffering from bipolar disorder and needed help.

The other was Tim Burton’s 1989 Batman, the first superhero film ever to capture the comic book fanboy’s love for the source material (in this case Frank Miller’s uncredited 1986 Batman: The Dark Knight Returns, but that’s a fanboy rant for another blog post). Burton, following Miller’s lead, showed mainstream audiences that comic books can be dark, intellectual, weird, artistic and funny. And Jack Nicholson was a thespian ruminant, chewing the scenery and then chewing it again. Continue reading

Ten Famous Science Fiction Properties That Would Make Great VFX Movies — Part 3 ‘Appleseed’

This is a series of posts discussing ten existing science fiction properties (from literature, animation, games and comics) that could serve as the basis for ground-breaking live-action VFX films and television shows. This time: Shirow Masamune’s manga and anime franchise Appleseed.

For an explanation of the choices for this list, see the first entry.

Number 8 of 10: Appleseed (manga: 1985-89; anime: 1988, 2004, 2007)

If there’s one thing modern CG can render with absolute realism, it’s hardware. From modern consumer automobiles, commercial aircraft and military vehicles to futuristic robots, mecha and spacecraft, VFX artists have mastered the art of heavy gear, from 1984’s The Last Starfighter to last year’s Avatar.

But the military hardware, vehicles and spacecraft in modern VFX movies and television shows and video games do not show as much creative variety as one might expect, given the nearly boundless flexibility of CG. Spacecraft usually look much like the USS Sulaco from 1986’s Aliens, which itself isn’t terribly original. The “APUs” in Avatar are nearly identical to the battlemechs from the BattleTech franchise, themselves inspired by anime mecha. And any time you see a BFG (Big “Effin’” Gun) or any other large military prop in a sci-fi film, TV show or video game, it seems to come from the same prop house or 3D model library as all the others.

This isn’t necessarily because production designers and VFX artists are lazy or unoriginal – there are creative and production concerns. If a giant futuristic space blaster looks exactly like what the audience expects a giant futuristic space blaster to look like, a filmmaker need not waste time explaining what it is. The same goes for spaceships – film-goers unfamiliar with sci-fi (are there any of those left?) might be confused by the giant, spherical spaceship at the end of the 2008 remake of The Day the Earth Stood Still (they were already confused by the plot); but will instantly recognize the alien ship in 2009’s District 9, given its resemblance to the bastard love child of the giant saucers from Close Encounters of the Third Kind (1977) and Independence Day (1996). Continue reading

Ten Famous Science Fiction Properties That Would Make Great VFX Movies — Part 2 ‘Erma Felna EDF’

This is a series of posts discussing ten existing science fiction properties (from literature, animation, games and comics) that could serve as the basis for ground-breaking live-action VFX films and television shows. This time: the furry animal sci-fi comic Erma Felna EDF.

For an explanation of the choices for this list, see the first entry.

Number 9 of 10: Erma Felna EDF (comic, 1983-2005)

It took a few decades, but computer graphics engineers have mastered the modeling and rendering of hair and fur. This has allowed a tremendous level of sophistication in CG animals that are realistic (the giant ape in 2005’s King Kong), cartoonish (the new CG Chipmunks films), and somewhere in-between (Aslan the Lion from the Chronicles of Narnia adaptations).

But little has yet been done in the realm of anthropomorphics, what is sometimes referred to as “funny animal” or “furry” animation and comics. These are usually representations of characters with animal heads and other bestial characteristics, but humanoid (“anthropomorphic”) bodies, intelligence and the ability to speak. Such furry characters may or may not wear clothes; may live in their own “furry” world, or in the real world with humans; and may have their own animal-based culture. Such creatures appear in children’s literature (Beatrix Potter’s 1902 The Tale of Peter Rabbit; Kenneth Grahame’s 1908 The Wind in the Willows) and in adult stories (Art Spiegelman’s Maus: A Survivor’s Tale (1980-91); Kirsten Bakis’ 1997 Lives of the Monster Dogs). Continue reading

Ten Famous Science Fiction Properties That Would Make Great VFX Movies — Part 1 ‘Wings of Honneamise’

This is a series of posts discussing ten existing science fiction properties (from literature, animation, games and comics) that could serve as the basis for ground-breaking live-action VFX films and television shows. First up: the 1987 anime feature film The Wings of Honnêamise.

In the 1980s and 90s, effects-centered films and television shows occupied specific niches. In film, an effects-heavy movie like Ghostbusters or Terminator 2: Judgment Day was a summer tentpole release designed to reel in teen audiences of repeat viewers; while a show like Star Trek: The Next Generation, with its $2.5 million-per-episode budget, was a risky experiment in capitalizing on 1960s nostalgia.

Today, most movies rely heavily on VFX, many of those effects invisible. Greenscreen sets and set extensions, digital makeup, and post-production fixes for on-set mistakes are just a few applications of digital technology used in films and TV shows that the average viewer might think had no effects whatsoever.

But audiences still want “effects-heavy” films, from The Matrix and The Lord of the Rings trilogies at the turn of the millennium to the Iron Man films and Avatar today. And for the first time in TV history, shows from Firefly and Battlestar Galactica to V and Human Target are recreating the experience of effects-heavy, action-oriented movies on the small screen.

Two factors have led to this renaissance in effects-driven entertainment. First, technological advances have made it cheaper and cheaper to create top-quality effects. And second, those same advances have made it possible to realistically render visions that were never possible before. Today’s VFX artists can create worlds that just ten years ago producers would have said could only be represented with traditional animation. Rumor has it that James Cameron abandoned his Spider-Man film project because he was dissatisfied with the realism of the character’s CG web-slinging. Can you imagine the director of Avatar having such a concern today? Continue reading

Zoic Studios Works with Tiger Woods, Wieden+Kennedy to Create Moving “Earl and Tiger” Spot

See the “Earl and Tiger” spot on Nike Golf’s official YouTube channel, or on the Zoic Studios site.

There is no need at this late date for a simple VFX blog to recount the recent events in the professional and private lives of golfer Tiger Woods. But as Woods attempts to resume his career, both as an athlete and as a corporate spokesperson, he and sponsor Nike, along with ad agency Wieden+Kennedy, production company Pretty Bird and Zoic Studios, have created a moving, personal and risky 30-second commercial spot. Zoic has worked with Wieden+Kennedy on several spots of late, particularly the ESPN NASCAR campaign out of New York, and Coke NASCAR and now Nike out of Portland. Continue reading

Planes, Trains & Automatic Weapons: Zoic Provides Explosive VFX for FOX’s Human Target

Based loosely on the DC comic series of the same name, Human Target is an action-drama starring Mark Valley (Boston Legal) as security expert Christopher Chance, with Chi McBride (Boston Public) and Jackie Earle Haley (Watchmen). It airs Wednesdays at 8pm on FOX.

Zoic Studios provided a number of visual effects shots for the series, including for the pilot episode. Zoic creative director Andrew Orloff discusses the studio’s work on Human Target. Continue reading

‘FlashForward’ Flashback: Zoic Studios’ Steve Meyer on the Award-Nominated VFX for the Pilot

Posted by Erik Even in I Design Your Eyes on April 1, 2010

Property of ABC; screencap from the Zoic Television Reel.

Based on the science fiction novel by Robert J. Sawyer, ABC’s FlashForward tells the story of the aftermath of a bizarre global event. For 137 seconds, every person on Earth (except perhaps one) loses consciousness, and experiences visions of their own future.

The pilot episode depicts the immediate aftermath of the worldwide blackout – millions of deaths due to traffic collisions, crashed aircraft, and other accidents. Star Joseph Fiennes, portraying FBI agent Mark Benford, survives an auto wreck and looks out over a chaotic Los Angeles cityscape. Culver City, California’s Zoic Studios was tapped to create the disastrous tableau; the company’s work on the episode was nominated for two VES Awards, for Outstanding Supporting Visual Effects in a Broadcast Program and for Outstanding Created Environment in a Broadcast Program or Commercial. Continue reading

Zoic’s Syd Dutton on Mentoring in the Visual Effects Industry

Originally posted on I Design Your Eyes on 3/25/10.


It’s easy for today’s young filmmakers to forget that the art of the cinema goes back 132 years; television 83 years; and interactive media 23 years. Today’s students might think the latest high-tech tools are all they need to succeed in the rapidly changing visual effects industry; and they’ll be sorely disappointed when their ignorance of time-tested filmmaking technique puts them in the dole queue.

That’s why mentoring is so important to the future success of young VFX professionals. I recently sat down with Zoic Studios’ Syd Dutton to discuss the importance of industry pros passing along their knowledge to the next generation.

Dutton has been a leading matte painter for film and television for over three decades. His credits include Dune, Total Recall, the Addams Family films, Star Trek: First Contact and Nemesis, U-571, The Fast and the Furious, The Bourne Identity, and Serenity. The Emmy Award winner co-founded Illusion Arts in 1985; over 26 years the studio created thousands of shots and matte paintings for more than 200 feature films.

As we spoke, Dutton’s longtime collaborator and Zoic compositing supervisor Fumi Mashimo listened in, and occasionally interjected. Mashimo’s credits include From Hell, Van Helsing and Public Enemies. Continue reading

From ‘2001’ to ‘CSI’: Zoic Studios’ Rik Shorten on Motion Control for VFX

Originally published on I Design Your Eyes on 3/12/10.


In cinematography, motion control is the use of computerized automation to allow precise control of, and repetition of, camera movements. It is often used to facilitate visual effects photography.

I spoke with Rik Shorten, visual effects supervisor at Culver City, California’s Zoic Studios, about his use of motion control and how the technology has changed since it was introduced over three decades ago. Shorten produces motion-controlled effects for CBS’ visually groundbreaking forensic drama, CSI: Crime Scene Investigation. He recently took home a VES Award for Outstanding Supporting Visual Effects in a Broadcast Program for his work on the “frozen moment” sequence in CSI’s tenth-season opener.

“I didn’t work on the original 2001: A Space Odyssey,” Shorten says, “or back in the Star Wars days in the 70s when computer-controlled cameras were first developed. But fundamentally, the way the technology works hasn’t changed. The rig I use almost weekly on CSI is the original rig from [David Lynch’s 1984 science-fiction film] Dune. The Kuper controller, the head that controls the rig, runs on MS-DOS. It’s a really old-school programming language that they used for these original systems, that hasn’t really changed because it hasn’t had to. It’s a coordinate-based system, XYZ, and we can write the moves we need. The way we do it today is the same way they would have programmed it 20 years ago. So from that perspective, the technology hasn’t increased.

The way the technology works hasn’t changed. The rig I use on CSI is the original rig from 1984’s Dune
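To picture what Shorten means by a coordinate-based system, here is a rough sketch of a motion-control move expressed as per-frame axis values that a controller could step through and replay identically on every pass. The axis names, units and linear interpolation are illustrative assumptions; this is not the Kuper controller’s actual move format.

```python
# Hypothetical sketch: a motion-control move as per-frame XYZ/pan/tilt targets.
# Axis names and units are illustrative, not the real Kuper format.
from dataclasses import dataclass


@dataclass
class AxisPose:
    x: float     # track position (feet)
    y: float     # boom height (feet)
    z: float     # arm extension (feet)
    pan: float   # degrees
    tilt: float  # degrees


def linear_move(start: AxisPose, end: AxisPose, frames: int) -> list[AxisPose]:
    """Interpolate a simple move as a list of per-frame axis targets."""
    poses = []
    for f in range(frames):
        t = f / (frames - 1)
        poses.append(AxisPose(
            x=start.x + t * (end.x - start.x),
            y=start.y + t * (end.y - start.y),
            z=start.z + t * (end.z - start.z),
            pan=start.pan + t * (end.pan - start.pan),
            tilt=start.tilt + t * (end.tilt - start.tilt),
        ))
    return poses


# A five-second push-in at 24 fps. Because the move is just data,
# the rig can repeat it exactly for every pass.
move = linear_move(AxisPose(0, 6, 0, 0, -5), AxisPose(4, 5, 2, 10, 0), frames=120)
```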

“What has changed is that the rigs have gotten smaller, lighter and quieter. They now have the ability to run silently — they used to be so loud that you couldn’t record dialogue with them. Today we have smaller rigs that can fit through doorways – they’re made out of carbon fiber pieces now, they’re not the behemoths they used to be — and you can have actors delivering dialogue in the same scene as a motion-control move.

“So how we use them hasn’t changed, but slowly but surely they’ve made progress. Some weeks I use the old rigs versus the new ones, because that’s all I need, and the moves are simple enough. We have a system that’s 30 years old that we use side-by-side with a system built three or four years ago. And we interchange them based on the needs of the shot.

“As far as CSI, there are two ways we use motion control. We do stand-alone in-camera shots with the motion-control rig, where we’re flying over a prosthetic, or we’re traveling around a prop, and we need to get a macro shot; we use them a lot for macro photography. We have a couple of different snorkel lenses we use on the systems; one’s an endoscopic lens, and one is a probe lens, and they were both designed for medical photography. They’re both barrel lenses, about 12” long. The endoscopic lens is a fixed lens, sort of a wide-angle lens; it’s a tiny skinny little lens you can stick through a donut hole. If you see any shots where the camera goes in-between something where it seems like it shouldn’t go, that’s the lens we use.

“The probe lens is a little bit bigger, but it has multiple lens sizes, so we can go as wide as a 9mm or 12mm lens, for super-wide shots; right up to your prime lenses, your 22s, 25s, 30s, whatever it is. We use that in the same way, for getting into tight spots, and for getting macro, because the close-focus on these lenses is only about six inches. That’s a lot tighter than a normal lens can get.

“We use these cameras to get that sort of fantastical camera move that a Steadicam or a dolly couldn’t do. So when it’s got to rotate on three axes and fly in, that’s when we’ll program something in motion control. It’s like a 3D rendered camera, but we’re actually shooting it in real life. It frees us up to do more aggressive, creative moves.

Rik Shorten with Director of Photography David Drzewiecki (center), an unidentified crew member and an actress.

“The other way we use motion control is for multiple passes — like in the old days where they did three or four passes of the starship Enterprise with different lighting setups, and combined them all later. We don’t do much of that these days, at least in television; I’m sure they still do it in features. We use it for multiple layering. We’ll do the same scene with different elements in three or four passes, all broken apart with the same repeatable move; then we’ll put them all back together so we can affect the different elements in different ways.
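Because every pass repeats the identical move, the layers register exactly and can be recombined with a standard “over” composite. Here is a minimal sketch of that recombination, assuming the passes have been scanned in as premultiplied RGBA arrays; the pass names are placeholders, not Zoic’s actual pipeline.

```python
import numpy as np


def over(fg: np.ndarray, bg: np.ndarray) -> np.ndarray:
    """Premultiplied 'over' composite: lay fg on top of bg.
    Both arrays are float RGBA in [0, 1] with identical shapes,
    which holds when every pass was shot with the same repeated move."""
    alpha = fg[..., 3:4]
    return fg + bg * (1.0 - alpha)


# Placeholder arrays standing in for scanned passes of the same move.
h, w = 1080, 1920
background = np.zeros((h, w, 4), dtype=np.float32)
beauty_pass = np.zeros((h, w, 4), dtype=np.float32)
lighting_pass = np.zeros((h, w, 4), dtype=np.float32)

# Stack the passes back to front; each layer can be graded separately first.
comp = over(lighting_pass, over(beauty_pass, background))
```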

“We do ghost shots every week. We’ll have a production plate without a foreground actor in it, just a background. We’ll track that plate here at Zoic. The data is then converted to XYZ coordinate data — ASCII files that MS-DOS can read off old-school 3½” floppies — so the Kuper controller on the motion control rig can mimic the camera move from the track plate that we shot in first unit. When I put that data in, and I have my background plate and my video setup, I run them together and they’ll run at the same time. The camera will mimic what the first unit camera did.
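The hand-off Shorten describes — a 3D camera track converted into plain ASCII coordinate data the rig’s controller can play back — might look roughly like this in script form. The column layout, file name and field names are assumptions for illustration; the real Kuper file format is not documented in this interview.

```python
# Hypothetical sketch: dump per-frame camera data from a 3D track
# to a plain ASCII table that a legacy DOS-based controller could read.
def write_move_file(path: str, frames: list[dict]) -> None:
    """Each frame dict holds x, y, z (position) and pan, tilt, roll (degrees)."""
    with open(path, "w", newline="\r\n") as f:  # DOS-style line endings
        f.write("FRAME        X        Y        Z      PAN     TILT     ROLL\n")
        for i, fr in enumerate(frames, start=1):
            f.write(f"{i:5d} {fr['x']:8.3f} {fr['y']:8.3f} {fr['z']:8.3f} "
                    f"{fr['pan']:8.3f} {fr['tilt']:8.3f} {fr['roll']:8.3f}\n")


# e.g. values exported frame by frame from the tracked first-unit plate
tracked = [{"x": 0.0, "y": 6.0, "z": 22.0, "pan": 0.0, "tilt": -5.0, "roll": 0.0}
           for _ in range(120)]
write_move_file("ghost_shot_move.txt", tracked)
```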

“Let’s say a guy is firing a gun in front of the greenscreen, and he’s supposed to be a ghost image superimposed into the scene. I’ll shoot him on greenscreen, with that tracked camera move; and then when I come back here to Zoic, I’ve got a motion control pass on the greenscreen, and I’ve got my first unit plate, and the two line up perfectly. That’s how we get all the stylized transition pieces, and all those layers that CSI uses to great effect, because we have the capacity to translate and then to reshoot at a later date using the motion control system.

Recreating a scene on the greenscreen that was shot in the field is always a challenge…

“The first time I saw this used was in [1996’s] Multiplicity with Michael Keaton – that’s when I saw this tech first exploding, having the same person in the same scene, over and over. There were a lot of production cheats used for years, with locked plates and simple split-screen; but this technology allows you to travel 360 degrees around somebody, and go into the scene and come back out of the scene; and people can cross and interact and do other things, that they could not do without this system. If you’re using live action elements for these high-concept shots, then motion control is the only way to do it.”

Shorten says that precision is an important issue, just as it was with traditional locked-plate shots. “Sometimes we don’t have the exact lens, we’re off by a few mils. Say they used a 50mm lens and I only have a 45mm, sometimes there’s a little eye matching that needs to happen. To say it’s plug-and-play is disingenuous. You need to understand the limitations of the system.

“Recreating a scene on the greenscreen that was shot in the field is always a challenge. You need to expect there’s going to be some compensation; you’re going to have to do a little eye matching, playing shots back and doing an A-over-B in our video assist, and then adjusting your frame rates and composition, adjusting the speed of the moves. A lot of times, even with the track data, we’ll have to make some on-the-fly compensations to get things to sit in there correctly.

“We do surveys on location, as far as distances to camera and understanding where the actors are supposed to be in the greenscreen instance. How far away from the camera is the actor supposed to be in the scene, that’s where we start. When we transfer the data, we have a general idea that the camera’s six feet high, it’s five degrees tilted up, and 22 feet from our subject. But when you get it in the studio and do the A-over-B, you might realize that you need to be zero instead of five degrees, or you need to be four feet closer, or you need to change your lens a little bit. The elements have to line up visually, not just by the numbers, so they’re actually going to work when you look at the images together.”

Shorten says that some problems with matching can be fixed digitally. “There is a lot we can work with digitally. It’s not very often we will shoot something in motion control, come back here and have to throw it out completely. Usually it’s salvageable, even if we’re off for some reason.”

Shorten’s greatest challenge is helping the television production community become comfortable with motion control technology. “There is still a fear of using this technology, even though it’s been around for years, because it’s still considered to be the domain of feature films, and commercials and music videos that have more time and money than most productions believe they have.

Rik Shorten on the stage.

“There is an education process, that we’ve been quietly working on for a long time. ‘Motion control’ really is a four-letter word for a lot of production managers, who say ‘I don’t have the time, I don’t know the technology, I don’t know how to use it, I don’t know why I need it, and you’re going to kill my one-liner if I have to take five hours to set up a motion control shot. We just won’t do it.’ We run up against this all the time.

“And this is even on shows like CSI, which is comfortable with the technology. Every ninth day we have a motion control day, even if they are simple in-camera things. They understand it, but they will bounce shots. When I suggest taking my motion control off my second unit day, and putting it on set – as soon as I’m doing it with main unit actors, in the middle of the day when there’s 150 crew around, suddenly even shows that are comfortable with the technology get very nervous.

Definitely there’s a lot of apprehension, but it’s such a great technology…

“The hope is that with these smaller and quieter rigs, with the idea that we can do pre-viz and set surveys so that when we show up on set we know exactly where our rig is going to go, we can get in and get set up very quickly, and start rolling video takes to show a director within a couple of hours. We can have our pre-viz and our moves written, if we do our surveying correctly, so that we’re not starting from scratch. We don’t need a week to do a motion control move.

“Definitely there’s a lot of apprehension, but it’s such a great technology. These shots can’t be accomplished any other way, without costing too much. If you don’t shoot it this way, if you try to back into it later, you need all kinds of digital fixes and compromises. You spend money somewhere. Getting it in-camera, and doing as much as you can physically — for a lot of set-ups motion control is head-and-shoulders above any other technique, from a financial standpoint and for the way it’s going to come out for your show.

“It’s about trying to build that trust and that faith with productions. We’re not suggesting motion control because we want to noodle around with computer-controlled cameras; it’s because it really is the best way to achieve your shot, and get the elements we need to make something really dynamic for your show.”

More info: Zoic Studios Wins Big at 2010 VES Awards; Zoic Stops Time, Creates Historic ‘Frozen Moment’ Sequence for CBS’ ‘CSI’ Premiere.

Zoic Studios Blows Up ‘The Crazies’ Fan Premiere

Originally posted on I Design Your Eyes on 2/26/2010.

From left: director Breck Eisner and stars Timothy Olyphant and Radha Mitchell take the coming nuclear onslaught quite seriously.

On Wednesday evening, Overture Films held a special fan premiere event for its new horror film, The Crazies, which opens today. The movie is a remake of the 1973 George A. Romero classic, and stars Timothy Olyphant (Deadwood, Live Free or Die Hard), Radha Mitchell (Surrogates) and Joe Anderson (Amelia). It tells the story of a small Iowa town devastated by an unknown toxin that causes insanity and death.

Invited guests, who included fandom journalists and horror bloggers, were treated to an immersive experience from the moment they pulled up in their cars. The KCET public television studios in Hollywood were transformed for the evening into beleaguered Ogden Marsh, Iowa. As guests arrived, they were pulled from their cars by military personnel, and marched past army vehicles and through metal detectors to a medical examination area. Nearby, citizens were assaulted, cuffed and herded into pens by soldiers, while moaning bodies lay on gurneys or were stacked in body bags. After guests were checked for contamination, they were issued wristbands indicating whether they were infected or clear, and then herded onto school buses with blackened windows. Military instructions were blared over loudspeakers while the sound of helicopters was heard overhead.

The after-party on KCET studios’ Stage B.

After being driven around for a while, the guests were released in front of a movie theater, issued rations (popcorn), and taken inside to watch the film.

Afterward, guests were invited back to the KCET lot (despite the bus ride, it was just across the street) to enjoy dinner, music and an open bar, and to hobnob with director Breck Eisner and stars Olyphant and Mitchell. There were also demonstrations from various companies that worked on the film. Guests could watch a stunt show and see a stunt performer set on fire; be turned into Crazies by professional makeup artists; or be strapped into a harness and “hanged” by the neck.

Eisner, Olyphant and Mitchell pose at the Zoic booth.

Culver City, California’s Zoic Studios, which provided visual effects for the film, offered a VFX “before-and-after” reel; and Zoic co-founder Loni Peristere was on hand to answer fans’ questions, along with compositing supervisor Aaron Brown, who flew down from Zoic’s Vancouver, British Columbia studio just for the occasion. Also, guests were invited to pose in front of a greenscreen and get professionally composited into a still shot of a nuclear explosion from the film. A team of Zoic compositors created over 100 images over the course of the evening, which were emailed to fans.

The evening was an incredible success, and fans had to be kicked out when the bar shut down at 12:30am. For more information about The Crazies, visit the official web site.

View all the images from the event on the Flickr page.