Ten Famous Science Fiction Properties That Would Make Great VFX Movies — Part 3 ‘Appleseed’

This is a series of posts discussing ten existing science fiction properties (from literature, animation, games and comics) that could serve as the basis for ground-breaking live-action VFX films and television shows. This time: Shirow Masamune’s manga and anime franchise Appleseed.

For an explanation of the choices for this list, see the first entry.

Number 8 of 10: Appleseed (manga: 1985-89; anime: 1988, 2004, 2007)

If there’s one thing modern CG can render with absolute realism, it’s hardware. From modern consumer automobiles, commercial aircraft and military vehicles to futuristic robots, mecha and spacecraft, VFX artists have mastered the art of heavy gear, from 1984’s The Last Starfighter to last year’s Avatar.

But the military hardware, vehicles and spacecraft in modern VFX movies, television shows and video games do not show as much creative variety as one might expect, given the nearly boundless flexibility of CG. Spacecraft usually look much like the USS Sulaco from 1986’s Aliens, which itself isn’t terribly original. The “APUs” in Avatar are nearly identical to the battlemechs from the BattleTech franchise, themselves inspired by anime mecha. And any time you see a BFG (Big “Effin’” Gun) or any other large military prop in a sci-fi film, TV show or video game, it seems to come from the same prop house or 3D model library as all the others.

This isn’t necessarily because production designers and VFX artists are lazy or unoriginal – there are creative and production concerns. If a giant futuristic space blaster looks exactly like what the audience expects a giant futuristic space blaster to look like, a filmmaker need not waste time explaining what it is. The same goes for spaceships – film-goers unfamiliar with sci-fi (are there any of those left?) might be confused by the giant, spherical spaceship at the end of the 2008 remake of The Day the Earth Stood Still (they were already confused by the plot); but will instantly recognize the alien ship in 2009’s District 9, given its resemblance to the bastard love child of the giant saucers from Close Encounters of the Third Kind (1977) and Independence Day (1996).

Planes, Trains & Automatic Weapons: Zoic Provides Explosive VFX for FOX’s Human Target

Based loosely on the DC comic series of the same name, Human Target is an action-drama starring Mark Valley (Boston Legal) as security expert Christopher Chance, with Chi McBride (Boston Public) and Jackie Earle Haley (Watchmen). It airs Wednesdays at 8pm on FOX.

Zoic Studios provided a number of visual effects shots for the series, including for the pilot episode. Zoic creative director Andrew Orloff discusses the studio’s work on Human Target.

3D TV, New Technology and the Future of Media

Originally published on IDesignYourEyes on 1/15/2010.

This is a moment of unparalleled change in the media world, part of a process of barely-controlled destruction and reconstruction that began over a decade ago. Business models and revenue streams are collapsing, and media creators are turning to the latest technologies to create new opportunities and new businesses. At this year’s Consumer Electronics Show in Las Vegas, technology firms touted a slate of new 3D TVs as a solution to video piracy, and a way to lure fickle consumers back away from free Internet content. But are such promises tenable?

It all started in the music industry, when Napster, the original digital music sharing service, was launched in June of 1999. With music freed from the baryonic prison of vinyl, polyester tape and polycarbonate plastic, consumers could copy, edit, sample, decode and redistribute it and other copyrighted content at will.

Rights holders had always controlled their intangible product by controlling the tangible media – records, cassette tapes and CDs, as well as radio frequencies, for music; television channels and chunky videotapes for video; multiple 40-pound reels of motion picture film for movies; floppy disks and CDs for software; plus dead-tree books and photographs. Suddenly, their control of intellectual property was just gone, vaporized in a mist of ones and zeroes. On one hand, many music executives saw digital media as a tremendous new opportunity, both for creative expression and for business. Zoic’s Jeff Suhy, a former record company executive, was quoted in the May 2000 Village Voice: “I love that the world is quite obviously changing before our eyes and no one really knows how it’s going to play out!”

Suddenly, control of intellectual property was gone, vaporized in a mist of ones and zeroes.

On the other hand, some rights holders saw any perceived change to their traditional revenue stream as a threat to be destroyed at all costs. They dug in their heels and fought the future – engendering numerous disasters, from Circuit City’s Digital Video Express, which sold consumers DVD movies that “expired” after two days, to the RIAA’s litigious pogrom against file-sharing college kids and soccer moms. And money spent to develop various copy-protection and DRM schemes was almost always wasted, as consumers found ways to defeat protection, or avoided protected products altogether.

But some in the business world saw opportunities, not enemies. When Steve Jobs first laid eyes on the Xerox Alto in the late 1970s, with its graphical user interface and mouse, he saw the future of computing. Decades later, Jobs understood that the original Napster, driven out of business by the record companies, was the template for media distribution in the new millennium. With Apple’s iTunes software and online store, Jobs went from computer mogul to media mogul, taking advantage of record companies’ desperation to gain control of digital music, and claiming for himself the power to single-handedly set prices for online entertainment. But iTunes by itself would not have been enough to compete with free MP3s – it was the convenience, portability, style, incredible ease of use, sound quality and price point of the iPod that gave Apple control first of the personal music player market, and then of legitimate online music and video distribution.

Now the media industry has reached another watershed moment of change, as file-sharing endangers the revenue models of film and television creators, as well as publishers and journalists. But media moguls have absorbed the lessons of the music industry’s tribulations in the last decade, and there is a new humility in the face of change — a willingness, even an eagerness, to adapt to the new digital world, rather than to deny it. In the last few years, movie and television creators have moved their product online, to free video sites like Hulu, which will soon experiment with for-pay models; and are offering high-definition, appointment-free content on demand to home televisions through cable companies and Netflix.

There is a new humility in the face of change — an eagerness to adapt to the new digital world.

In 2010, how else are media producers taking control of the future of their own industry? What are they doing to reimagine their businesses, and ensure that the media world of 2020 is profitable and stable?

Some of the answers were on display at this year’s Consumer Electronics Show in Las Vegas. Publishers are betting that consumers will gladly pay to read their content on a new breed of flat, portable, easy-to-read e-book products. Just as the iPod saved music, publishers hope that Amazon’s Kindle and Barnes & Noble’s Nook will save literature and journalism, at least until true e-paper is developed.

The greatest buzz at CES was elicited by a whole crop of new HDTVs with 3D capabilities. The motion picture industry and the movie theater chains are increasingly turning to 3D and IMAX as ways to lure audiences into theaters, and the current success of James Cameron’s Avatar demonstrates that even in a serious global recession, moviegoers are willing to pay extra for a high-tech movie experience they can’t get at home.

The new 3D TVs, including the Panasonic TC-PVT25 series that won the Best of CES award this year, promise to provide an in-home 3D experience for only a few hundred dollars more than ordinary HDTVs. In addition, satellite television provider DirecTV announced at CES that it has teamed with Panasonic to create three HD 3D channels, to launch this spring. Working with media partners including NBC Universal and Fox Sports, DirecTV will offer a pay-per-view channel, an on-demand channel, and a free sampler channel, all in 24-hour 3D and compatible with the current generation of sets.

Like the original HD offerings in the mid-1990s, which focused on sports events and video from space missions, the new 3D channels will offer existing 3D movies, 3D upgrades of traditional 2D movies, and sports. Unlike with HDTV, however, there is no indication the government will legislate widespread adoption of 3D TV. And there are issues.

3D will likely establish its foothold in the living room not with sports or movies, but with video games.

The greatest usability issue is the need for viewers to wear glasses. While there are experimental technologies that work without glasses, today if you want to experience high-quality 3D television images you need to wear pricey shutter glasses. Unlike the polarized glasses patrons wear at theaters, shutter glasses respond to signals from the TV, directing alternating frames to alternating eyes. The glasses are expensive – only Panasonic is promising to provide a pair with your TV purchase, and additional pairs will run around $50. At least one manufacturer is already offering lighter, more fashionable, more expensive replacement glasses.

And wearing special glasses while watching TV at home is not conducive to the average person’s lifestyle. As Microsoft exec Aaron Greenberg told GameSpy at CES, “when I play games or watch TV, I’ve got my phone, I’ve got all kinds of things going on… I get up, I get down, I’m looking outside at the weather… I’m not in a dark theater, wearing glasses, staring at a screen.” You cannot walk around comfortably wearing modern shutter glasses, and just happen to be wearing them when you want to watch TV. Until 3D TVs no longer require glasses, consumers are going to have trouble integrating 3D television watching into their lives.

The new 3D TVs also suffer from varying levels of picture clarity and a pronounced flicker, although these issues are expected to disappear as the technology improves. More importantly, 3D media demand changes in how movies and television are produced. Right now, only computer animated films are expressly produced with the needs of 3D in mind, producing stunningly realistic depth-of-field and fine gradations of perceived depth. Film and video produced according to the traditional rules of 2D create flat, paper-thin figures moving in a 3D environment that can appear shallow or truncated. Sports coverage, intended to be a killer app for 3D TV, particularly suffers from these issues, and 3D broadcasts of sporting events may require drastic changes to the technology used on the field.

Filmmakers are still learning how to deal with changing depth of focus. In the real world, the viewer chooses unconsciously where to focus their eyes; but in a 3D production this decision is made for the viewer. A plane of focus that appears to constantly shift can give audiences headaches and eye strain. A substantially different language of cinema is being developed, to produce content in which 3D is a core component rather than a faddish trinket.

And finally, CNN Tech reports that between four and ten percent of consumers suffer from something called “stereo blindness,” a sometimes treatable condition that makes it impossible to experience 3D movies or television. This is hardly a deal-killer, but one wonders how the spread of stereo music technology would have been affected if 10% of listeners had not been able to appreciate the difference.

Honestly, 3D will likely establish its foothold in the living room not with sports or movies, but with video games. Video gamers are already accustomed to buying expensive high-tech peripherals. They are used to content designed for one person, one screen. And when designed properly, 3D does not just add visual excitement to a game, but actually affects and enhances the gameplay itself.

So will 3D television lure viewers away from legitimate free Internet video, and from illegally pirated video files? It is too soon to tell. But there is a key difference to this strategy, as compared to some of the previously unsuccessful responses to piracy and the Internet. As with Steve Jobs and the iPod, 3D TV producers are offering consumers something new and exciting that, once the issues are worked out, will enhance their news and entertainment experiences. Rather than treating customers like the enemy, they are approaching customers as customers. And iTunes proves that people are more than willing to pay for their media, as long as they can experience a clear benefit.

More info: “Keeping Up With the Napsters” on Village Voice; “Why I can’t watch 3D TV” on CNN Tech; “DirectTV to launch the first 3D HDTV Channel” on Device; “Microsoft Exec Not Sold on 3D Home Gaming” on GameSpy; “3D TV? Too Soon Now, but One Day You Will Want It” on Singularity Hub; “I’m Sold On 3D TVs…And I Kind of Hate Myself For It” on Gizmodo.

Zoic Studios’ ZEUS: A VFX Pipeline for the 21st Century

Originally published by IDesignYourEyes on 1/7/2010.

Actors Christopher Shyer and Morena Baccarin on the greenscreen set of ABC’s V; the virtual set is overlaid.

Visual effects professionals refer to the chain of processes and technologies used to produce an effects shot as a “pipeline,” a term borrowed both from traditional manufacturing and from computer architecture.

In the past year, Zoic Studios has developed a unique pipeline product called ZEUS. The showiest of ZEUS’ capabilities is to allow filmmakers on a greenscreen set to view the real-time rendered virtual set during shooting; but ZEUS does far more than that.

Zoic Studios pipeline supervisor Mike Romey explains that the pipeline that would become ZEUS was originally developed for the ABC science fiction series V. “We realized working on the pilot that we needed to create a huge number of virtual sets. [Read this for a discussion of the program’s VFX and its virtual sets.] That led us to try to find different components we could assemble and bind together, that could give us a pipeline that would let us successfully manage the volume of virtual set work we were doing for V. And, while ZEUS is a pipeline that was built to support virtual sets for V, it also fulfills the needs of our studio at large, for every aspect of production.

“One of its components is the Lightcraft virtual set tracking system, which itself is a pipeline of different components. These include InterSense motion tracking, incorporating various specialized NVIDIA graphics cards for I/O, as well as custom inertial sensors that provide rotational data for the camera.

“Out of the box, we liked the Lightcraft product the most. We proceeded to build a pipeline around it that could support it.

“Our studio uses a program called Shotgun, a general-purpose database system geared for project shot management, and we were able to tailor it to support the virtual set tracking technology. By coming up with custom tools, we were able to take the on-set data, use Shotgun as a means to manage it, then lean on Shotgun to retrieve the data for custom tools throughout our pipeline. When an artist needed to set up or lay out a scene, we built tools to query Shotgun for the current plate, the current composite that was done on set, the current asset, and the current tracking data; and align them all to the timecode based on editorial selects. Shotgun was where the data was all stored, but we used Autodesk Maya as the conduit for the 3D data – we were then able to make custom tools that transport all the layout scenes from Maya to The Foundry’s Nuke compositing software.”
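To make that flow a little more concrete, here is a rough sketch of the kind of query an artist-facing tool might run against a Shotgun-style shot database, using the publicly available shotgun_api3 Python library. The site URL, script credentials, field names and shot code below are hypothetical placeholders, not Zoic’s actual schema or tools.

```python
# Illustrative sketch only -- not Zoic's pipeline code.
# Fetch a shot and the published files (plate, on-set composite, tracking data)
# linked to it, so a layout tool can line everything up by timecode.
import shotgun_api3

sg = shotgun_api3.Shotgun(
    "https://yourstudio.shotgunstudio.com",  # hypothetical site URL
    script_name="zeus_layout",               # hypothetical script user
    api_key="XXXXXXXX",
)

def gather_shot_data(shot_code):
    shot = sg.find_one(
        "Shot",
        [["code", "is", shot_code]],
        ["code", "sg_cut_in", "sg_cut_out"],  # hypothetical cut-range fields
    )
    # Published files (plate, on-set composite, tracking data) linked to the shot.
    published = sg.find(
        "PublishedFile",
        [["entity", "is", shot]],
        ["code", "published_file_type", "path"],
    )
    return shot, published

shot, files = gather_shot_data("V_101_0040")  # hypothetical shot code
for f in files:
    print(f["code"], f["path"])
```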

By offloading a lot of the 3D production onto 2D, we were able to cut the cost-per-shot.

Romey explains the rationale behind creating 3D scenes in Nuke. “When you look at these episodic shows, there’s a large volume of shots that are close-up, and a smaller percentage of establishing shots; so we could use Nuke, our compositing application, to actually do our 3D rendering. In Maya we would be rendering a traditional raytrace pipeline; but for Nuke we could render a scanline pipeline, which didn’t have the same overhead. Also, this would give the compositing team immediate access to the tools they need to composite the shot faster, and it let them be responsible for a lot of the close-up shots. Then our 3D team would be responsible for the establishing shots, which we knew weren’t suited to a scanline render.

“By offloading a lot of the 3D production onto 2D, we were able to cut the cost-per-shot, because we didn’t have to provide the 3D support necessary. That’s how the ZEUS pipeline evolved, with that premise – how do we meet our client’s costs and exceed their visual expectations, without breaking the bank? Throughout the ZEUS pipeline, with everything that we did, we tried to find methodologies that would shave off time, increase quality, and return a better product to the client.
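A quick back-of-the-envelope illustration of that cost-per-shot argument (the shot mix and relative costs below are assumptions for the example, not Zoic’s figures):

```python
# Assumed relative render costs and an assumed episodic shot mix.
relative_cost = {"nuke_scanline": 1.0, "maya_raytrace": 4.0}
shot_mix = {"close-up": 0.8, "establishing": 0.2}

blended = (shot_mix["close-up"] * relative_cost["nuke_scanline"]
           + shot_mix["establishing"] * relative_cost["maya_raytrace"])

print(f"blended cost per shot: {blended:.1f} vs all-raytrace: {relative_cost['maya_raytrace']:.1f}")
# -> 1.6 vs 4.0: offloading the close-ups onto the cheaper 2D path dominates the savings.
```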

“One of the avenues we R&Ded to cut costs was the I/O time. We found that we were doing many shots that required multiple plates. A new component we looked at was a product that had just been released, called Ki Pro from AJA.

“When I heard about this product, I immediately contacted AJA and explained our pipeline. We have a lot of on-set data – we have the tracking data being acquired, the greenscreen, a composite, and the potential for the key being acquired. The problem was that when we went back to production, the I/O time associated with managing all the different plates became astronomical.

“Instead of running a Panasonic D5 deck to record the footage, we could use the Ki Pro, which is essentially a tapeless deck, on-set to record directly to Apple ProRes codecs. The units were cost effective – they were about $4,000 per unit – so we could set up multiple units on stage, and trigger them to record, sync and build plates that all were the exact same length, which directly corresponded to our tracking data.”

We found methodologies that would shave off time, increase quality, and return a better product to the client.

Previously, the timecode would be lost when Editorial made their selects, and would have to be reestablished. “That became a very problematic process, which would take human intervention to do — there was a lot of possibility for human error. By introducing multiple Ki Pros into the pipeline, we could record each plate, and take that back home, make sure the layout was working, and then wait for the editorial select.” The timecode from the set was preserved.

“The ZEUS pipeline is really about a relationship of image sequence to timecode. Any time that relationship is broken, or becomes more convoluted or complicated to reestablish, it introduces more human error. By relieving the process of human error, we’re able to control our costs. We can offer this pipeline to clients who need the Apple ProRes 422 codec, and at the end of the day we can take the line item of I/O time and costs, and dramatically reduce it.”
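Since Romey frames ZEUS as a relationship between image sequences and timecode, a minimal sketch of that relationship may help. The frame rate, timecodes and 1-based frame numbering below are illustrative assumptions; a real tool would also handle drop-frame timecode.

```python
# Map an editorial select (expressed as timecodes) back onto frame numbers
# within a plate, so the matching tracking data can be trimmed to the same range.

def tc_to_frames(tc, fps=24):
    """Convert 'HH:MM:SS:FF' to an absolute frame count."""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def select_to_plate_range(plate_start_tc, select_in_tc, select_out_tc, fps=24):
    """Return the 1-based first/last frame of the select within the plate."""
    start = tc_to_frames(plate_start_tc, fps)
    first = tc_to_frames(select_in_tc, fps) - start + 1
    last = tc_to_frames(select_out_tc, fps) - start + 1
    return first, last

# A plate recorded from 01:03:20:00; editorial selects a 48-frame chunk of it.
print(select_to_plate_range("01:03:20:00", "01:03:22:10", "01:03:24:09"))  # (59, 106)
```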

Another important component is Python, the general-purpose high-level programming language. “Our pipeline is growing faster than we can train people to use it. The reason we were able to build the ZEUS pipeline the way we have, and build it out within a month’s time, is because we opted to use tools like Python. It has given us the ability to quickly and iteratively develop tools that respond proactively to production.

“One case in point – when we first started working with the tracking data for V, we quickly realized it didn’t meet our needs. We were using open source formats such as COLLADA, which are XML scene files that store the timecode. We needed custom tools to trim, refine and ingest the COLLADA data into our Shotgun database, into the Maya cameras, into the Nuke preferences and Nuke scenes. Python gave us the ability to do that. It’s the glue that binds our studio.
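For readers curious what “trim, refine and ingest” might look like in practice, here is a rough sketch using nothing but the standard library’s xml.etree.ElementTree. The exact layout of the tracker’s COLLADA export isn’t public, so the element handling, file names and frame range below are illustrative assumptions, not Zoic’s actual tools.

```python
# Trim the animation samples in a COLLADA (.dae) file down to an editorial select.
import xml.etree.ElementTree as ET

COLLADA_NS = "http://www.collada.org/2005/11/COLLADASchema"  # COLLADA 1.4 namespace
ET.register_namespace("", COLLADA_NS)
NS = {"c": COLLADA_NS}

def trim_collada(path_in, path_out, first, last):
    tree = ET.parse(path_in)
    for fa in tree.getroot().iterfind(".//c:float_array", NS):
        values = fa.text.split()
        kept = values[first - 1:last]  # 1-based frame range; assumes one value per sample
        fa.text = " ".join(kept)
        fa.set("count", str(len(kept)))
        # A real ingest tool would also fix the <accessor> counts and honor the
        # stride (e.g. 16 floats per matrix sample) before pushing the data on
        # to Shotgun, the Maya cameras and the Nuke scenes.
    tree.write(path_out)

trim_collada("take_012.dae", "take_012_trimmed.dae", first=59, last=106)
```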

“While most components in our pipeline are interchangeable, I would argue that Python is the one component that is irreplaceable. The ability to make changes iteratively on the fly during an episode could not have been developed and deployed using other tools. It would not have been as successful, and I think it would have taken a larger development team. We don’t have a year to do production, like Avatar – we have weeks. And we don’t have a team of developers, we have one or two.

While most components in our pipeline are interchangeable, Python is the one component that is irreplaceable.

“We’re kind of new to the pipeline game. We’ve only been doing a large amount of pipeline development for two years. What we’ve done is taken some rigid steps, to carve out our pipeline in such a way that when we build a tool, it can be shared across the studio.”

Romey expects great things from ZEUS in the future. “We’re currently working on an entire episodic season using ZEUS. We’re working out the kinks. From time to time there are little issues and hiccups, but that’s traditional for developing and growing a pipeline. What we’ve found is that our studio is tackling more advanced technical topics – we’re doing things like motion capture and HDR on-set tracking. We’re making sure that we have a consistent and precise road map of how everything applies in our pipeline.

“With ZEUS, we’ve come up with new ways that motion capture pipelines can work. In the future we’d like to be able to provide our clients with a way not only to be on set and see what the virtual set looks like, while the director is working — but what if the director could be on set with the virtual set, with the actor in the motion capture suit, and see the actual CG character, all in context, in real-time, on stage? Multiple characters! What if we had background characters that were all creatures, and foreground characters that were people, interacting? Quite honestly, given the technology of Lightcraft and our ability to do strong depth-of-field, we could do CG characters close-to-final on stage. I think that’s where we’d like the ZEUS pipeline to go in the future.

“Similar pipelines have been done for other productions. But in my experience, a lot of times they are one-off pipelines. ZEUS is not a pipeline just for one show; it’s a pipeline for our studio.

“It’s cost effective, and we think we can get the price point to meet the needs of all our clients, including clients with smaller budgets, like webisodes. The idea of doing an Avatar-like production for a webisode is a stretch; but if we build our pipeline in such a way that we can support it, we can find new clients, and provide them with a better product.

“Our main goal with ZEUS was to find ways to make that kind of pipeline economical, to make it grow and mature. We’ve treated every single component in the pipeline as a dependency that can be interchanged if it doesn’t meet our needs, and we’re willing to do so until we get the results that we need.”

For more info: Lightcraft Technology; InterSense Inc.; Shotgun Software; AJA Video Systems; IDYE’s coverage of V.

The End of Rendering: Zoic Studios’ Aaron Sternlicht on Realtime Engines in VFX Production

Originally published on IDesignYourEyes on 1/6/2010.

Zoic created this Killzone 2 commercial spot entirely within the Killzone 2 engine.

The level of the technology available to produce computer graphics is approaching a new horizon, and video games are part of the equation.

Creators in 3D animation and visual effects are used to lengthy, hardware-intensive render times for the highest quality product. But increasingly, productions are turning to realtime rendering engines, inspired by the video games industry, to aid in on-set production and to create previz animations. Soon, even the final product will be rendered in realtime.

Aaron Sternlicht, Zoic Studios’ Executive Producer of Games, has been producing video game trailers, commercials, and cinematics since the turn of the millennium. He has charted the growth of realtime engines in 3D animation production, and is now part of Zoic’s effort to incorporate realtime into television VFX production, using the studio’s new ZEUS pipeline (read about ZEUS here).

Sternlicht explains how realtime engines are currently used at Zoic, and discusses the future of the technology.

“The majority of what we do for in-engine realtime rendering is for in-game cinematics and commercials. We can take a large amount of the heavy-lifting in CG production, and put it into a game engine. It allows for quick prototyping, and allows us to make rapid changes on-the-fly. We found that changing cameras, scenes, set-ups, even lighting can be a fraction of the workload that it is in traditional CG.

“Right now, you do give up some levels of quality, but when you’re doing something that’s stylized, cel-shaded, cartoonish, or that doesn’t need to be on a photo-realistic level, it’s a great tool and a cost effective one.

We’re going to be able to radically alter the cost structures of producing CG.

“Where we’re heading though, from a production standpoint, is being able to create a seamless production workflow, where you build the virtual set ahead of time; go to your greenscreen and motion capture shoot; and have realtime rendering of your characters, with lighting, within the virtual environment, shot by a professional DP, right there on-set. You can then send shots straight from the set to Editorial, and figure out exactly what you need to focus on for additional production — which can create incredible efficiencies.

“In relation to ZEUS, right now with [ABC’s sci-fi series] V, we’re able to composite greenscreen actors in realtime onto CG back plates that are coming straight out of the camera source. We’re getting all the camera and tracking data and compositing real-time, right there. Now if you combine that with CG characters that can be realtime, in-engine rendered, you then can have live action actors on greenscreen and CG characters fully lit, interacting and rendered all in realtime.

“People have been talking about realtime VFX for the last 15 years, but now it’s something you’re seeing actually happening. With V we have a really good opportunity. We’re providing realtime solutions in ways that haven’t been done before.

“Now there’s been a threshold to producing full CG episodic television. There has been a lot of interest in finding a solution to generate stylized and high quality CG that can be produced inexpensively, or at least efficiently. A process that allows someone to kick out 22 minutes of scripted full CG footage within a few weeks of production is very difficult to do right now, within budgetary realities.

“But with in-engine realtime productions, we can get a majority of our footage while we’re actually shooting the performance capture. This is where it gets really exciting, opening an entire new production workflow, and where I see the future of full CG productions.”

What game-based engines has Zoic used for realtime rendering?

“We’ve done several productions using the Unreal 3 engine. We’ve done productions with the Killzone 2 engine as well. We’re testing out different proprietary systems, including StudioGPU’s MachStudio Pro, which is being created specifically with this type of work in mind.

“If you’re doing a car spot, you can come in here and say ‘okay, I want to see the new Dodge driving through the salt flats.’ We get your car model, transfer that to an engine, in an environment that’s lit and realtime rendered, within a day. We even hand you a camera that a professional DP can actually shoot with on-site here, and you can produce final-quality footage within a couple of days. It’s pretty cool.”

How has the rise of realtime engines in professional production been influenced by the rise of amateur Machinima?

“I’ve been doing game trailers since 2000. I’ve been working with studios to design toolsets for in-game capture since then as well. What happened was, you had a mixture of the very apt and adept gamers who could go in and break code, or would use, say, the Unreal 2 engine, to create their own content. Very cool, very exciting.

“Concurrently, you had companies like Electronic Arts, and Epic, and other game studios and publishers increasing the value of their product by creating tool sets to let you capture and produce quality game play — marketing cameras that are spline-based, where you can adjust lighting and cameras on-the-fly. This provided a foundation of toolsets and production flow that has evolved into today’s in-engine solutions.”

It’s truly remarkable how the quality level is going up in realtime engines, and where it’s going to be in the future.

How has this affected traditional producers of high-end software?

“It hasn’t really yet. There’s still a gap in quality. We can’t get the quality of a mental ray or RenderMan render out of a game engine right now.

“But the process is not just about realtime rendering, but also realtime workflow. For example, if we’re doing an Unreal 3 production, we may not be rendering in realtime. We’ll be using the engine to render, but instead of 30 or 60 frames a second, we may render one frame every 25 seconds, because we’re using all the CPU power to render out that high-quality image. That said, the workflow is fully realtime, where we’re able to adjust lighting, shading, camera animation, tessellation, displacement maps — all realtime, in-engine, even though the final product may be rendering out at a non-realtime rate.
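To put that “one frame every 25 seconds” figure in perspective, a quick back-of-the-envelope calculation (the frame rate and spot length are assumptions for illustration):

```python
seconds_per_frame = 25     # figure quoted above
fps = 30                   # assumed broadcast frame rate
spot_length_s = 30         # assumed 30-second commercial

frames = fps * spot_length_s
hours = frames * seconds_per_frame / 3600
print(f"{frames} frames -> about {hours:.2f} hours of in-engine rendering")
# -> 900 frames -> about 6.25 hours: far from realtime, but still quick for CG.
```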

“Some of these engines, like StudioGPU, are rendering out passes. We actually get a frame-buffered pass system out of an engine, so we can do secondary composites.

“With the rise of GPU technology, it’s truly remarkable how the quality level is going up in realtime engines, and where it’s going to be in the future. Artists, rather than waiting on renders to figure out how their dynamic lighting is working, or how their subsurface scattering is working, will dial that in, in realtime, make adjustments, and never actually have to render to review. It’s really remarkable.”

So how many years until the new kids in VFX production don’t even know what “render time” means?

“I think we’re talking about the next five years. Obviously there will be issues of how far we can push this and push that; and we’re always going to come up with something that will add one more layer to the complexity of any given scene. That said, yes, we’re going to be able to radically alter the cost structures of producing CG, and very much allow it to be much more artist-driven. I think in the next five years… It’s all going to change.”

Read Zoic Studios’ ZEUS: A VFX Pipeline for the 21st Century.

Ripomatics and Animatics: Storyboards for the 21st Century

Originally published on I Design Your Eyes on 12/11/09.

A screenshot of a “test” animatic produced by Zoic.

In the beginning was the storyboard, a series of illustrations displayed in sequence to pre-visualize a screenplay or teleplay, and to map out such elements as camera moves, blocking and effects. The modern storyboard was pioneered by one of the entertainment industry’s greatest innovators, Walt Disney, specifically for traditional cel animation. But the technique soon moved into feature film production, and later television, commercials, interactive media and video games — even web site design.

The next evolution in previsualization also came from animation. An animatic is a series of storyboard illustrations arranged on film or video, incorporating timing, simple movement, and sometimes dialogue and music. By making editing and story decisions at the animatic stage, animators can avoid the wasteful process of animating scenes that would eventually have been edited down or cut entirely.

More recently, ripomatics have evolved to help filmmakers design and express the look and feel of a project before any shooting or animating takes place. Originally developed in the commercial production industry, ripomatics are like animatics, but assembled from elements of previous films, television shows, and commercials; plus still images and other preexisting assets. A ripomatic for a television commercial might be composed entirely of clips from other commercials for similar products, combined with new music and messaging. They are often used to pitch projects to clients.

Zoic Studios is pioneering the next phase in storyboard evolution, offering a new kind of animated storyboard that lives halfway between existing animatics or ripomatics and a full 3D animated previsualization.

Zoic Studios compositor Levi Ahmu says “ripomatics were originally designed to make a moving storyboard. And when I got here [to Zoic], I thought it would be cool if we could enhance it a little bit.

A screenshot of an animatic created by Zoic for a commercial for Guerrilla Games’ Killzone 2, entitled “Bullet.”

“The problem with storyboards and making them move [is] the storyboard is very flat. By cutting up the storyboard into layers, you can give 3D motion to it, which is what you’re eventually going to be doing anyway. It gives artists and clients a better sense of what’s going to happen. It also helps you time things out better; you have actual motion in the storyboards, so you can get a more accurate frame count of what the product will be.”

But even these animatics gave only what Ahmu calls a “vague representation” of the final product. “So what we ended up doing was creating these 3D environments in a 2D setting. We’re taking 2D cards and arranging them so they’ll represent a room or a street or any kind of environment; then having a virtual camera move through that environment. You can take the 2D actors from the storyboard and put them in this environment; and the advantage of doing it this way is you’ll be able to have a [virtual] camera, with lens properties and animation curves that are more easily equated to what the 3D artists will wind up having to do.
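The trick Ahmu describes is, at bottom, parallax from a virtual pinhole camera: flat layers placed at different depths shift by different amounts as the camera moves, which is what sells the 3D feel. Here is a toy sketch of that geometry in Python (not an After Effects script; the focal length, depths and layer names are made up):

```python
FOCAL = 35.0  # hypothetical virtual lens focal length

def project_x(point_x, depth, cam_x, focal=FOCAL):
    """Screen-space x of a point on a card at the given depth, pinhole model."""
    return focal * (point_x - cam_x) / depth

# Flat storyboard layers arranged at different depths (arbitrary units).
layers = {"background building": 500.0, "midground car": 120.0, "foreground hero": 40.0}

for name, depth in layers.items():
    start = project_x(0.0, depth, cam_x=0.0)
    end = project_x(0.0, depth, cam_x=10.0)   # dolly the camera 10 units sideways
    print(f"{name:20s} shifts {end - start:+.2f} screen units")
# Nearer layers shift more, so the flat cards read as a space the camera moves through.
```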

“It’s all being done in Adobe After Effects, which is not at all what the software makers were intending. But the cool thing about doing it in After Effects is that you can put in particles, stuff you would never get in traditional previz, that enhance the experience.”

Some more elaborate ripomatics prepared by Zoic have included 3D vehicle models composed from 2D drawings; rough motion capture; and dialogue, sound effects and music.

Zoic executive producer Aaron Sternlicht, head of the studio’s Games Division, has supervised Ahmu in the production of a number of advanced ripomatics for a variety of clients over the last several years.

A screenshot of a ripomatic created for Pandemic Studios’ The Saboteur.

“It’s kind of like a 2½D ripomatic or animatic,” Sternlicht says. “We actually do all of our storyboards so that they’re laid out in layers, which allows us to get into production a lot more easily. We’re able to have an edit that is exciting, entertaining and really good to look at, for our clients to view within a few days, as opposed to having a rudimentary gray-shaded previz or just edited storyboards.

“The big reason we like working this way is that we’re able to have clients pretty much sign off on shot design, composition and pacing of camera work in 2D before we ever go to 3D. That allows us to be a lot more efficient once we go to 3D, and [to] give our artists a real clear path of what they’re supposed to be doing once we start building the scenes. So it’s a tremendous tool for us.

“Clients love it because they quickly get to see a massive leap from looking at storyboards to really understanding what the quality of the piece is going to be, the timing, and how exciting it might end up being. So we’re pretty psyched by the whole process.”

Ahmu agrees that clients are benefiting from the new technique. “As opposed to a traditional previz, which is all gray-shaded and doesn’t have very much ambiance to it, a ripomatic the way we’ve been doing it can have stylized textures and rough animation that get the point across; it’s not like previz, where that’s just the first step. This is our goal, to have this motion, with these effects on top of it. You can get a rough idea of what the whole thing is supposed to be.”

Another screenshot of a “test” animatic produced by Zoic.

Sternlicht is quick to point out that advanced ripomatics not only better represent the final product, but also save both Zoic Studios and its clients time and money. Even a complex animatic composed of multiple, animated elements can be produced in only a few days. And because the client is able to sign off on so many elements of the final product while still in the 2D stage, Zoic saves time and effort, and can pass that savings along to the client.

Zoic has applied the technique to video game and commercial projects, and plans to offer advanced ripomatics to its feature film and television clients where appropriate. “We have just had more opportunities for video games to implement it,” Sternlicht explains, “because we often are responsible for direction and creative.

“I think it’s already being used [in TV and feature work]. The technique we’re using is a little more advanced than what is commonly done. But we’re really pushing our ripomatics more towards motion comics, than necessarily your standard edited storyboard. So, full animation of characters, full animation of vehicles, full animation of camera, full animation of effects. It’s really kind of the whole package.

“It’s part of our service. It’s part of working with Zoic and being creative.”

Zoic Breathes Life Into Cartoon Network’s ‘Ben 10: Alien Swarm’

Originally published on I Design Your Eyes on 11/27/09.

Ben 10: Alien Swarm

This week, Cartoon Network premiered Ben 10: Alien Swarm, its second live-action movie based on the popular animated children’s series Ben 10: Alien Force. Alien Swarm is the sequel to the first live action film, Ben 10: Race Against Time; both were directed by Alex Winter (Freaked, Fever).

Alien Swarm continues the story of Ben Tennyson, an ordinary boy who at age ten became part of a secret organization called “the Plumbers,” which fights alien threats. He possesses a wristwatch-like device called the Omnitrix, which allows its wearer to take the physical form of various alien species. Ben, now a teenager and played by 23-year-old Ryan Kelley (Smallville), defies the Plumbers to help a mysterious childhood friend find her missing father.

Winter, an experienced director more familiar to fans as an actor from the Bill & Ted films and The Lost Boys, chose effects supervisor Evan Jacobs (Resident Evil: Extinction, Ed Wood) to oversee the movie’s many effects sequences. Jacobs worked with Culver City, California’s Zoic Studios to produce character animation and particle work for a number of key scenes.

Ben as Big Chill, using his freeze breath.

Zoic worked on three main characters – Kevin “Kevin 11” Levin (Nathan Keyes, Mrs. Washington Goes to Smith), an alien-human hybrid who can absorb properties of matter; Ben’s cousin Gwen Tennyson (Galadriel Stineman, Junkyard Dog), another hybrid who manipulates energy; and Big Chill, one of Ben’s alien forms, a creature that breathes ice.

Zoic’s Executive Creative Director, Andrew Orloff (V, Fringe), says that for the production, the filmmakers chose to stay away from motion capture as “too limiting.” With all the jumping, flying and other stunt work that would be required, performers hanging from wires would not produce as realistic a result as traditional keyframing, in which every frame of a computer animation is directly modified or manipulated by the creator. “All the characters were traditionally keyframed and match moved by hand,” Orloff says.

Orloff collaborated with Winter and Jacobs to turn the Big Chill from the cartoon, a Necrofriggian from the planet Kylmyys, into a realistic, breathing 3D character. Working with a model created by Hollywood, California’s Super 78 Studios, Orloff developed character, motion and flying studies for Big Chill before the filmmakers ever hit the soundstage.

“It was very important to Alex [Winter] that we stay true to the original series, and give it a little something extra for the live action series that’s a real surprise for the viewers, to see their beloved cartoon characters finally brought to life,” Orloff says.

Gwen blasts the alien swarm, as Big Chill hovers nearby.

“Based on the visual choreography of the scenes, we didn’t really do previsualization as pre-development of the character. We talked about the way that [Big Chill] can fly, the maneuvers it could do; and that allowed Alex to have in his mind at the storyboard phase a good idea of what the kind of movement of the character was going to be.

“He’s a seven foot tall flying alien, so to create that realism was definitely a challenge. To take a two-dimensional character and turn it into a three-dimensional character, you have to maintain the integrity of the two-dimensional design, but make it look as if it’s realistically sitting in the environment. So we added a lot of skin detail, we added a lot of muscle detail and sinews; it was tricky to get the lighting of the skin exactly right. We just had to make sure that the skin had that ‘alien’ quality, so it didn’t look like a mannequin or an action figure. We wanted to give a realistic feel to the skin using Maya/mental ray to render that subsurface scattering.”

Much of the footage with Big Chill involved the character flying and fighting inside a warehouse. It wasn’t possible to shoot plates that would track exactly with the as-yet unrendered character, and the filmmakers could only guess how the character would move, and how quickly. So Jacobs provided Zoic with a variety of plates of a number of different moves, plus some very high resolution 360° panoramas of the warehouse interior. Zoic then used these materials to produce its own plates, rebuilding the warehouse from the set photos and creating the shots needed to flesh out the sequence. This process was time-consuming and difficult, as much of the blocking and choreography was highly detailed.

In addition to designing the character’s movements and rendering his actions, Zoic created the freeze breath effects for Big Chill. The character’s power required two kinds of effects. First, Zoic used heavy-duty particle and fluid simulations in Maya and mental ray to create the chunks of ice, smoke and liquid nitrogen that blast from Big Chill’s mouth. Then Zoic produced quite a bit of matte painting work to encase objects in ice, icicles and frost. These include the chip swarm tornado; the interior of the warehouse; and the villain, Victor Validus (Herbert Siguenza, Mission Hill).

Kevin, having taken on the properties of the metal girder, attacks the alien swarm.

The main antagonist in Alien Swarm is the alien swarm itself, a cloud of thousands of intelligent, flying alien chips that work together to harm the good guys.

The alien swarm was also created in Maya and mental ray. According to Orloff, “there needed to be thousands of chips that swarmed with a random yet directed attack. The idea that the chips were learning, so they would group together – first they try to go at Gwen and the kids, and Gwen blasts them away — then they reconfigure into a buzz saw and try to attack the kids that way — then they configure into a large tornado – you have to give a personality, but an evolving personality, to a swarm of objects.” In predevelopment, Zoic looked at fish schooling and insect swarming behaviors in nature, to give the swarm movement that felt organic without seeming contrived.
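The behavior Orloff describes maps closely onto the classic “boids” flocking model: each chip steers by a blend of cohesion with the group, alignment with its neighbors’ heading, and attraction to a target, with a little noise to keep the motion organic. The sketch below is that generic textbook approach in plain Python, not Zoic’s actual Maya/mental ray setup; a production version would also add per-neighbor separation so the chips avoid colliding.

```python
import random

class Chip:
    def __init__(self):
        self.pos = [random.uniform(-10, 10) for _ in range(3)]
        self.vel = [random.uniform(-1, 1) for _ in range(3)]

def step(swarm, target, dt=0.04):
    # Group statistics computed once per step.
    centre = [sum(c.pos[i] for c in swarm) / len(swarm) for i in range(3)]
    avg_vel = [sum(c.vel[i] for c in swarm) / len(swarm) for i in range(3)]
    for c in swarm:
        for i in range(3):
            cohesion = (centre[i] - c.pos[i]) * 0.05    # pull toward the group
            alignment = (avg_vel[i] - c.vel[i]) * 0.10  # match the group's heading
            seek = (target[i] - c.pos[i]) * 0.15        # the "directed" attack
            jitter = random.uniform(-0.5, 0.5)          # the "random" part
            c.vel[i] += (cohesion + alignment + seek + jitter) * dt
            c.pos[i] += c.vel[i] * dt

swarm = [Chip() for _ in range(200)]
for _ in range(100):
    step(swarm, target=(0.0, 1.7, 0.0))  # converge on a point, e.g. a character
```

Varying those weights over time is one way the “evolving personality” (loose cloud, buzz saw, tornado) could be layered on top of the same basic rules.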

Zoic also produced the effects for Kevin, who absorbs the properties of matter from objects. “Kevin was a big challenge,” Orloff says, “because what we ended up doing was scanning the actor; as he touched something we would put a CG version of the model over the top of him; rotoscope those few frames where the transition occurs; take that model and map it with whatever the material was – a rusty metal beam, a wood desk, a concrete floor. We rotoscoped the CG version over the top until the transformation was done, and then we transitioned from the rotoscoped animation, based on the actor’s performance, to a fully CG character animation.”

The energy manipulation effects for Gwen were “a ‘two-and-a-half-D’ effect, using 3D particle generators and 3D scene-tracked cameras in Adobe After Effects to create the energy bolts and energy fields that Gwen uses. We wanted to give it a ‘Jack Kirby’ kind of energy feel. So it has a lot of character to it, it looks very organic, and it affects the background objects and produces heat ripple effects.”

Frost effects in the warehouse. All of the frost and ice are VFX.

While Zoic was providing visual effects for the movie, Zoic’s Design Group worked directly with Vincent Aricco and Heather Reilly from Cartoon Network’s On-Air department, developing both the show and promo packaging for Ben 10: Alien Swarm. The package was used to promote the film both on-air and online – as well as in the Comic-Con preview this past summer.

Design Group Creative Director Derich Wittliff worked with Zoic’s internal production team, led by Producer Scott Tinter and Designer Darrin Isono, creating 3D environments and models based on the movie’s 2D logo and other references from the film. Elements were created in Maxon Cinema 4D, Autodesk Maya and Adobe After Effects. The final product was a show open and modular promo toolkit which allowed Cartoon Network’s in-house team to create custom endpages, IDs, bumpers, and other elements.

Because the Zoic Design Group worked under the same roof as the team that produced effects for Alien Swarm, they had access to the best elements available from the show, like the “swarm” effect itself, as soon as they were created. That allowed for an efficient process which produced finished elements for special uses – like Comic-Con – far in advance of customary production schedules.

Zoic Design Group Executive Producer Miles Dinsmoor says Zoic was excited to have the opportunity to work directly with Cartoon Network, acting as both a visual effects and digital production studio for the main production, and as a creative design shop for the promotional package, exploiting Zoic’s fully integrated media and design department. His goal is to offer Zoic’s in-house design and creative expertise industry-wide, and not just to Zoic’s existing VFX clients.

Orloff says he is proud of the work Zoic did on Ben 10: Alien Swarm, and looks forward to future collaboration with everyone involved – and hopefully, another Ben 10 movie.

More info: Ben 10: Alien Swarm at Cartoon Network; on Amazon.