Fry the Cheerleader and Other Imperatives of the "Heroes" VFX Team

Tokyo Rising

"Toward the end of season one, we depicted the nuclear annihilation of New York City, so we thought we had some precedent for this scene of Tokyo ripped in two," says Mark. And though the Heroes VFX team is often able to concoct "cocktails," or recipes, for executing certain digital effects over and over, this was not one of those times. "For starters, this was a ground-level view of the destruction, the city literally being ripped in two, like a giant earthquake coming down the middle of the street, toward camera, destroying everything in its path—cars flying, people fleeing—a very different perspective from the long shots of the New York skyline incinerating." Everyone was in agreement that the scene had to be big, dramatic, and as realistic as possible—on a tight budget and schedule.

These screens show the nuclear annihilation of virtual New York in the background, and the composite with the live-action actor in the foreground. Images courtesy of NBC Universal and Stargate Digital.

"We thought about starting with plates of a practical location, like Little Tokyo in Los Angeles," says Eric, "but quickly realized that, in order to render the scene effectively, our Tokyo would have to be CG, so we set about building a virtual landscape in Maya." Since Tokyo doesn't have a singular distinguishing landmark like the Eiffel Tower in Paris or the Golden Gate bridge in San Francisco, the team settled on the city's notorious neon signs (nearly a lead character in Lost in Translation), as signifiers. "We picked an actual street in Tokyo to recreate, and used photographic references from books to build a basic 3D architecture," adds Mark, "then found similar buildings in Los Angeles and photographed them as models for textures" (paint color, cracks, etc). Stargate's 3D department populated the place with cars and other detritus, while the show's art department created non-copyrighted, original signage.



The live-action plate, with characters "Hiro," "Ando," and about fifty extras, was shot against a green screen. Green screen refers to a process in which foreground subjects (in this case, Hiro, Ando, and the extras) are photographed in front of an evenly lit, bright, pure green (or blue) background. That background is later replaced entirely with a digital image known as the background plate, which is composited (or digitally blended) with the foreground in post-production using a software program like After Effects. As Eric notes: "The reason green and blue are used is that these are the only two hues not found in human skin tones." Sometimes, if a character's costume is green, the background will change to blue, and vice versa.
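
For the technically curious, the keying idea itself can be sketched in a few lines. The example below is a bare-bones Python illustration using NumPy and Pillow, nothing like Stargate's After Effects pipeline, and the file names and green threshold are illustrative assumptions.

# A minimal chroma-key sketch: replace strongly green pixels with the background plate.
import numpy as np
from PIL import Image

def chroma_key_composite(foreground_path, background_path, spill=1.1):
    """Composite a green-screen foreground over a background plate."""
    fg_img = Image.open(foreground_path).convert("RGB")
    bg_img = Image.open(background_path).convert("RGB").resize(fg_img.size)
    fg = np.asarray(fg_img, dtype=np.float32)
    bg = np.asarray(bg_img, dtype=np.float32)

    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # Treat a pixel as "screen" when green clearly dominates both red and blue.
    matte = (g > spill * r) & (g > spill * b)

    comp = fg.copy()
    comp[matte] = bg[matte]  # pull in the background plate wherever the screen showed
    return Image.fromarray(comp.astype(np.uint8))

# Hypothetical usage:
# chroma_key_composite("hiro_greenscreen.png", "cg_tokyo_plate.png").save("comp.png")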

The VFX team considered shooting a practical plate in Los Angeles' Little Tokyo, but decided to create a CG city, first by pre-visualizing, then shooting Hiro and Ando against a green screen, then compositing the actual actors and virtual destruction. Image courtesy of NBC Universal and Stargate Digital.

Mark credits the costume department with accurately outfitting all the extras, whom the VFX team then duplicated into fleeing CG extras that interact realistically with the environment as it's being destroyed. This scene was made possible by Massive, a software package that creates thousands—even millions—of "agents" (digital extras) who act as individuals through the use of fuzzy logic (programming that uses approximate rather than precise reasoning). "As the street erupts and all the buildings begin to crumble," says Eric, "we also had to think about how the electrical grid would respond in real life, and that, in turn, affected our virtual lighting of the scene as blackouts roll through the city."
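
Massive's actual brains are proprietary, but the fuzzy-logic idea can be shown with a toy agent: instead of switching between hard if/else rules, it rates "the chasm is close" and "a neighbor is close" as degrees between 0 and 1 and blends the resulting steering. The Python sketch below is an illustration under those assumptions, not Massive.

import math
import random

def closeness(distance, comfortable, critical):
    """Fuzzy membership: 0.0 beyond `comfortable`, 1.0 inside `critical`, graded between."""
    if distance >= comfortable:
        return 0.0
    if distance <= critical:
        return 1.0
    return (comfortable - distance) / (comfortable - critical)

class Agent:
    """One digital extra; x runs across the street, y runs along it."""
    def __init__(self, x, y):
        self.x, self.y = x, y

    def step(self, chasm_x, neighbors, dt=0.04):
        # How threatening is the chasm, and how crowded are we? Both are degrees, not booleans.
        danger = closeness(abs(self.x - chasm_x), comfortable=30.0, critical=5.0)
        nearest = min((math.hypot(self.x - n.x, self.y - n.y) for n in neighbors), default=1e9)
        crowded = closeness(nearest, comfortable=2.0, critical=0.5)

        # Blend behaviors by their degrees instead of picking a single rule.
        flee = 8.0 * danger * (1.0 if self.x > chasm_x else -1.0)  # run away from the split
        sidestep = 1.5 * crowded * random.uniform(-1.0, 1.0)       # shuffle around neighbors
        self.x += (flee + sidestep) * dt
        self.y += (2.0 * danger + random.uniform(-0.3, 0.3)) * dt  # keep moving down the street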

As the split rips up the pavement, cars fall into the chasm. One car gets hurled right at Hiro. And, of course, buildings collapse. "We studied footage of buildings collapsing during earthquakes and we all saw the Twin Towers tragically fall on 9/11," says Mark. Interestingly, one element that was difficult to get right was the dust kicked up in the wake of a building's collapse. "We know what real dust looks like, but we needed lots of trial and error, many computer simulations of various particle systems, plus varying degrees of wind movement and gravity, in order to render it realistically."

This sequence started with the live-action plate of actor William Katt, then transitioned to the plate with a matte painting of ice and frost, and finally to an entirely CG digital double. Clip courtesy of NBC Universal and Stargate Digital.

(Particle systems are a method used in 3D computer graphics to simulate environmental effects such as rain, explosions, smoke, or dust. A particle system is made up of many small objects, or particles, each with properties such as position, velocity, and color, and it applies the same set of rules to every particle, often interpolating values like color or transparency over the particle's lifetime.)
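
A toy version of such a system might look like the Python sketch below: each dust particle carries a position, velocity, color, and lifetime, and the emitter applies the same gravity, wind, and fade rules to all of them every step. The constants are illustrative, not the production simulation.

import random

GRAVITY = (0.0, -9.8, 0.0)   # pulls the dust down
WIND    = (1.2,  0.0, 0.3)   # pushes the cloud sideways

class Particle:
    def __init__(self, pos):
        self.pos = list(pos)
        self.vel = [random.uniform(-1, 1), random.uniform(2, 6), random.uniform(-1, 1)]
        self.life = self.max_life = random.uniform(2.0, 5.0)  # seconds until it dies
        self.color = [0.6, 0.55, 0.5, 1.0]                    # dusty RGBA; alpha fades out

class DustEmitter:
    def __init__(self, origin, rate=200):
        self.origin, self.rate = origin, rate
        self.particles = []

    def step(self, dt):
        # Spawn new dust at the emitter (say, the base of a collapsing wall).
        self.particles += [Particle(self.origin) for _ in range(int(self.rate * dt))]
        for p in self.particles:
            for i in range(3):                          # the same rule hits every particle
                p.vel[i] += (GRAVITY[i] + WIND[i]) * dt
                p.pos[i] += p.vel[i] * dt
            p.life -= dt
            p.color[3] = max(p.life / p.max_life, 0.0)  # interpolate alpha over the lifetime
        self.particles = [p for p in self.particles if p.life > 0]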

Despite all the simulations, though, the element wasn't quite right. According to Mark: "Usually, we're augmenting the practical with the virtual, but in this case, we had to augment our digital dust with stock footage of the real thing." The resulting dust composite was, well, magical. All the elements came together for a stunning sequence on which the team spent 100 man-days, and which took weeks to render on a dedicated render farm, even with optimization (by not rendering the non-camera-facing polygons).
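
That optimization, commonly known as backface culling, comes down to a dot product: if a polygon's outward normal points away from the camera, it can't be seen and needn't be shaded or rendered. The Python sketch below illustrates the test on raw triangle data; it is an illustration of the general technique, not Stargate's renderer.

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def dot(a, b):
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

def faces_camera(v0, v1, v2, eye):
    """True if a triangle with counter-clockwise winding faces the eye point."""
    normal = cross(sub(v1, v0), sub(v2, v0))   # outward normal implied by the winding order
    to_eye = sub(eye, v0)                      # direction from the surface back to the camera
    return dot(normal, to_eye) > 0.0

def visible_triangles(triangles, eye):
    """Drop back-facing triangles before they reach the expensive shading step."""
    return [tri for tri in triangles if faces_camera(tri[0], tri[1], tri[2], eye)]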

Asset Management: The SQL

Both Mark and Eric credit Stargate Films' founder, Sam Nicholson, for having the vision to develop not only the studio and sound stage but also a proprietary digital pipeline and content management system that lets the VFX team deliver motion-picture-quality, Emmy®-nominated effects on episodic television budgets and schedules. "It's such a cool system," Mark raves. "When an artist finishes a shot, a copy of it pops up on my desktop. I can see, right away, whether the green screen is right, or the color balance or tracking is off." The same shot pops up on Eric's screen, and he can read Mark's notes in real time and offer his own feedback.

Stargate's Adam Ealovega says the CMS evolved organically as assets became digitized and the need to organize them online grew. "We started with a FileMaker database, which required lots of human input," Adam remembers. Of course, wherever there are humans, error is sure to follow. "When we were at a quiet stage in the project pipeline, files got updated and put in the right folders," Adam says. But as critical deadlines approached and workflow sped up, updating the database became less of a priority. "When artists went to retrieve assets, file information was missing or out-of-date," Adam recalls, "eroding trust in the system." In creating Stargate's ever-evolving proprietary asset management system (using Microsoft® SQL), Adam and the Stargate technology team's objective was to make everything as automated as possible.

For instance, file naming conventions have been automated. The first iteration is "name1," the second iteration "name2," and so forth. "It used to be that an artist might think he was working on the final iteration and give it a '.final' extension," says Adam, "which was okay until another artist had a new version of the same file and named it '.final.final,'" he chuckles. "I just thought, what's next? '.noireallymeanitthistime'?" In addition to consistent naming conventions, the asset management system hooks into primary applications like Maya and After Effects so that when an artist accesses a file, all the updating of the database is done behind the scenes. Most impressively, once a visual effects shot is returned from the render farm, a copy pops up on the desktop of every person who has touched it, for review. According to Adam: "We estimate our system has saved countless man-hours that, for us, are better spent focusing on the work."
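
A rough Python sketch of those two automations follows: version numbers are appended automatically, and the database is updated behind the scenes on every save. Here sqlite3 stands in for Stargate's Microsoft SQL back end, and every table, column, path, and file name is hypothetical.

import os
import re
import sqlite3

def next_version_path(directory, base, ext):
    """Return the next iteration name, e.g. shot042_comp3.ma after shot042_comp2.ma."""
    pattern = re.compile(re.escape(base) + r"(\d+)" + re.escape(ext) + r"$")
    versions = [int(m.group(1)) for f in os.listdir(directory)
                if (m := pattern.match(f))]
    return os.path.join(directory, f"{base}{max(versions, default=0) + 1}{ext}")

def register_save(db_path, shot, artist, file_path):
    """Record the new iteration so the asset database never goes stale."""
    with sqlite3.connect(db_path) as db:
        db.execute("""CREATE TABLE IF NOT EXISTS asset_versions
                      (shot TEXT, artist TEXT, path TEXT,
                       saved_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP)""")
        db.execute("INSERT INTO asset_versions (shot, artist, path) VALUES (?, ?, ?)",
                   (shot, artist, file_path))

# Hypothetical usage, as an application hook would call it on every save:
# path = next_version_path("shots/042", "shot042_comp", ".ma")
# register_save("assets.db", "042", "artist_a", path)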

The company has also developed a new green-screen technology called VBLive (for Virtual Backlot Live), a tool that lets filmmakers replace green screens with a digital background plate of any environment—stock footage or 3D—that can be composited and recorded (through a digital camera) in real time. "We're having great success with this system in daytime drama," Mark reports, "helping shows create high-quality effects like car crashes and tornadoes in record time. We just completed 380 effects shots for All My Children." And that is downright, well, heroic.