Voice Of The Crew - Since 2002


PP-Scott Stokdyk/Spider-Man


by Mark London Williams

Viewers, actors, set and costume designers, directors: all of them know the line between live action and animation is collapsing at breakneck speed. Even author Salman Rushdie wondered whether a movie like The Perfect Storm, with real actors wayfaring on CG seas, should technically be classified as a cartoon.

That same boundary-blurring is occurring in the world of special effects. Not in terms of defining an effect itself, but in marking where production leaves off and postproduction begins.

Much of effects work, aside from stunts and makeup applications, was once the province of the postproduction phase: think blue screens, matte work, and the model of King Kong being moved frame by meticulous frame. Now, according to Scott Stokdyk, the visual effects supervisor on Spider-Man 2 (a job he reprised from the first Spider-Man film), “our job begins during the script-creating process.”

Not in the Hollywood sense so worrisome to writers, that is, rooms full of unsolicited “helping hands,” often recruited by the producer to “make it better,” but rather so everyone involved in telling Spider-Man’s story could understand both what was possible and what was desired.

Stokdyk explains how his immediate boss, the legendary visual effects producer John Dykstra, “was pitching ideas for the script,” letting the storytellers know, for example, how the arms of Doc Ock, the villain played by Alfred Molina, would move, and what the “rationale behind them” would be.

One of the first things Stokdyk himself does is break down that same script; then, on the practical side, “my supervisors and I start to determine how much everything costs.” At that point, the effects team starts forming preliminary ideas on how to create the called-for visuals, and where their work picks up from the “functional” shots.

From there, Stokdyk and his team move out of preproduction and straight into production itself, coordinating with the various on-set stunt people, the “guys who do explosives,” and anyone else whose work might be created live, for the camera itself.

Digital technology has progressed, however, to the point where there can actually be some give-and-take. “Our input is in terms of how hard things are to do, and what, visually, will look the best,” says Stokdyk. Surprisingly, in spite of the early script consultations, “a lot of things get made up on the day of the shoot.” For example, “they may try to shoot a stunt and be unable to do it,” and wonder if the image could be done in CG instead. Stokdyk’s team cheerfully tries to “give any sort of creative input we can.”

Another example of such input: advising Molina on body language he could use when acting in a scene where his sentient tentacles will be added later.

But the tentacles provide a handy metaphor, as Stokdyk asserts that one of the biggest changes in the special effects world, contrasted with the first Spider-Man film, or even earlier work on films like Stuart Little, where “much of the design and concept had to be planned well in advance,” is the increased amount of “wiggle room” for creating or refining images on the fly once the actual postproduction phase comes around.

“In the first Spider-Man, we learned that if we can control a virtual environment and a virtual character,” there’d be “incredible freedom in post.” And given that most freedom has a price, the bill for this kind was relatively cheap, especially if you weren’t one of the actors.
For it was incumbent on Tobey Maguire and Molina to head down to USC’s Institute for Creative Technologies and sit still, really, really still, while going through an MRI-like procedure that mapped their heads.

While not lasting as long as an MRI, the “Light Stage” process uses computer-driven cameras, humming along at 60 frames per second and synced to strobe lights, to capture hundreds of images of the actors’ faces and heads in every conceivable lighting situation (bright light on the face, shadowy light under the chin). All of this is then translated into digits and used in “virtual acting” situations later.

This would also have its effect on the post process: “If Sam [Raimi, the director] and his editors needed a shot,” Stokdyk says, the effects team could “create a new shot from scratch” to use as a bridging shot. Compared to the days when Steven Spielberg, say, had to violate union rules and bootleg pick-up shots for Jaws in Hollywood swimming pools, things have come a long way.

But Stokdyk is very clear that while technology allows virtual landscapes to stand in for real ones, and computed camera moves to go beyond what cranes and Steadicams can do, it still “takes a village,” to borrow a phrase from the political arena, to make ones and zeroes come to life. And Stokdyk is quite up front about crediting the team he worked with.

Among them were John Manos, who worked with John Wallace on making Doc Ock’s virtual tentacles spring to life; Jeff Stern, who worked specifically on the virtual “Tobey model,” which is to say, the Webslinger himself; and Dan Abrams, who worked on the 3D modeling, creating “a bigger library of buildings” over and around which that virtual Spidey could travel.

That team, and the many more whom space prevents us from naming individually, all participate in “decisions,” as Stokdyk describes them, that if “not made 100-percent correct will give you a clue,” a look, that the shot is fake.
And the decisions are as small as knowing how much green bounces off Doc Ock’s coat onto his skin, or how much fat an actor’s cheek might have in it, determining how much shine a reflected light might have.

All of this contributes to what Stokdyk calls “a fluid process” between production and post, one that also included cinematographer Bill Pope coming in to consult, to better match the rendered shots with the ones that were actually, well, shot.

“A lot of the creative process is being pushed to the editing stage,” Stokdyk agrees, referring to the phase of filmmaking most traditionally associated with “post.” “We have to be responsive to that in visual effects.”

Written by Mark London Williams
