
Voice Of The Crew - Since 2002

Los Angeles, California


Postproduction: Eric Brevig on The Island


Eric Brevig and director Michael Bay both have a penchant for what Brevig calls “photorealism”—making mayhem, whether historical or futuristic, look like it really happened, just that way. And further, in Brevig’s words, looking like it was filmed “with real photo equipment.”

Whether he’s serving as both visual effects supervisor and second-unit director, as he did on Bay’s Pearl Harbor (which nabbed him an Oscar nomination), or working on one of the Men In Black installments, or anything else in his long list of credits, Brevig likes to craft shots that look as if a camera was actually placed somewhere, rolling real film on whatever comes out of his computer in the form of digits.

In Pearl Harbor, for example, the famous shot tracking a Japanese bomb from its release through its plummet to the target kept looking like a “cartoon”—framed and referenced from another bomb’s POV—until Brevig decided to structure the sequence as if it were being filmed by a parachutist who jumped at the same time the bomb was released, his camera shaking and buffeted by the wind. Though, Brevig laughs, there remains the question of whether there was enough implied time for the parachutist to actually open his chute.

But anchoring a shot in a theoretically real pair of hands helps Brevig “accentuate the stuff that’s bulletproof,” which in turn lets him “get away with the stuff that’s dodgy,” and so the bomb sequence is justified—at least according to the logic, the vocabulary, of film-watching. “Everyone who’s watched movies knows what the limits are,” he says.

As second-unit director on Pearl Harbor, Brevig also hopped into a helicopter and flew over the present-day Pearl Harbor, giving himself a “roadmap” he later filled in with digital recreations of exploding bombs and sinking battleships. The aerial footage served as “background plates for everything in the shot,” helping to sell it as if chopper footage had been available on that “day of infamy.”

Brevig found himself back in a helicopter while working on The Island, a futuristic clones-on-the-run thriller starring Ewan McGregor and Scarlett Johansson. This time, though, he wasn’t flying over Hawaii or any other actual island, but over the streets of Los Angeles.

And what was being visualized wasn’t the past, but a “futuristic” LA, replete with “a lot of aerial mass transit,” including overhead buses running on a wire system, monorails, and the like. And while any depiction of Los Angeles with workable transit automatically casts a film as “fantasy,” Brevig once again wanted the shots in a particular chase sequence to be as believable as possible, with flying near-future motorcycles “dodging (both) traffic on the streets and aerial obstacles.”

For The Island’s background plates, Brevig had not only a helicopter but a “camera car with a crane mounted to the roof,” and used them to intercut with the blue-screen work. “A great combo,” he asserts, which, once the stunt work is rolled into the mix, draws on just about “every technique you can come up with.”

Indeed, blending those techniques is how Brevig sees his job: “sort of the conductor. I don’t have to get involved with what instruments the musicians are using.” Instead, his task is “dissecting what doesn’t look right—that is something I have to worry about,” not whether a scene was built using RenderMan, or Maya, or what have you. Besides, Brevig adds, “most high-end programs have elements written at ILM, anyway,” his home base, so it’s always possible to rummage around a particular digital toolbox if he has to.

Though he does single out ILM’s Zeno software, sometimes called “Zenviro,” since it makes digital “Zenvironments”—used in more than 500 shots for Revenge of the Sith. The software, in Brevig’s description, gives “digital artists access to animating and lighting modules,” among many other things, allowing them more creative authorship when building shots or finishing worlds that may initially be realized as a single matte painting.

But when an effects wrangler is fiddling with the lighting, will the DP be aggrieved? Is this more evidence of the collapsing boundaries between production and postproduction, to say nothing of further digitally caused muddling of on-set job descriptions?

Brevig acknowledges that “the editors and I are having a polite session of stepping on each other’s toes,” as they tinker with some of the composited shots he sends them, and he, in turn, noodles around with edited sequences. But ultimately, Brevig considers such cross-pollination an “engaging way to be able to work with people, a great advantage,” a way past a “factory”-style setup in which someone is charged with one task only, with no need to think about the other widgets that go into what you’re making.

Of course, Brevig was still working on the movie a scant three weeks before its release when Below the Line caught up with him—a result of what he terms an “extremely ambitious postproduction schedule” imposed from above, married to the fact that the film “didn’t necessarily have final script pages” done when production started. Under those conditions, cross-pollination becomes not only an aesthetic but perhaps a necessity.

And anyway, he adds, “if you’re good at what you’re doing, you don’t have to be threatened” by all the collapsing boundaries in the digital sandbox. Indeed, given that “the creativity and decision making” still needed for film work can’t happen at faster-than-human speed, in spite of collapsed—or “ambitious”—post schedules, who can afford the umbrage?

Written by Mark London Williams

