
Voice Of The Crew - Since 2002


Visual Effects Contenders


Not surprisingly, this year’s F/X bake-off is shaping up to be a showdown between The Lord of the Rings: The Two Towers (the first installment, The Fellowship of the Ring, won last year’s contest) and Star Wars: Episode II – Attack of the Clones. Other films eligible for the visual-effects nominations include Harry Potter and the Chamber of Secrets, Men in Black II, Minority Report, Spider-Man and xXx.

Members of the visual-effects nominating committee will view 15-minute clip reels from each movie Feb. 5 and will pick three nominees, which will be announced on Feb. 11. The Oscars will be awarded March 23.

This year’s lineup includes technological breakthroughs in skin and cloth simulation, more dazzling water and fire sequences, digital characters and ever-present digital doubles.

Rings director Peter Jackson and his New Zealand-based effects house Weta Digital are back with a dazzling sequel that includes more shots of the fiery Balrog monster, legions of Orcs, and a host of new, incredibly challenging digital feats.

The Two Towers boasts the extraordinary hybrid performance of Andy Serkis as Gollum/Smeagol, a mutated hobbit: Serkis interacted with the live actors on set and also drove the character through a motion-capture system. Gollum has generated the most buzz of any single element in the movie, followed by Treebeard, a likeable talking tree.

Another sequel – offering up some new and improved effects of its own – is Warner Bros.’ Harry Potter and the Chamber of Secrets.

As in Rings, Potter features its own CG-animated character, Dobby, the house elf. Industrial Light & Magic worked on 250 of the film’s 950 effects shots, concentrating on Dobby and improving upon the first film’s Quidditch match, creating a more lifelike, faster-paced game. Using new tools that make it quicker and easier to transfer motion-capture data, visual effects supervisor Bill George took the match to a new level.

“One of my personal goals was that I wanted to feel that [the Quidditch match] was a game,” says George, who researched the choreography by watching DVDs of football, basketball and baseball games. “In all of those games, the ball is the center of attention, and that’s what I wanted to show: all the characters chasing the ball.” One big change for this film’s match was that the actors were shot wild, rather than steady and stable, giving the scene a more realistic quality. The scene also made heavy use of digital doubles.

For both Dobby and the Quidditch game, cloth simulation figured prominently, including a first-time creation of double-sided fabric for the capes of the Quidditch players.

“All the cloth in the Quidditch match is fabricated to make it look realistic and make the fabrics move right, including the canvas covering of the towers,” says George. “We set up 1,000-frame simulations, had them done for each shot – except for one, where the bludger pops through the fabric of the tower; we did a simulation specifically for that.” 
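George doesn’t detail the solver, and ILM’s production cloth tools are proprietary, but the general idea behind such simulations can be sketched with a classic mass-spring model: a grid of particles advanced under gravity, with spring constraints pulling neighbors back toward their rest length. The sketch below is purely illustrative (grid size, timestep and spring layout are all hypothetical), not ILM’s method:

```python
# Minimal mass-spring cloth sketch (illustrative only; production
# cloth solvers handle bend/shear springs, collisions and far more).
# A grid of particles is advanced with Verlet integration under
# gravity; a relaxation pass enforces spring rest lengths.

GRAVITY = -9.8   # m/s^2, acting on the y-axis
DT = 1.0 / 24    # one film frame per step (hypothetical step size)

def make_cloth(rows, cols, spacing=0.1):
    """Return particle positions [(x, y)] laid out in a grid."""
    return [[(c * spacing, -r * spacing) for c in range(cols)]
            for r in range(rows)]

def step(pos, prev, pinned):
    """One Verlet step: carry velocity, apply gravity, hold pins."""
    rows, cols = len(pos), len(pos[0])
    new = [[None] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            x, y = pos[r][c]
            px, py = prev[r][c]
            nx = x + (x - px)                      # implicit velocity
            ny = y + (y - py) + GRAVITY * DT * DT  # gravity term
            new[r][c] = (x, y) if (r, c) in pinned else (nx, ny)
    return new, pos

def relax(pos, spacing=0.1, iters=10):
    """Enforce rest length between horizontal neighbors (vertical
    and diagonal springs omitted for brevity)."""
    for _ in range(iters):
        for r in range(len(pos)):
            for c in range(len(pos[0]) - 1):
                (x1, y1), (x2, y2) = pos[r][c], pos[r][c + 1]
                dx, dy = x2 - x1, y2 - y1
                d = (dx * dx + dy * dy) ** 0.5 or 1e-9
                corr = (d - spacing) / d * 0.5
                pos[r][c] = (x1 + dx * corr, y1 + dy * corr)
                pos[r][c + 1] = (x2 - dx * corr, y2 - dy * corr)
    return pos
```

A 1,000-frame run like the ones George describes would simply call `step` and `relax` in a loop for 1,000 iterations, caching the result per shot.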

George and his team used cloth-simulation technology that was developed for Episode II – Attack of the Clones. ILM completed all of the film’s 2,000 shots, with an extra 200 that wound up on the DVD version.

F/X supervisors Pablo Helman, John Knoll, Dennis Muren, Ben Snow and animation supervisor Rob Coleman worked to deliver a digital Yoda, among other completely computer-generated characters, sets and sequences.

One breakthrough for the teams, from production through post, was shooting the film in HD. As one of the first features shot this way, it sped the assembly of the film and gave the team greater creative control over everything from editing to color timing, says Helman. “By being involved in color timing we have a say in how the images are color-corrected and we can say what we meant; there’s a great range of control.”

Many of the film’s sequences were incredibly complicated, layering hundreds of effects, including the fight between Obi-Wan Kenobi (Ewan McGregor) and Jango Fett (Temuera Morrison) on the rain planet Kamino, which included digital environments and digital doubles. During the sequence where Yoda fights Count Dooku (Christopher Lee), not only is Yoda completely digital, but in many of the shots actor Lee is as well.

“You may ask why so many films have digital doubles,” says Helman, who supervised 800 shots and 72 minutes of the film, including the Yoda fight, sequences on the rain planet, Tatooine and Naboo. “Doing it for the sake of doing it is not a good enough reason. … I personally think of using digital doubles when a performer can’t do something because it’s dangerous, or a camera needs to be in a place it can’t possibly be, or there’s a scheduling problem and you can’t get an actor in a location.”

The ability to create digital doubles, in turn, puts characters in precarious situations live actors could not tackle themselves. Take possible contender Spider-Man, in which Tobey Maguire’s CG stunt double tumbles through a digitally rendered New York.

For Minority Report, which featured a futuristic world made of digital sets and props – including robotic spiders and holographic simulations – five effects companies, including ILM, combined efforts. One of the film’s breakthroughs was a new use for photo modeling in crafting the look of the holographs of John Anderton’s (Tom Cruise’s) wife and child.

“We picked a technique of photo modeling – photo geometry – and used an image-based rendering system. Instead of capturing the environment, we captured the individual,” says Barry Armour, ILM’s visual effects supervisor for Minority Report, whose primary focus was the alley-chase sequence, the look of the Hall of Containment and the magnetic-levitation system for the film’s highway sequences. For the holographs, Armour’s team captured the shot with Lara Clarke (Kathryn Morris) on an immense green screen, using 11 cameras for positional reference. The setup let them build geometry on a frame-by-frame basis to give the shots a realistic look, while still allowing the camera to move around and see a rough profile. “We purposely degraded the image to give it a hacked look,” he says. “I think it’s the first time those techniques were used to capture a moving subject.”

Armour and his team used the same general technique for the bodies in storage in the Hall of Containment sequences, where 15 cameras were arranged around extras playing prisoners, and a setup was built that mimicked the lighting in the pods.
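ILM’s image-based capture rig isn’t documented in detail here, but multi-camera capture of this kind rests on triangulation: each calibrated camera defines a ray toward the subject, and a 3D point is recovered where the rays nearly intersect. Below is a simplified least-squares (midpoint-style) sketch of that core step; the camera positions and the two-function API are hypothetical, not ILM’s pipeline:

```python
# Recovering a 3D point from multiple calibrated camera rays
# (illustrative only). Each ray is (origin, unit direction); the
# solver finds the point minimizing summed squared distance to all
# rays, via the normal equations  sum(I - d d^T) p = sum(I - d d^T) o.

def triangulate(rays):
    """rays: list of (origin, direction) 3-vectors, direction unit
    length. Returns the least-squares intersection point [x, y, z]."""
    A = [[0.0] * 3 for _ in range(3)]
    b = [0.0] * 3
    for o, d in rays:
        for i in range(3):
            for j in range(3):
                m = (1.0 if i == j else 0.0) - d[i] * d[j]  # I - d d^T
                A[i][j] += m
                b[i] += m * o[j]
    return solve3(A, b)

def solve3(A, b):
    """Gauss-Jordan elimination with partial pivoting, 3x3 system."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][3] / M[i][i] for i in range(3)]
```

With 11 or 15 cameras, as in the setups Armour describes, the same normal-equations accumulation simply runs over more rays, which makes the recovered position more robust to noise in any single view.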

“I’m real pleased we achieved what we did with photo modeling and image-based rendering,” he says, adding that techniques his team had seen presented in academic settings at SIGGRAPH have now been applied to film.
