SIGGRAPH, the international computer graphics conference, wrapped up last week in Vancouver, where thousands of attendees gathered to share new technical innovations in the worlds of film, TV, games, and more.
Our tech expert surveyed the booths and sat through numerous demonstrations; here are a few of the standouts that could well transform production in the years to come.
Diversity on Demand
Advocates for social justice have long argued that diversity, equity, and inclusion cannot be achieved by simply ticking a box. The engineering team at Pixar heard that complaint loud and clear and has responded with… knobs and sliders.
For this year’s hit film Turning Red, the technical directors developed an improved version of the studio’s crowd simulation tools, built on Pixar’s proprietary animation system, Presto. The film takes place in suburban Toronto and is inspired by the childhood of Canadian writer/director Domee Shi.
When Shi found that the mix of ethnicities in the crowds of students in the middle school scenes did not accurately reflect her own experience, she asked the team to tweak them accordingly. Typically, this would require the labor-intensive process of manually swapping out individual 3D characters in every shot. While time-and-budget-conscious producers might have rebuffed such a creative change in the past, the engineering team was able to update its crowd generation software to procedurally retarget the ethnic makeup of a crowd with the drag of a slider.
In the example presented at SIGGRAPH, a scene in a school hallway was filled with students, 24 percent of whom were of East Asian descent. After pushing the slider, some of the original students were automatically replaced with new Asian characters, bringing the total up to 39 percent, thereby satisfying Shi’s vision.
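The bookkeeping behind a slider like that can be sketched in a few lines: pick random crowd members who don't match the desired trait and swap in replacement characters until the target percentage is hit. The function and data layout below are hypothetical (Pixar's tool operates on full 3D character assets, not dictionaries), but the core idea looks something like this:

```python
import random

def retarget_crowd(crowd, tag, target_pct, rng=None):
    """Swap randomly chosen crowd members until target_pct of them
    carry `tag`. Hypothetical sketch -- not Pixar's actual tool."""
    rng = rng or random.Random(0)
    crowd = [dict(c) for c in crowd]          # don't mutate the input
    want = round(len(crowd) * target_pct / 100)
    have = [i for i, c in enumerate(crowd) if c["ethnicity"] == tag]
    others = [i for i, c in enumerate(crowd) if c["ethnicity"] != tag]
    for i in rng.sample(others, max(0, want - len(have))):
        crowd[i]["ethnicity"] = tag           # swap in a new character asset
    return crowd

# 100 extras, 24 of East Asian descent -- slide up to 39 percent
crowd = [{"ethnicity": "east_asian"} for _ in range(24)] + \
        [{"ethnicity": "other"} for _ in range(76)]
out = retarget_crowd(crowd, "east_asian", 39)
print(sum(c["ethnicity"] == "east_asian" for c in out))  # 39
```

The seeded random choice matters in production: re-running the tool should replace the same characters every time so shots stay consistent across iterations.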
While it’s difficult to believe that inclusion can be fully achieved with the click of a button, those buttons can work wonders in animation, as this development from Pixar’s engineering team shows. SIGGRAPH registrants who missed the Filmmaking Fever Dream: Crafting Turning Red panel can click here to watch the recording.
Cloud-Based Animation Studios
Hot off the announcement of its recent acquisition by Netflix, the Sydney-based post shop Animal Logic presented a case study for Nimble Studio, an entirely cloud-based production platform developed by Amazon. Like many companies during the pandemic lockdown, Animal Logic was forced to send its employees home to work remotely. Due to the massive amount of computing power required to render feature-quality animation, artists were unable to process shots on their own home machines. Even after lockdown requirements were lifted, visa restrictions limited the company’s ability to bring in the international workforce needed to staff its shows. While remote desktop software allowed access to company machines from home and saw heavy use across the industry during the pandemic, Animal Logic saw Nimble Studio as an opportunity to go one step further.
Nimble Studio exists fully inside the AWS cloud, and virtual machines can be created on demand. In the case of Animal Logic, the virtual workstations are primarily used to operate Unreal, the game engine that has gained popularity in the film industry as an animation and previsualization tool. Because Unreal renders in real time, animators can do their work in virtual reality. Instead of placing cameras in a scene via the somewhat unintuitive user interfaces traditionally used for the job, animators can frame shots and capture camera moves much the way a cinematographer would on a live-action set. Directors and supervisors can then review the work and make on-the-fly adjustments in virtual reality as well.
With computing done in the AWS cloud, lag times between the physical machines and the animators at home are reportedly lower than solutions where artists are remoting into machines at the office in Sydney, making the pipeline accessible to artists anywhere in the world. Since the machines are created and deleted on demand, Nimble Studio removes the need to buy and maintain expensive, leading-edge hardware, allowing a production to scale up or down much faster than a traditional animation studio can, hence the name.
To demonstrate the level of quality that can be output in its virtual studio, Animal Logic produced a short film titled Unhinged (watch the trailer below). The movie was completed primarily by a small team of generalist artists — another departure from traditional animation practices in which work is typically divided into hyper-specialized tasks.
Realistic Lighting in Virtual Production
Virtual Production is the most hyped trend in filmmaking today, but those who have firsthand experience shooting in LED light stages generally acknowledge that it still has significant shortcomings. One of the main drawbacks of shooting in these environments is the limited color and intensity output of the LED panels, which are expected to serve not only as a backdrop but also as a light source.
A pair of techniques aimed at solving these issues was submitted by a team of researchers at Netflix, headed by Dr. Paul Debevec. In the world of VFX, Debevec is considered a pioneer for his developments in 3D lighting and environments via high-dynamic-range, panoramic photography. Those contributions debuted in The Matrix, specifically during the “bullet-time” sequences.
More recently, Debevec and the Netflix team have devised a method for spreading the intensity of super bright light sources over a wider area of virtual environment images in order to work around the limited power of individual LEDs. This method allows virtual environments to more accurately illuminate their subjects instead of relying on the support of traditional stage lights. Click here for more about “HDR Lighting Dilation for Dynamic Range Reduction on Virtual Production Stages.”
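Conceptually, the dilation works by clipping any pixel brighter than the panels can display and pushing the clipped energy into neighboring pixels, so the total light falling on the subject is preserved even though no single LED exceeds its limit. Here is a deliberately simplified one-dimensional sketch of that idea; the published method operates on full panoramic HDR images and is considerably more sophisticated:

```python
def dilate_hdr(row, panel_max, iters=100):
    """Spread energy above panel_max into neighboring pixels so total
    light energy is preserved while no pixel exceeds the panel limit.
    Toy 1-D illustration, not the paper's actual algorithm."""
    row = list(row)
    for _ in range(iters):
        excess = [max(0.0, v - panel_max) for v in row]
        if not any(excess):
            break
        row = [min(v, panel_max) for v in row]    # clip to what LEDs can emit
        n = len(row)
        for i, e in enumerate(excess):
            # push half the clipped energy to each neighbor (reflect at edges)
            left = i - 1 if i > 0 else i + 1
            right = i + 1 if i < n - 1 else i - 1
            row[left] += e / 2
            row[right] += e / 2
    return row

row = [0.5, 0.5, 6.0, 0.5, 0.5]   # a "sun" pixel far above panel capacity
out = dilate_hdr(row, panel_max=2.0)
print(round(sum(out), 6), max(out) <= 2.0 + 1e-6)
```

The key invariant is energy conservation: the bright source becomes a larger, dimmer patch on the wall, but the total illumination it throws onto an actor stays roughly the same.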
The Netflix team has also put forth a method for fixing the problem of color accuracy when using LEDs as light sources. Because the LED panels in light stages emit only narrow bands of red, green, and blue, they can mix to create the appearance of almost any color, but they leave gaps in the spectrum between the R, G, and B wavelengths, which skews how real-world skin tones and materials photograph. The team’s second paper describes how to adjust the capture of images using these panels so that colors are reproduced accurately. Click here for more about “Jointly Optimizing Color Rendition and In-Camera Backgrounds in an RGB Virtual Production Stage.”
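A standard way to attack this kind of color shift, which the paper builds upon, is to photograph chart patches of known color under the LED wall and fit a matrix that maps the captured values back to ground truth. The sketch below fits an exact 3×3 matrix from three hypothetical patch measurements; the paper's joint optimization over color rendition and background appearance is far more involved:

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination."""
    M = [row[:] + [b[i]] for i, row in enumerate(A)]   # augmented matrix
    for c in range(3):
        p = max(range(c, 3), key=lambda r: abs(M[r][c]))   # partial pivot
        M[c], M[p] = M[p], M[c]
        M[c] = [v / M[c][c] for v in M[c]]
        for r in range(3):
            if r != c:
                M[r] = [v - M[r][c] * M[c][i] for i, v in enumerate(M[r])]
    return [M[r][3] for r in range(3)]

def color_correction_matrix(measured, target):
    """Fit a 3x3 matrix mapping camera RGB captured under the LED wall
    to ground-truth patch RGB. Exact fit for three patches; real
    calibrations use many patches and least squares."""
    return [solve3(measured, [t[j] for t in target]) for j in range(3)]

def apply_matrix(M, rgb):
    return [sum(M[j][i] * rgb[i] for i in range(3)) for j in range(3)]

# hypothetical measurements: the camera sees desaturated, cross-talked colors
measured = [[0.80, 0.15, 0.05],   # red patch as captured
            [0.10, 0.75, 0.10],   # green patch as captured
            [0.05, 0.20, 0.70]]   # blue patch as captured
target = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
M = color_correction_matrix(measured, target)
print(apply_matrix(M, measured[0]))   # recovers the pure-red target
```

In practice the correction can be baked into the content shown on the wall or applied in camera, which is why the paper treats color rendition of the subject and the in-camera background as a joint optimization rather than two separate fixes.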