Erik Weaver is the Director of Adaptive Production and Special Projects at USC’s Entertainment Technology Center (ETC). At this year’s NAB conference, Weaver brought together a group of virtual production experts to discuss both the benefits and the pitfalls of virtual production.
The group included AJ Wedding, the Co-Founder of Orbital Studios, where he currently serves as Director of Virtual Production; Addy Ghani, the Vice President of Virtual Production at Disguise; Trusted Partner Network’s Kari Grubin, who’s leading an SMPTE initiative on virtual production; and Ben Baker, the Co-Founder of virtual line production company Mesh, all of whom shared their expertise with NAB attendees.
Weaver kicked things off by asking the panelists whether virtual production saves money. The answer was a resounding yes, no, and maybe.
“Compared to what?” asked Wedding. His company worked on the fifth season of the cable series Snowfall, using virtual production instead of locations and saving an estimated $500,000 for the season.
Ghani reported that he watched a taxi cab scene in Manhattan being shot with virtual production. “They were only using a slice of it and I asked if they were really saving money,” he recalled. “The answer was, ‘yes,’ compared to shutting down a street in Manhattan, but ‘no’ compared to a greenscreen.”
Grubin pointed out that, like everything in production, “it depends [on] how disciplined you are.” Baker agreed: “There’s nothing an LED wall can do to save you money. What saves you money is having a crew that knows what they’re doing. All that pre-organization is what saves you money.”
Weaver asked about 2D, 2.5D, and 3D virtual production content. Wedding said he’s been doing a lot of 2.5D. “It’s easier in car scenes as opposed to a process trailer [moving camera] and shutting down streets,” he said. “Also, the workflow doesn’t change that much — you’re able to work on scripts that come in the week of.”
Ghani noted that for shooting an outdoor environment with trees, one solution is to put a camera on a beautiful landscape, put it on a volume, and build the world in Unreal. “Or you can take the 2D video and slice [it] into layers of depth to create parallax,” he said. “Our eyes are super sensitive to the lack of parallax, and this is now a very popular option.”
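The depth-slicing idea Ghani describes can be sketched with a simple pinhole-style model: under such a model, apparent image motion falls off as 1/depth, so each layer cut from the plate is offset inversely to its depth as the camera translates, restoring the parallax our eyes expect. The function name, units, and the model itself are illustrative assumptions, not part of any real volume pipeline.

```python
# Hypothetical sketch of 2.5D parallax: a flat plate is cut into depth
# layers, and each layer slides inversely to its depth as the camera moves.
# The pinhole-style 1/depth falloff is an assumed simplification.

def parallax_offset_px(camera_shift_px: float,
                       layer_depth_ft: float,
                       reference_depth_ft: float) -> float:
    """Horizontal offset for one layer, relative to the reference plane.

    Layers nearer than the reference plane slide farther than it does;
    distant layers barely move, matching what the eye expects.
    """
    return camera_shift_px * (reference_depth_ft / layer_depth_ft)

# A 10 px camera move, with the plate's reference plane locked at 24 ft:
for depth in (6.0, 12.0, 24.0, 96.0):
    print(f"{depth:5.1f} ft layer -> "
          f"{parallax_offset_px(10, depth, 24.0):.1f} px")
```

In this toy model a layer at 6 ft slides four times as far as the 24 ft reference plane, while a 96 ft layer barely moves, which is why the lack of such relative motion on a flat plate reads as fake.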
Baker said what he learned on the ETC production of Fathead is that the kind of assets in a shot depend on where they are in the depth. “If you think of 12 feet into the environment, those assets have to be 3D. If you move the camera, you’ll ramp around those objects,” he said. “12 to 24 feet is 2D objects and you get parallax. But from 24 feet onward, it’s matte paintings because you don’t get parallax at that distance.” The difference, he added, is based on budgetary concerns. “The first 12 feet are photogrammetry assets, but you don’t want to build 3D Unreal assets to the horizon because your computer will choke and die.”
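Baker's rule of thumb can be written down as a simple depth-to-asset-type lookup. The 12 ft and 24 ft thresholds are his; the function name and tier labels are hypothetical illustrations, not from any real pipeline.

```python
# A minimal sketch of Baker's depth-tier rule from Fathead: choose the
# asset type by distance from the camera. Thresholds are from the panel;
# everything else here is an illustrative assumption.

def asset_tier(distance_ft: float) -> str:
    if distance_ft < 12:
        # Near field: the camera can move around these objects,
        # so they must be true 3D (e.g., photogrammetry assets).
        return "3D asset"
    if distance_ft < 24:
        # Mid field: flat 2D cards still yield believable parallax.
        return "2D card"
    # Far field: parallax is imperceptible at this distance, so matte
    # paintings suffice and spare the render budget.
    return "matte painting"

for d in (5, 18, 60):
    print(d, "ft:", asset_tier(d))
```

The point of the far-field cutoff is exactly the budgetary one Baker raises: building full 3D Unreal geometry out to the horizon buys no visible parallax but costs enormous render headroom.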
Grubin noted that, again, “it all comes down to planning. Photorealism is really important, but it’s a much heavier requirement than a make-believe world or outer space.”
Baker agreed, saying that “anyone who thinks they’ll walk onto a virtual production set without planning will be burning dollars on set.”
“No matter what technology you’re using, if you’re not organized, you’re not saving money,” added Ghani.
Wedding stated that “the beauty of virtual production is that you have a space where the departments can work collaboratively from pre-production onward. Your expensive days are on the LED wall, but the pre-production days are where you save money.”
The panelists also noted the difference between virtual production, which is real-time, and visual effects, which has its own, much more time-consuming workflow. Wedding said his company is working on “a more collaborative pipeline that takes less time and allows you to have eyes on assets all the way through the process.”
Grubin pointed out that content creators need to keep the target display/platform in mind which “can dictate the type of quality.” Another factor is if and how assets can be leveraged to build an ongoing revenue stream. “If it’s a throw-away, you can do things effectively and quickly,” she said.
Weaver described another way to create assets. “If you go to 3D procedural generation, one film created a five-kilometer-square forest, zoomed in, and found the place they liked the best — like scouting a location — and put their assets exactly where they wanted,” he said.
Wedding said that, although artists fear generative AI tools, “there are ways to train the models so they use your art and the tool does the busy-work tasks like rotoscoping.”
Baker noted that, for now, photoreal is a requirement — but that might change. “The audience is now coming up through Roblox and looking for something else, like Fortnite,” he said. “That’s what’s coming in the future — and that audience might not care if the tree looks photoreal.”
Ghani explained that “‘transmedia’ is the word we’re going to hear,” with assets that can be repurposed over and over again. Panelists agreed that, in the future, people will gravitate to Fortnite to see the latest creative content.
Baker noted that content creators need to think on a shot-by-shot basis whether they’ll be shooting in a volume on a big stage or using one of the many alternatives. “There is no point in shooting all the details and close-ups on an LED stage,” he said. “Tracked camera greenscreen has a place, for example. It totally depends on the shot. That’s where the savings come in.”
And to leverage those assets, Grubin stressed that “we should have a conversation about keeping track of your assets — chain of custody, how you protect it, how you monetize it. With all this IP, we have to learn how to protect it to have a consistent revenue stream into the future.”