Move over green screens, LED walls are here.
When the pandemic hit, productions scrambled to figure out logistics in the new normal. That’s when studios turned to an emerging aspect of virtual production: immersive digital environments that capture real-world settings (including lighting), rendered in real time using game engines such as Unreal. These environments are then projected onto LED walls made up of hundreds of high-def displays that serve as interactive backdrops for actors — like a green screen, but much more sophisticated.
It’s said that necessity is the mother of invention, but in truth, VFX shops have been dabbling in different aspects of virtual production for years — with techniques being notably applied on pics such as Steven Spielberg’s “A.I. Artificial Intelligence” in 2001.
Given the restrictions on travel and the social distancing regulations at many locations, however, the pandemic inadvertently accelerated the adoption of LED walls in particular, since only small crews, using cameras fitted with sensors that track changing perspectives, are needed to capture the plates for shots on location. These emerging tools became a necessity.
“COVID really accelerated the willingness to investigate this approach. It was cool and sexy before, but then it became practical and the only alternative,” says Demian Gordon, president and founder of the Motion Capture Society and a virtual production supervisor with credits that include the second and third Matrix films, “Polar Express” and the “Planet of the Apes” franchise.
“The technology is still in its infancy because it’s a relatively new approach; it’s very expensive,” Gordon says.
Productions are essentially taking signage that’s meant for displaying advertising or big TV images outdoors and repurposing it to be used on a film stage. “It’s not exactly meant to do what it’s being asked to do.”
But manufacturers quickly realized they could embrace this new market and are starting to deploy technologies to solve some of the existing issues, namely: cost, weight, how far apart the LEDs are spaced, how far they need to be from the wall, the time it takes to transmit the image to the LED wall, display resolution, LED brightness, how quickly they strobe, heat, and so on.
“People are just getting their sea legs with how to deal with this new toolset.”
Industrial Light & Magic made a lot of headway in the realm of virtual production tools on “The Mandalorian.”
“Our LED technology is really an outgrowth of work that we have been doing with LEDs for rear projection screens with pre-rendered content,” says ILM’s executive producer of virtual production, Chris Bannister. “We had done that on things like ‘Rogue One’ and ‘Solo,’ and then when ‘The Mandalorian’ came along, and Jon Favreau has done so much virtual production work on ‘Lion King’ and ‘The Jungle Book,’ it became kind of a perfect marriage of the technology work that ILM was doing and pushing it forward into the latest evolution. We took the latest in real-time rendering technology, camera tracking, and motion capture and put them together to make something that really hasn’t been achieved before.”
Studios are also having to rethink their entire approach to a shoot. Post-production work is no longer confined to the end of the shoot. According to Bannister, creative decisions are now made very early in the process, and outfits like ILM are involved all the way through.
“With the rise of virtual production, the production pipeline is being flipped around. VFX studios are getting consulted now as early as the script phase of a project,” agrees Stephen Kelloway, co-founder and CEO of Vancouver’s Animism Studios. “VFX is no longer just a component of post-production, it’s involved in every part of the filmmaking process, and we’ve been working closely with productions to plan how to shoot specific scenes and build up the CG assets required for principal photography.”
The technology is here to stay, says Bannister. “People are beginning to understand that it’s just more sustainable to be able to send a very small team out to capture environments and then bring them back digitally. And then you have a space where it can be controlled, your lighting can be controlled, you can move the crews really fast, you’re not pulling up everybody to move to a distant location, you can bring the location to you.”
Not only does the technology allow productions to rapidly move through many locations in a single day, it also allows them to go to sites that were completely inaccessible before — especially those impossible to lock down. Now these locales can be recreated visually on a stage, with just the push of a button.
“It’s that amazing blend between the practical and the digital that is so fun about this technology,” he says. “It’s not just a pure pixel box, it’s a practical set piece, and the LED screens and real lights.”
In many ways, as the cost continues to come down, this virtual production technology has the power to democratize film production. As more locations, and even props, are scanned into digital assets, databases of those assets will become available to license by productions of all sizes.
The technology isn’t reserved merely for film, either. Transmedia veteran Jeff Gomez dabbled early on in virtual performance on the video-game side and saw the potential for immense growth. That vision was further realized when he worked with James Cameron on “Avatar” and witnessed first-hand gigantic environments the likes of which he’d only seen in video games.
“The idea of creating a virtual world is rooted in video-game technology and in my mind it is most fully realized as a technique that would ultimately be affordable and doable in smaller settings,” he says.
Reading the tea leaves, in 2020 he co-founded virtual production outfit Iceberg Theory, which aimed to take the game-engine technology seen in shows such as “The Mandalorian” and deploy it on a smaller scale, faster and less expensively.
Since its launch, Iceberg has worked on several projects, including a virtual fashion show for a children’s boutique that not only attracted an audience but also generated strong sales, as well as a virtual fundraiser featuring Sheryl Crow and others. That event raised money for the cancer research organization Pelotonia, which was left in the lurch after COVID restrictions shut down its annual bicycle marathon. All of this was threaded together using the Unreal game engine, with special effects and atmospheric video processing to create mood.
Most recently, Gomez got to represent Ultraman — the Superman of Japan. The brand is being reintroduced to the international market more conventionally through an animated series on Netflix as well as a feature film version down the road.
Gomez and his team were tasked with creating a virtual event in which fans weren’t just viewers, they were participants who climbed aboard a starship and blasted into space to visit Ultraman’s world. They also watched streamed content directly from Japan and were introduced to stars of the show.
“Fans were treated to the equivalent of a floor show,” he says, and as they exited through the virtual gift shop, they bought thousands of pieces of merchandise.
The big players aren’t wasting time, setting up virtual production stages in production hubs around the world, including in London, Vancouver and Los Angeles, as well as states such as Georgia, Ohio and Texas.
“Then you’re going to see the traveling version of that, that can pop up as needed,” says Gordon. This may also mean an increase in tax incentive opportunities for productions.
“I’ve been involved in it since the early days of the tests on the ‘Mandalorian,’ ” Bannister says. “And there are still days I go to the stage and it’s just an amazing magic trick. I really think we’re only getting started. As new storytellers get involved, it’s just going to continue to grow the opportunities of the technology and there’s just more and more stories that can be told with it.”