By Oliver Peters
Thanks to the advances in video game software and LED display technology, virtual production has become an exciting new tool for filmmakers. Shows like The Mandalorian have thrust these techniques into the mainstream, and to meet the demand, companies around the world are creating virtual production soundstages, often referred to as “the volume.”
We recently spoke with Pixomondo and Trilith Studios about their moves into virtual production.
Pixomondo
Pixomondo is an Oscar- and Emmy-winning visual effects company with multiple VFX and virtual production stages in North America and Europe. Its virtual production credits include the series Star Trek: Strange New Worlds and the upcoming Netflix series Avatar: The Last Airbender.
The larger of the two virtual production stages at Pixomondo’s Toronto facilities is 300 feet by 90 feet and 24 feet tall. The LED screen system is 72 feet in diameter. Let’s see what Pixomondo’s head of virtual production, Josh Kerekes, has to say about this part of virtual filmmaking.
Why did Pixomondo decide to venture into virtual production?
We saw the potential of this new technology and launched a yearlong initiative to get our virtual production division off the ground. We’re really trying to embrace real-time technology, not just in the use case of virtual production in special studios but even in traditional visual effects.
Aside from big-budget films and TV series, what about smaller corporate and commercial projects?
We decided to build Stage 6 specifically geared toward mid- to large-tier commercial productions. It’s located in the heart of the film district of Toronto. The “volume” itself is just over 64 feet in diameter and about 22 feet tall. The ceiling is fully movable, and you can articulate any number of sections of the ceiling. That was motivated by traditional car process work, where people wanted the lighting to get really close to the car for reflections and those glancing angles.
What is Pixomondo’s approach to creating the virtual environments and sets?
We use Unreal Engine for our virtual sets. Typically, we request a minimum of three months’ lead time to build a fully unique, original environment for a large-scale episodic production. If it’s a commercial, it would be rare to have three months. However, there is a lot we can do in a short amount of time. Fortunately, we have an ever-growing library of our own virtual assets. There are a lot of existing libraries, as well. One of the beauties of real-time rendering technology is the freedom to completely change your environment, prototype ideas, and get a feel for things in a very short amount of time.
When it comes to finalized imagery in-camera, that takes additional time. To be successful, you need to be constantly iterating but also keeping in mind that the end goal is photorealism. Our approach is to be constantly rendering, or “baking in,” our lighting and to work with the highest-quality assets possible so that there’s no last-minute rush to optimize and increase the fidelity of the assets.
Please walk me through your methodology on a typical virtual production.
It’s not a standard VFX vendor-client relationship. We need to seamlessly integrate with the filming crew and collaborate with the production designer, art directors, the director and the director of photography. Our virtual art department will work with the production crew all the way through concept and prototyping the layout. We work with them to ensure that this layout translates to construction and set dressing so that they can build the corresponding practical elements.
We typically bring in the director of photography during the realization phase. The DP will work with our virtual lighters, our virtual gaffers — initially with broad strokes to get the feel and the mood. After that it becomes granular, with very specific virtual lights to get the intention across well in advance of ever coming to the set.
How interactive and responsive is the ability to tweak things once you get on-set?
To get the highest-quality lighting, you do need to render — “bake in” — the lighting and shading information. That baking process can take anywhere from one hour to 24 hours. You want to ensure that the DP’s vision is maintained. So we offer DPs the opportunity — preferably two days in advance of principal photography — to come in for a final review. We want this to be a seamless process. On a traditional film set, if the DP wants to change the lighting, that’s no problem. We want to ensure that we can accommodate that as well. But to do that, we just need that time.
Everyone’s goal in virtual production — especially that of Epic Games, the maker of Unreal Engine, the software we use — is a more real-time and dynamic creation process. But we aren’t quite there yet. The technology of in-camera visual effects through virtual production is still in its infancy. This is generation one of virtual production, and it’s going to take the industry as a whole to continue to progress and push it forward.
Is virtual production right in all cases?
I spend more of my day actually discouraging the use of virtual production when it’s not right for the show. The last thing we want to do is shoehorn the technology into the wrong production. We want to pick the right environments, the right shot, so that it can be successful for the show’s sake and for our sake. We’re not shy about telling a client that this might not be right for them. However, we’re also trying to educate clients so that they can begin to write for the technology and craft their story around it — to work within the box, so to speak.
Is there any rule of thumb regarding the cost and benefit of virtual production?
It’s all over the map. One thing that’s been very successful is the amortization of sets. An episodic television show, where you might be constantly repeating the use of standing sets, might be a very viable candidate for virtual production. Instead of occupying an entire soundstage just for one set or storing that large set in a warehouse and then rebuilding it for each shoot, you can just store the practical set dressing. For instance, the vast virtual set for the USS Enterprise engineering bay in Strange New Worlds is stored on a hard drive and can be recalled at any point. That’s a huge savings.
The return on investment might also come purely from logistics. Instead of bringing the whole company out to a location, keep the company on the lot, where they’ve already been filming. They walk next door and can be anywhere in the world.
I never tell people it’s a tool to save money. I don’t think that’s the right way to frame it. Virtual production is a powerful creative tool that lets you make decisions live on-set. That’s where the real movie magic happens. You’re getting all the key stakeholders working together, collaborating with each other to get the best end product. You’re not doing it months down the line in a blackout studio, which in my opinion is not the most conducive place for creativity. There’s so much more to it than just the big, flashy LED screens. The really powerful thing about virtual production is that it allows every member of a production to begin collaborating with each other in this virtual space well in advance of shooting.
Trilith Studios
Trilith Studios is one of the largest purpose-built movie studios in North America, with over 700 acres of stages, workshops and community space located in Atlanta. This summer Trilith will open its Prysm stage within an 18,000-square-foot soundstage. It features an enclosed 80-by-90-by-30-foot virtual production volume.
The Prysm stage is just launching this summer, but some of the other soundstages have hosted filming for large productions, including Spider-Man: No Way Home and WandaVision. We spoke to Trilith’s director of creative technologies, Barry Williams.
What prompted Trilith to build its first virtual production stage?
Virtual production is revolutionizing the way we make films, and it’s only getting more capable. The latest technology allows crews to change “locations” by simply swapping out a background, so final film assets can be changed instantly instead of after weeks of post-production. This technology provides a seamless creation process and ultimately a better experience for film crews and viewers worldwide.
Can virtual production benefit smaller projects, like commercials?
Part of the benefit of virtual production is how efficient it can be and how quickly you can change the environment, so it’s something that’s scalable for more projects than you might think. It isn’t just for big blockbusters.
How involved does Trilith get with productions working with virtual sets and environments?
Typically, a production has their own team of creators designing the look and feel, and then our team works directly with those people to bring that imagery to life. We can also help design from the very beginning. We also have a few VFX partners on-site, like The Third Floor and The Imaginarium Studios. It’s an ecosystem approach with the benefit of bringing together creators who don’t normally cross paths.
Virtual production comes in many forms, but the Prysm stage is designed for in-camera visual effects, meaning the set background is displayed in real time on LED screens behind you. It’s most helpful when you need to combine the physical and digital worlds, avoid travel to a distant location, or avoid unpredictable weather. When a production decides to use virtual production, our team will coordinate with their virtual art department during preproduction to finalize the artwork that will be on the screens. We will typically have a few prep days to test the images on-screen and work through any tweaks that need to be made.
How interactive and responsive is the ability to make adjustments once you get on-set?
You want to get things as close to camera-ready as possible with virtual productions. The goal is to shift resources into the preproduction process and build visual assets ahead of time. However, the engines that power these LED walls are very powerful and flexible, so you can make adjustments, like lighting or fog, in real time once you’re on-set and get inspired by a different look. Typically, the elements that a director might want to move on the day of a shoot are identified ahead of time, so those controls are open and ready to be manipulated by the operators running the real-time scenes.
Virtual production has gained a lot of awareness. More production facilities are building their own virtual production stages. Is virtual production right in all cases?
We believe the best solution is the one that works best for what you’re trying to accomplish. Virtual isn’t always the solution, but in cases where you have multiple shots within the same environment, it can be the most economical way to capture those shots. Some directors might not want to commit to some visualizations ahead of shooting; they might want to use a greenscreen and post-production VFX instead. It’s all about what creative process a filmmaker prefers, and you have to weigh the benefits for each project.
Oliver Peters is an award-winning editor/colorist working in commercials, corporate communications, television shows and films.