By Randi Altman
Virtual production workflows and technology have come a long way, even since our last virtual production special edition that ran in July of 2022.
More and more companies and productions continue to embrace virtual production, realizing that you don’t have to be making The Mandalorian to take advantage of this way of working. Thanks to more standardized workflows, planning, preproduction and affordable tools (yes, they exist), almost anyone can take advantage.
We reached out to those using virtual production and those making tools for virtual production. They talk trends, misconceptions, AI’s influence and more.
Enjoy…
Pier59 Studios Virtual Production Supervisor Jim Rider
Pier59 Studios has been a destination for photographers, filmmakers, designers and production companies worldwide for over 25 years.
How do you work in virtual production?
With the addition of our large virtual production stage in New York City, Pier59 Studios is now able to expand its services even further. The volume consists of a 65-foot by 18-foot main wall, a fully articulated ceiling as well as four wild walls. Our Disguise media server system can run multiple streams of high-resolution video to the volume screens, but when used in conjunction with Unreal Engine and our three Disguise rx II render nodes, the system is capable of running full real-time 3D content as well for a truly immersive shooting experience.
Are you using LED volumes or greenscreen or both? What are the benefits of each?
One of the main benefits of virtual production and in-camera VFX (ICVFX) is that the subjects are filmed in the volume with no need for greenscreen. That said, there are specialized cases where we can display green on the wall (or a small section of it) if necessary. For instance, we recently had to add one of our actors into a real-time Unreal Engine 3D environment for a shot. Given a short time frame, we shot her on a small section of green LED wall and then composited her into the Unreal Engine scene on the same day so that when we did the shot using that environment, our actor was already in the scene, with no need to create a digital double.
How are you maintaining the color pipeline and integrity of the image?
The entire LED volume is color-calibrated to a gamut very close to Rec. 2020. Incoming video is displayed using either a Rec. 709 or a Rec. 2020 color transform. Unreal Engine content is rendered in the ACEScg color space and then transformed in Disguise to Rec. 2020. A physical Macbeth color chart is regularly compared to a digital version of the chart in both Disguise and Unreal Engine to ensure color fidelity throughout the image-processing chain.
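For readers curious what one step of that chain looks like in practice, here is a minimal sketch of an ACEScg-to-Rec. 2020 gamut conversion using the open-source colour-science Python package. It is an illustration only, not Pier59's actual Disguise pipeline, and the library choice and patch values are assumptions.

```python
# Illustrative only: ACEScg (Unreal render space) to Rec. 2020 (LED wall gamut)
# using the open-source colour-science package. Values are linear RGB.
import numpy as np
import colour

ACESCG = colour.RGB_COLOURSPACES["ACEScg"]
REC2020 = colour.RGB_COLOURSPACES["ITU-R BT.2020"]

def acescg_to_rec2020(rgb: np.ndarray) -> np.ndarray:
    """Convert linear ACEScg RGB to linear Rec. 2020 RGB."""
    return colour.RGB_to_RGB(rgb, ACESCG, REC2020)

# Example: one patch of a digital Macbeth chart (hypothetical value).
patch = np.array([0.18, 0.12, 0.05])
print(acescg_to_rec2020(patch))
```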
What is the best way to light the volume? Full-spectrum natural lighting?
The LED volume is designed primarily to shoot against, with the LED panels displaying the background content in-camera. Our ceiling and wild walls can also display content off-camera, mainly for reflections that might be seen on the subject. These surfaces also provide additional ambient light for the subject, but currently LED panels are not capable of displaying the full spectrum of light necessary for accurate lighting (especially on human skin). So to augment the lighting provided by the LED panels, we also use full-spectrum (RGBWW) Quasar Science fixtures for image-based lighting (IBL). These fixtures allow us to pixel-map the LED wall content onto the fixtures to provide a full-spectrum, high-intensity light that matches the content seen on the LED wall. Natural, accurate lighting can also be provided by traditional stage lighting, but the huge benefit of IBL is that the lighting automatically changes when the wall content changes, greatly reducing the need for complete relights between scenes.
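The pixel-mapping Rider describes essentially boils down to sampling the wall content nearest each fixture and driving the fixture with that color. The sketch below shows the sampling half of that idea in plain Python; the fixture grid and frame size are illustrative, and real IBL systems add calibration, smoothing and color management on top.

```python
# Illustrative pixel-mapping: drive each overhead fixture with the average
# color of the patch of wall content nearest to it. Grid size and frame
# dimensions are assumptions, not Pier59's actual rig.
import numpy as np

def sample_fixture_colors(wall_frame: np.ndarray, grid=(2, 8)) -> np.ndarray:
    """wall_frame is (H, W, 3) linear RGB; returns (rows, cols, 3) fixture colors."""
    h, w, _ = wall_frame.shape
    rows, cols = grid
    colors = np.empty((rows, cols, 3))
    for r in range(rows):
        for c in range(cols):
            patch = wall_frame[r * h // rows:(r + 1) * h // rows,
                               c * w // cols:(c + 1) * w // cols]
            colors[r, c] = patch.reshape(-1, 3).mean(axis=0)
    return colors

frame = np.random.rand(1080, 3840, 3)          # stand-in for the current wall content
print(sample_fixture_colors(frame).shape)      # (2, 8, 3): one RGB value per fixture
```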
How are you integrating lens metadata?
When using a camera like the ARRI Alexa Mini LF and an ARRI UMC-4 unit, we can stream metadata from the camera directly into Unreal Engine via ARRI’s Live Link Metadata plugin in real-world units, eliminating the need for any kind of lens tables. Industry support for a wider range of lens metadata is coming, though, and lenses will soon be able to store, say, distortion profiles and stream that as metadata to systems like Disguise, greatly speeding up that spatial calibration process. This also has great application for post/VFX, as there will no longer be a need to shoot lens distortion charts.
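To see why streamed lens metadata can replace distortion charts, consider a standard Brown-Conrady radial distortion model: if the coefficients arrive per frame alongside the take, post can apply or invert the model directly. The field names and coefficient values below are hypothetical and are not ARRI's actual metadata schema.

```python
# Hypothetical per-frame lens metadata applied through a standard Brown-Conrady
# radial distortion model. Field names and coefficients are illustrative only.
import numpy as np

def distort(points: np.ndarray, k1: float, k2: float) -> np.ndarray:
    """Apply radial distortion to normalized image coordinates of shape (N, 2)."""
    r2 = np.sum(points ** 2, axis=1, keepdims=True)
    return points * (1.0 + k1 * r2 + k2 * r2 ** 2)

frame_meta = {"focal_mm": 35.0, "focus_m": 2.4, "k1": -0.08, "k2": 0.012}
grid = np.array([[0.5, 0.5], [-0.5, 0.5], [0.0, 0.0]])
print(distort(grid, frame_meta["k1"], frame_meta["k2"]))
```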
What are some best practices relating to virtual production?
The single biggest best practice is planning and preproduction. Because the goal is to capture the final image in-camera, “fix it in pre” is really the mantra we go back to all the time. A huge tool in this process is previsualization. Since the content is often real-time Unreal Engine content, previz is a perfect fit for the process.
As soon as work has begun on a 3D environment, previz can get started with making decisions about framing and camera moves. As the 3D environment gets developed toward final, the previz can be quickly updated as well. Discussions about pacing, design and lighting can happen early in the process. On the shoot, changes can always be made, but ideally, by that time most major decisions have been made. This can greatly speed up the shooting process.
What’s your take on the state of AI and how it might influence virtual production?
One thing we are seeing now is using AI as a tool in look development and mood boards. Many, many iterations of a concept can be created now. (Many will be garbage, but a few can really help spark some ideas.) Going forward, I see generative AI being used as a tool to help build out complete worlds. But rest assured, we’ll always need good artists.
What are the biggest challenges that VP faces in attempting to achieve fully convincing “real life”?
With advancements such as Lumen in Unreal Engine 5, and with talented artists, we are at the point where truly “photoreal” environments can be created. It is so much further along than even a few years ago. A remaining challenge is still that environments need to be highly optimized to be performant in real time on the volume. So there is always a trade-off between the ultimate best quality and performance. But this is nothing new. Even as hardware and software continue to evolve, the careful eyes of technical artists will still be needed to find that sweet spot.
Pixotope’s David Dowling
What tools do you offer for virtual production? Where does it fit into the workflow?
At Pixotope, we aim to simplify the overall processes of virtual production and allow more creators to create. To do that, we all must first have the same understanding of what virtual production is. In our book, it’s the tools, methods and services used to create immersive experiences that combine real-time virtual content with live video.
Our virtual production platform encompasses all the necessary components for real-time augmented reality (AR), extended reality (XR) and virtual sets, including camera and talent tracking. Pixotope seamlessly integrates with existing workflows and technology and simplifies implementation and adoption because it is specifically designed to combine with partner technologies and external data sources. Our solutions have been developed with the aim of reducing operational costs with reliable, easy-to-use solutions that require fewer people and less support to operate.
In addition to our software solutions, we have also launched the Pixotope Education Program, a community-oriented initiative for higher education institutions, in response to the current talent shortage facing the virtual production industry. This initiative provides students and educators with access to Pixotope software and tools to grow their skills and connects them with industry experts from the global Pixotope community. We work with the institutions to adapt the program to their needs in order to better support, train and inspire the next generation of virtual production talent.
Are your products used by those working on LED volumes and with greenscreen?
Yes, our solutions are used in a number of different settings, including use cases involving LED volumes. In fact, the Pixotope Graphics XR Edition was developed in response to the growing trend of using LED volumes in virtual production and the need for hardware optimization. It includes a range of tools to simplify setup and operation by reducing the technical complexities and associated resource costs of XR workflows and environments. Its purpose-built tools address common operator pain points while removing the need for proprietary hardware, effectively bringing users an off-the-shelf XR solution.
A prime example of this use is in the recent coverage of The International Dota 2 esport championship. Media company Myreze used Pixotope Graphics XR Edition to create a seamless blend between the LED wall and the AR set extension and bring immersive viewing experiences to millions of Dota 2 fans around the world. The Myreze team noted that the color-matching tool in XR Edition saved them a lot of time.
What would you say to folks who don’t think they are large enough or have enough money to afford to work in virtual production?
Virtual production has seen rapid technological advancements and growth in the past few years. We are strong believers that, as a result of these continued developments, virtual production will not just be for large studios and productions.
Ultimately, it will become a mainstay of media production. All video-based content will eventually have the visual impact of high-end feature films thanks to the increased accessibility of virtual production technology. The result is that all businesses will adopt virtual production tools and workflows to create their video content.
This is, in part, where we come in. Pixotope solutions are designed to make virtual production available and accessible to all media producers. We believe that everyone who wants to leverage this approach to creative storytelling should be able to do so. To achieve this, we are continuously working on streamlining some of the more complicated aspects of the technology and workflows — like camera tracking, the complexity of setup and associated resource costs — to make it even more widely available.
What are the biggest trends you are seeing right now for VP?
We are seeing a move within the virtual production industry for use within corporate and broadcast. The mixed reality ad that razor company Gillette broadcast live during a 2022 NFL game showcases the ways in which virtual production provides a more attention-grabbing and interactive experience in a time when many broadcasters are fighting for views. Beyond this, virtual production enables personalization or segmentation for ads at a deeper level with greater ease.
Furthermore, increasingly easy access to high-performance computing via cloud and GPU, with photorealistic rendering and processing from modern game engines, makes it possible to create new types of media experiences and novel production methods.
Another significant trend that is changing the landscape of media production is the shift from proprietary hardware to out-of-the-box software solutions. It’s not uncommon for media professionals to rely on expensive, specialized hardware to power their workflows. However, with the increasing power and accessibility of high-performance computing, software-based solutions are becoming more viable and efficient.
What are some common misconceptions in relation to VP?
That it’s only a viable option for major studios and organizations with major budgets and that it requires extensive reconfiguration. The days of bespoke setups powered by on-site engineering teams are (almost) behind us.
Today’s tools and workflows are designed to easily fit into existing setups and simplify use, making it a viable option for everyone from YouTubers to regional broadcasters to Hollywood productions.
What’s your take on the state of AI and how it might influence virtual production?
AI has obviously been getting a ton of attention with ChatGPT and the like taking center stage. In the context of our industry, the real value that AI can deliver is in automating the complex processes that are required to leverage virtual production. For example, our automated color-matching tool for XR workflows matches the colors of the environment projected by the LED wall to the set extensions and augmented reality objects in extended reality. It sounds simple, but it’s a key step in ensuring that your XR production is capable of suspending disbelief for the audience.
What used to be an arduous and time-consuming task is now achieved in just a few clicks. Even more impactful is the potential for AI-generated or AI-assisted creation of virtual sets, including LED backdrops and foreground AR graphics. These capabilities are coming sooner than you think as AI and machine learning continue to advance.
It will all serve to make virtual production even more accessible since the complexity of such processes will be further reduced.
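To make the color-matching step described above a little more concrete, here is a toy version of the underlying idea: sample the same chart as rendered by the graphics engine and as photographed off the LED wall, then fit a small correction matrix between the two. This illustrates the general technique, not Pixotope's implementation.

```python
# Toy color matching: fit a 3x3 matrix that maps chart colors photographed off
# the LED wall onto the "correct" colors from the graphics renderer.
import numpy as np

def fit_color_matrix(wall: np.ndarray, target: np.ndarray) -> np.ndarray:
    """wall and target are (N, 3) linear RGB samples of the same chart patches."""
    m, *_ = np.linalg.lstsq(wall, target, rcond=None)   # least-squares solve
    return m.T                                           # apply as wall @ m.T

wall = np.random.rand(24, 3)                   # 24 chart patches seen by the camera
target = wall * [1.05, 0.97, 1.10]             # hypothetical renderer-side values
M = fit_color_matrix(wall, target)
print(np.allclose(wall @ M.T, target))         # the fitted matrix recovers the match
```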
Are your products SMPTE ST 2110-compliant?
We haven’t yet seen significant demand for ST 2110 in our customer base, with the majority sticking with SDI, opting for NDI or looking at protocols such as SRT for “remote” workflows.
Pomp & Clout Director/EP Kevin Staake
Pomp & Clout is a creative production studio specializing in everything from commercials to music videos, documentaries, narrative shorts, AR filters, VR experiences and spooky AI experimentations.
How are you working in virtual production?
In terms of virtual production, we’ve worked with greenscreen and roto for as long as I can remember. More recently, we’ve begun using photogrammetry/volumetric methods, but the newest and most exciting area of VP for us has been working in motion capture and Unreal Engine.
We produced a music video using Unreal for the Joji song “777,” which the talented Saad Moosajee directed. After this, we pitched an Unreal Engine short film to Epic Games for MegaGrant funding, which, by some miracle, we secured. This is a project I wrote and directed called Noah’s Belt, which is in the final stages of post production and will soon be coming to a computer screen near you.
Are you using LED volumes or greenscreen or both? What are the benefits of each?
When it comes to LED versus greenscreen, we’re using greenscreen more, mainly due to the fact that it allows our clients to change things more readily in post. We’re also working with AI tools to automatically roto and isolate characters from footage, allowing us to save valuable time on-set without the need to set up a perfect greenscreen.
With that said, we're itching for an excuse to use an LED volume, but in many cases we've found it too expensive to justify. To be fair, the last time we bid out an LED shoot was a year or two ago, so I'm sure costs have become more reasonable… I hope. But as far as the benefits, in my limited experience with LED, I understand a major one to be the “natural” light-play onto the characters, rather than having the tug of war to evenly light a greenscreen while maintaining the right lighting on the characters.
I’m also convinced that actors yield a more convincing performance in a realistic LED environment/landscape, rather than in a world purely of chroma key green. But despite any benefits of LED, once you shoot, you’re married to that footage, like traditional filmmaking, whereas with greenscreen, there’s more flexibility to “fix it in post.” Pros and cons to both.
What are some best practices relating to virtual production?
In speaking to projects using performance capture, it’s important to remember you can put the camera anywhere (almost). We’re so used to these archaic standard framings in Wide, Med, CU, OTS, Reverse — all this stuff that’s easy to get boxed in with. One of the beauties of working in this medium is you can do away with all the lights, stands and scaffolding and instead spend an hour or two safely rigging a camera to a ceiling.
As long as you’re capturing correctly, you’re pretty much getting every angle/shot you could possibly need, so don’t forget to capitalize on that freedom and have the camera dolly right through a glass pane or push through an AC vent. You can even “bury” the camera six feet under and shoot it from below.
What’s your take on the state of AI and how it might influence virtual production?
I was hoping you might have the answer… I think VP sort of necessitates a human being as the subject, and with AI, there won’t be any need for a human to put on a silly leotard covered in pingpong balls. We’re already using Wonder Studio to automatically generate mocap data from normal footage, with no traditional mocap stage needed.
AI will be incredibly useful in environment and asset creation for VP, speeding up the process dramatically.
Ultimately, AI will lead to full-on photoreal text to video, but in the meantime, it’ll fill in more and more time-consuming gaps in virtual production, allowing us to spend more time being creative and telling stories.
Puget Systems’ Kelly Shipman
What tools do you offer for virtual production? Where does it fit into the workflow?
Puget Systems offers custom workstation computers for content creation, design, engineering, scientific computing and more. We offer both render nodes to drive the LED wall and artist workstations for creating the worlds and assets that will be displayed.
Are your products used by those working on LED volumes and with greenscreen?
We work very closely with studios that do both LED volumes and greenscreens, though LED volumes are becoming much more common.
What would you say to those who don’t think they are large enough or have enough money to afford to work in virtual production?
While large and expensive LED volumes grab all the big headlines, there are many ways to use VP techniques on a smaller scale. In addition to greenscreen, where only a single computer is needed, we recently worked with a studio that only used a 3-foot by 4-foot panel. The same techniques still apply. There are many ways virtual production can be used outside of a massive LED volume.
What are the biggest trends you are seeing right now for VP?
The biggest trend I’ve been seeing is remote control of the studio. Tools are available that allow the director or other crew members to remote into the studio and control various aspects. This is great for any stage of production, but especially before the shoot. A director can view the stage, adjust lighting, etc. from a different city or country.
What are some common misconceptions in relation to VP?
That everything needs to be a full-3D environment. Many projects actually use still plates or prerecorded 3D video mapped to the LED volume. On a recent movie, the crew took video of driving the streets of Las Vegas at night, and with that, they could film all day long without needing an expensive car rig. Virtual production opens up so many different ways to tackle a shot.
What’s your take on the state of AI and how it might influence virtual production?
AI has some interesting use cases for previz and virtual scouting. One of the biggest draws to virtual production is the ability to get into the environment quickly and get all departments on the same page early in the process. Current AI tools allow for quick ideation that can be handed off to the virtual art departments to fill out. As these AI models get refined and more powerful, we’ll begin seeing them integrated into existing tools.
Are your products SMPTE ST 2110-compliant?
We are currently working with several partners to finalize the networking solution for SMPTE ST 2110.
The Studio at B&H’s Michel Suissa
What do you do, and what tools do you offer for virtual production?
The Studio specializes in completely customized applications and installations. We offer a wide range of tools for virtual production, including LED tiles, processors, media servers, networking, production equipment (such as camera, lenses and lighting), camera tracking, signal flow, software control and infrastructure.
Where does it fit into the workflow?
Our software and hardware solutions fit into any and all portions of the workflow.
Are your products used by those working on LED volumes and with greenscreen?
Yes. The solutions we offer can be used in both environments.
What would you say to folks who don’t think they are large enough or have enough money to afford to work in virtual production?
I would say they are right… and wrong. For studios, the initial investments will remain substantial, but there are new solutions on the market that make VP more approachable. For content creators, it’s a matter of education. Not every project should use VP; however, it offers incredible creative solutions that are unique. When properly planned, the budget implications can actually make a project feasible.
What are the biggest trends you are seeing right now for VP?
Image-based lighting will become a key part of the VP toolset in the very near future. Remote production is already in place.
What are some common misconceptions in relation to VP?
The main misconceptions are based on a lack of understanding of the technology and its applications. Prep and planning are still required. In fact, an incredible amount of preparation is necessary simply because the possibilities are so endless. Another misconception is that existing skill sets do not apply. That’s not entirely the case either; while the learning curves do exist, there’s a lot of talent out there that simply needs to learn certain aspects of production to adapt. Positions are evolving and shifting. When and where contributors need to be in the production pipeline is radically changing.
What’s your take on the state of AI and how it might influence virtual production?
It already is influencing VP. It is hard to quantify just yet, but it will continue to grow exponentially.
Are your products SMPTE ST 2110-compliant?
Some of them are, but not all. The efforts of the SMPTE RIS group aim to bring more standardized practices to the industry.
ReadySet Studios Founder/VFX Supervisor/VP Producer Dennis Kleyn
ReadySet Studios is a full-service virtual production stage in the Netherlands, specializing in in-camera visual effects (ICVFX) for feature, episodic and commercial workflows.
How do you work in virtual production?
Our Amsterdam-based, 5,500-square-foot facility was conceived by the studio founders and our internal development team, and it builds on years of experience in virtual production and ICVFX development in the Netherlands, Europe and beyond.
As our name suggests, the concept behind ReadySet Studios is to provide filmmakers with “ready sets” that combine both physical and virtual sets and environments to create compelling storytelling experiences.
Are you using LED volumes or greenscreen or both? What are the benefits of each?
We predominantly use LED screens. The benefits of LED workflows are the real-time feedback and the collaborative efforts of the whole crew to create “final-pixel” results, but the content needs to be ready on the first day of shooting.
We sometimes combine chromakey techniques on the LED wall itself. This sounds like it defeats the purpose of VP, but it can do wonders for creating alternative backup takes, and it will provide the best greenscreen quality in terms of even distribution of light and saturation.
ReadySet Studios has the physical space to facilitate a greenscreen setup next to the LED screen in our studio if needed. Because our team has extensive filmmaking background, we are able and happy to help and advise all novice makers on making the best choice for their productions.
How are you maintaining the color pipeline and integrity of the image?
We have a very solid linear color pipeline that makes sure black levels are maintained and prevents the LED wall from displaying the infamous “grayish” look that we sometimes see coming from VP studios.
In collaboration with software company Assimilate, we are further developing the color grading and finishing functions in Scratch for efficient, real-time, 360-degree plate playback for a streamlined LED virtual production workflow.
What is the best way to light the volume? Full-spectrum natural lighting?
The light that the wall emits is a great aid in adding realism and integrating the talent and set in front of the screen. Using image-based lighting, the content on the wall also emits light at the correct brightness and color. But it’s not sufficient to act as a sole light source, mainly because it is limited in range. Its ability to light skin tones properly is poor, so traditional on-set lighting is always required in tandem with the content to work toward a successful final in-camera image. We expect we’ll soon have better light quality coming from LED sources.
How are you integrating lens metadata?
We use Stype RedSpy 3 real-time tracking in Unreal and Scratch and record all lens metadata with each take as part of our internal pipeline recording. This is a great backup that allows post facilities to use the on-set metadata for match-moving and other VFX-related workflows to, for instance, extend the LED wall in shots that are wider than the studio could initially cover.
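As an illustration of what such per-take recordings can contain, here is a hypothetical sidecar format with per-frame tracking and lens values. The field names and structure are illustrative, not ReadySet's actual pipeline format.

```python
# A hypothetical per-take sidecar with frame-accurate tracking and lens values,
# of the general kind a VP stage can hand to post for match-moving.
import json

take = {
    "take": "A012_C003",
    "fps": 25,
    "frames": [
        {
            "frame": 1001,
            "camera_position_m": [1.23, 1.70, 4.05],     # from the optical tracker
            "camera_rotation_deg": [0.0, -12.5, 0.0],
            "focal_mm": 32.0,
            "focus_m": 2.80,
            "iris_t": 2.8,
        },
        # ...one entry per frame
    ],
}

with open("A012_C003_lens_metadata.json", "w") as f:
    json.dump(take, f, indent=2)
```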
What are some best practices relating to virtual production?
The recipe for success is bringing together the foreground, the midground (often forgotten!), great content and on-set lighting — one of the key “connectors” between these elements. This is both creative and technical. But the most important aspect turns out to be communication. During preproduction, communication is essential between the creative leads and the other creative stakeholders — the Unreal artists, the studio and, eventually, the virtual art department. You need all the department heads working together on the set as much as they can to create the final images in-camera.
What’s your take on the state of AI and how it might influence virtual production?
AI is becoming a very influential new technique that lowers the threshold for clients who have traditionally had more limited means of visualizing their ideas. We see a lot of concept art created with Midjourney, for instance. To control creative concepts and style, a human eye and human execution are still required.
What we do use and see as a benefit is the development of new tools and techniques within software such as Unreal to help us build environments and worlds better, faster and with more realism. These tools eliminate some of the factors that are tedious or repetitive and enhance the quality by showing quick iterations to get to high-end results faster.
Assimilate’s Jeff Edson and Mazze Aderhold
What tools do you offer for virtual production? Where does it fit into the workflow?
Mazze Aderhold: Assimilate is the provider of high-end camera-to-post software for virtual production, live grading, video assist, dailies transcoding and finishing. Assimilate offers Live FX and Live FX Studio. Live FX can be used at many points in the workflow.
In LED wall-based workflows, it can drive the volume with 2D, 2.5D, 360 and 3D content. It supports multiple camera-tracking solutions, from budget-conscious through high-end, and via its integrated stage manager, it can calculate the correct parallax based on camera position and angle toward the volume. Particularly with 360 content and Unreal Engine scenes, Live FX Studio allows for an easy-to-set-up set extension in addition to the volume projection.
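That parallax calculation comes down to projecting virtual points through the tracked camera position onto the wall plane, so that distant objects barely move on the wall as the camera moves. The toy example below shows the geometry for a flat wall; it is not Live FX's actual math, and the positions are made up.

```python
# Toy parallax: intersect the ray from the tracked camera through a virtual
# point with a flat wall at z = 0. Positions are in meters and purely illustrative.
import numpy as np

def project_to_wall(camera_pos: np.ndarray, virtual_point: np.ndarray) -> np.ndarray:
    """Where on the wall plane (z = 0) a virtual point must be drawn for this camera."""
    direction = virtual_point - camera_pos
    t = -camera_pos[2] / direction[2]           # ray parameter at the wall plane
    return camera_pos + t * direction

cam_a = np.array([0.0, 1.7, 4.0])               # camera 4 m in front of the wall
cam_b = np.array([1.5, 1.7, 4.0])               # same camera dollied 1.5 m sideways
mountain = np.array([0.0, 10.0, -200.0])        # virtual object far "behind" the wall

print(project_to_wall(cam_a, mountain))         # wall position for camera A
print(project_to_wall(cam_b, mountain))         # barely shifts for camera B: parallax
```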
Another field of application is image-based lighting. Live FX can control any number of DMX universes via direct DMX, Art-Net and sACN. By color-sampling from the image content, Live FX controls stage lighting in real time, creating perfect reflections and shadows onstage.
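For a sense of how little plumbing sits between a sampled wall color and a DMX fixture, here is a bare-bones Art-Net sender in Python. The universe, channel layout and node IP are assumptions for illustration; a production tool like Live FX manages many universes, sACN, merging and timing on top of this.

```python
# Bare-bones Art-Net: pack a sampled wall color into an ArtDMX packet and send
# it to a (hypothetical) lighting node over UDP.
import socket
import struct

def artnet_dmx_packet(universe: int, dmx_data: bytes) -> bytes:
    """Build a single ArtDMX packet (opcode 0x5000, protocol version 14)."""
    packet = b"Art-Net\x00"
    packet += struct.pack("<H", 0x5000)          # OpDmx, little-endian
    packet += struct.pack(">H", 14)              # protocol version
    packet += bytes([0, 0])                      # sequence, physical (unused here)
    packet += struct.pack("<H", universe)        # 15-bit port-address
    packet += struct.pack(">H", len(dmx_data))   # data length, big-endian
    return packet + dmx_data

def send_fixture_color(rgb, node_ip="192.168.1.50", universe=0):
    """Map a 0-1 RGB triple onto DMX channels 1-3 and transmit it."""
    dmx = bytes(max(0, min(255, round(c * 255))) for c in rgb).ljust(512, b"\x00")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(artnet_dmx_packet(universe, dmx), (node_ip, 6454))

send_fixture_color((0.9, 0.45, 0.2))             # warm tone sampled from the wall content
```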
In greenscreen-based workflows, Live FX serves as the live keyer and overall compositor for live previz. In addition to these features, Live FX can record all metadata on-set and prep it for post.
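As a reminder of what the live keying step is doing underneath, here is a deliberately simple green-dominance key and composite. Real keyers add spill suppression, edge blending and garbage mattes, so treat this only as a sketch of the core operation.

```python
# A minimal green-dominance key and over composite; frames are float RGB in [0, 1].
import numpy as np

def green_matte(frame: np.ndarray, strength: float = 2.0) -> np.ndarray:
    """Returns alpha per pixel: 1 = keep foreground, 0 = show background."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    green_dominance = g - np.maximum(r, b)       # how much greener than anything else
    return np.clip(1.0 - strength * green_dominance, 0.0, 1.0)

def composite(fg: np.ndarray, bg: np.ndarray) -> np.ndarray:
    alpha = green_matte(fg)[..., None]
    return fg * alpha + bg * (1.0 - alpha)

fg = np.random.rand(720, 1280, 3)                # stand-in for the camera feed
bg = np.random.rand(720, 1280, 3)                # stand-in for the virtual set render
print(composite(fg, bg).shape)
```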
Are your products used by those working on LED volumes and with greenscreen?
Jeff Edson: Yes. Our users range from indie shoots to the highest-level multi-cam LED wall productions. Live FX is suitable for both VP scenarios and can take over multiple roles in the entire workflow.
What would you say to folks who don’t think they are large enough or have enough money to afford to work in virtual production?
Aderhold: Virtual production is a multi-faceted branch of our industry that does not necessarily have to be expensive. For instance, prerecorded 2D/360 footage is easier, faster and more cost-efficient than building an Unreal scene. It is also much easier to project onto a volume with far more cost-efficient hardware, as you only need one single workstation, and the GPU for that workstation doesn’t necessarily have to be the latest and greatest card around.
Also, greenscreen is not dead! Virtual production using greenscreen is still a viable, cost-efficient and, more importantly, proven way to shoot. Tracking solutions are also available for any budget by now, and Live FX supports all of them, from iPhone apps to the highest-end hardware.
What are the biggest trends you are seeing right now for VP?
Edson: Image-based lighting is of great importance these days. Being able to have stage lighting reflect the environment on and outside of the LED volume to create realistic reflections and shadows helps immediately to create the perfect illusion.
Also, more and more DPs are choosing prerecorded footage over CGI scenes as their content of choice for the LED volume. It is very realistic, easy to shoot and suitable for many virtual production workflows, like car scenes or vast landscape environments.
What are some common misconceptions in relation to VP?
Aderhold: That anything involving virtual production has to involve Unreal Engine. High-resolution, prerecorded 2D and 360 footage and AI-segmented 2.5D scenes are viable and cost-efficient options that in many cases work better than CGI content.
What’s your take on the state of AI and how it might influence virtual production?
Edson: AI is here to take away the mundane tasks in our industry. In virtual production, AI can be used for object- or depth-based segmentation and inpainting of 2D image content. This removes a lengthy and tedious rotoscoping task from the workflow and adds a lot of possibilities for workflows with prerecorded 2D footage — effectively turning it into 2.5D content. We are working with our partners to make such a workflow as easy and as streamlined as possible through Live FX.
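The "2D into 2.5D" idea can be illustrated with a depth map: split the plate into near, mid and far layers that can then be placed at different depths on the volume. In the sketch below, the depth values are random stand-ins for the output of an ML depth-estimation or segmentation model, and the layer cuts are arbitrary.

```python
# Toy "2D into 2.5D": split a plate into near/mid/far RGBA layers using a depth map.
import numpy as np

def split_by_depth(frame: np.ndarray, depth: np.ndarray, cuts=(0.33, 0.66)):
    """Return one RGBA layer per depth band (alpha = 1 inside the band)."""
    bounds = (0.0, *cuts, 1.0 + 1e-6)
    layers = []
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        mask = ((depth >= lo) & (depth < hi)).astype(frame.dtype)
        layers.append(np.dstack([frame, mask]))
    return layers

frame = np.random.rand(720, 1280, 3)             # the prerecorded plate
depth = np.random.rand(720, 1280)                # 0 = near, 1 = far (normalized)
print([layer.shape for layer in split_by_depth(frame, depth)])
```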
Are your products SMPTE ST 2110-compliant?
Aderhold: All Assimilate products, including Live FX, will support Blackmagic’s new SMPTE ST 2110 DeckLink cards right from release. Beyond that, Assimilate is also working on a native integration using the Nvidia Rivermax SDK across all its products.
Silverdraft’s Amy Gile
What tools do you offer for virtual production? Where does it fit into the workflow?
We design and build the complete rendering and computing infrastructure for the live stage. We integrate the computing stack to support putting the full virtual scene up on the LED and having it react to the camera and lighting with the lowest latency.
Are your products used by those working on LED volumes and with greenscreen?
Yes, anyone looking for the highest performance to support the most demanding, most photoreal environments for in-camera VFX.
What would you say to folks who don’t think they are large enough or have enough money to afford to work in virtual production?
It’s not even about the production. It’s about the shot. Any given shot could be a candidate for virtual production. Many shots can be produced cheaper with virtual production than with a physical location.
What are the biggest trends you are seeing right now for VP?
Digital pipelines being driven by SMPTE ST 2110 and virtualization for more computing density and efficiency.
What are some common misconceptions in relation to VP?
It’s not the solution to everything. You don’t have to do the whole production this way. Just figure out the best shots to take advantage of what it gives you. Vehicle process. Golden hour. Heavy VFX.
What’s your take on the state of AI and how it might influence virtual production?
For now, AI is mostly useful for concepting, prototyping and testing. AI generation makes it so easy to create storyboards and concept art with so many choices. But the final product still requires the human touch of the true artist.
Are your products SMPTE ST 2110-compliant?
Yes. Absolutely. We need the whole ecosystem to be compliant so that everything works better, faster and more reliably.