
Virtual Production Roundtable

By Randi Altman

Virtual production (VP) is here to stay. And while tools and workflows keep developing, best practices are being put in place and misconceptions are being debunked. For instance, you don’t necessarily need an LED volume for virtual production, and you don’t have to work for a big company to take advantage of some of the tools available for VP.

What are the benefits? Well, there are many. The limitations of time and location are gone, as are concerns about magic hour or wind or rain. And creative decisions can be made right on the stage, putting even more control in the hands of the filmmaker.

But VP workflows are still developing, with many companies just creating what works for them. Those finding success realize you need to wear a variety of hats to make it work. And prepro, tracking and education about the process have never been more important.

To find out more about what’s happening in virtual production, we reached out not only to some of the companies that make VP tools but also to some of those who put the tools to use. They told us what’s working and what’s needed.

Magnopus’ Ben Grossmann

Magnopus is a content-focused technology company bridging the physical and digital worlds. They have 175 engineers, designers and artists working in the US and UK and focus on AR, VR, virtual production and metaverse-related technologies.


Ben Grossmann

Do you feel VP workflows are fairly standardized these days, or is everyone trying to figure it out on their own?
Some parts are consistent, but a lot of people are trying different things to improve the workflows. They’re still pretty rough. Collaboration is still hard. A lot of different companies have their own recipe for success that probably suits their business strengths more than anyone else’s. Sometimes they’re confusing the market by inventing funky names for things that aren’t really unique at all or using jargon that people don’t understand consistently. In the end, this hurts everyone because if productions can’t understand virtual production, they aren’t comfortable. The more we make it complicated or unclear on purpose, the less people want to do it.

Can you talk about pain points related to VP workflows?
That question could turn this magazine into a book. The biggest challenge is the complexity of creating assets that are photoreal and performant. Unless you’re playing back footage that was shot, you’re probably having to create the environments from scratch, even if you captured them with photogrammetry. You can take some shortcuts with matte paintings, but creating content for virtual production can be a heavy lift, and producers aren’t used to planning for it. They’re often uncomfortable front-loading that much money in a budget.

An LED volume in the final stages of construction and calibration through a partnership between MBS, Fuse TG and Magnopus at Gold Coast Studios in New York for an upcoming production.

If you’re shooting in an LED stage, the budget for that remains a challenge. The costs for all these things make sense when you investigate them closely (and they are generally reasonable), but productions haven’t gotten comfortable or confident with them yet. They’re not cheap, and sometimes “what you need” and “what you’re paying for” can be unclear.

Aside from those two items, we could really use another year or two before the software gets more stable. It’s rarely “smooth sailing” right now (anyone who says it is probably spends most of their time saying, “You can’t do that”). But it absolutely works if you pay close attention to it and have a supportive director, cinematographer, and production designer.

What tools are paramount to virtual production workflows?
You need a game engine, and you need something to sync assets across all the people collaborating. Most commonly, that’s Unreal Engine and Perforce. You also need the filmmakers’ buy-in, and they need to be comfortable. Without those things, you’re going to have a bumpy ride.
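
For teams new to this, here’s a minimal sketch of that asset-sync step scripted against the Perforce command line. The depot paths are hypothetical, and real setups also configure a typemap so binary .uasset/.umap files get exclusive locks:

```python
import subprocess

def p4(*args: str) -> str:
    """Run a Perforce CLI command and return its output."""
    result = subprocess.run(["p4", *args], check=True,
                            capture_output=True, text=True)
    return result.stdout

# Pull the latest revisions of the shared Unreal project content.
print(p4("sync", "//depot/MyVPShow/Content/..."))

# .uasset/.umap files are binary and can't be merged, so check the
# map out before opening it in the editor.
print(p4("edit", "//depot/MyVPShow/Content/Maps/Volume_StageA.umap"))
```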

What would surprise people about virtual production workflows? 
The need to make assets before you shoot them and the time it takes. That seems so silly, but people have gotten so used to “digital” happening months after we shoot that when you say you need to start building months in advance of shooting, they don’t believe it.

Director/EP Jonathan Nolan has been collaborating with Ben Grossmann and the team at Magnopus, in association with Kilter Films, on the Fallout TV series.

Physical set construction has had a hundred years to mature. Crews are really fast and have the experience to be pretty accurate about time and cost. Virtual art departments haven’t. And you don’t want to show up to shoot on an expensive LED stage and go, “Nope, this looks like a video game.”

How can those who don’t have access to big stages work in VP? 
You definitely don’t need an LED stage for everything. If you don’t need the reflections and lighting integration on the live-action plates, then you can use virtual production on greenscreen and still see the composite for editorial with a real-time keyer. So you still get benefits. If you don’t need a stage because the content is mostly CG and not live action, then you can work in virtual production and go for final shots right out of the game engine for a lot of things.
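
As a toy illustration of what a keyer does (production real-time keyers are far more sophisticated, handling spill, soft edges and motion), here are a few lines of Python/NumPy:

```python
import numpy as np

def green_key(fg: np.ndarray, bg: np.ndarray,
              threshold: float = 0.15) -> np.ndarray:
    """Composite fg over bg wherever fg isn't 'green enough.'
    fg and bg are float RGB arrays of shape (H, W, 3) in [0, 1]."""
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # A pixel belongs to the screen when green clearly dominates.
    screen = (g - np.maximum(r, b)) > threshold
    alpha = np.where(screen, 0.0, 1.0)[..., None]
    return fg * alpha + bg * (1.0 - alpha)

# Toy frames: a solid-green plate with a gray square "actor."
fg = np.zeros((4, 4, 3))
fg[..., 1] = 1.0          # green screen
fg[1:3, 1:3] = 0.5        # the actor
bg = np.full((4, 4, 3), 0.1)
print(green_key(fg, bg)[..., 0])  # actor pixels survive the key
```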

I’d also say that you could replace previsualization with virtual production. People will argue with me here, but most filmmakers understand previsualization to be, “I give a script and some storyboards to a team of people, who make shots and edits from them that we can use as reference. We give them feedback, and they revise them.”

EPs Jonathan Nolan, Lisa Joy, Athena Wickham and Margot Lulick have been collaborating with the teams on developing this tech and production for the past two years. The volume is 74 feet wide by 91 feet deep and 22 feet tall.

Whereas virtual production is: “We build out our sets and maybe some animations, and then the director and cinematographer go in and shoot the scenes like they would in a live-action set.” There’s stronger creative ownership in the latter than the former. It doesn’t always work like this, of course, but I’m summarizing general perceptions here.

What is the biggest value of VP?
Creatively, everyone sees what movie they’re making at the same time. If all we can see is a part of a shot when we’re looking through the camera, then everyone will imagine something different, and it becomes a lot harder to make the movie. The biggest value of virtual production is that it brings much of the creativity from post up into production.

What is the biggest misconception?
It’s cheaper! Sometimes, yes. Someday, definitely. Today? You’ll have to work hard and smartly to make that true.

SGO’s Adrian Gonzalez

SGO’s Mistika combines a highly efficient workflow with image quality that enables the complete creation of any immersive content, including virtual sets — from initial optical flow stitching, color grading and VFX all the way to automated final deliverables.


Adrian Gonzalez

Do you feel VP workflows are fairly standardized these days, or is everyone trying to figure it out on their own?
Based on conversations with our clients who work in VP, there appear to be some ground rules that the vast majority are following at the moment. However, general industry standards are yet to be built. As a (smaller) post production software developer with extensive experience designing innovative post workflows, especially in the immersive area, we have the ability to adapt to market needs and quickly deliver specific features that facilitate emerging workflows, including virtual production.

Can you talk about pain points related to VP workflows?
For us, one of the biggest challenges related to VP now is the fact that there are no industry standards for display/projection formats and resolutions. Literally every virtual set out there has a custom configuration.

What tools are paramount to virtual production workflows?
From a post technology point of view, we believe that fast storage and GPU-optimized solutions that offer professional color management workflows and support several different industry-standard formats can truly be a lifesaver for content creators.

What would surprise people about virtual production workflows?
Perhaps the fact that at the moment every VP production is an R&D project in itself, as it almost always includes an innovative aspect and requires teams to adjust the (post)production workflow to the specific virtual set and not the other way around.

What about color accuracy in VP?
Color-aware workflows are a must in any professional production, including VP. When capturing in a virtual setup with LED screens, several different color spaces need to be handled. Also, the color accuracy and continuity can only be successfully achieved through industry-standard color pipelines, such as ACES.
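
As a concrete illustration, here is a minimal sketch of an ACES conversion using the OpenColorIO v2 Python bindings. The config path and the color space names (taken from the ACES 1.2 OCIO config) are assumptions that vary by facility:

```python
import PyOpenColorIO as OCIO

# Load a facility ACES config (many pipelines instead set the $OCIO
# environment variable and call OCIO.GetCurrentConfig()).
config = OCIO.Config.CreateFromFile("/shows/aces_1.2/config.ocio")

# Build a processor from the working space to the display transform
# that feeds the LED processor or a reference monitor.
processor = config.getProcessor("ACES - ACEScg", "Output - Rec.709")
cpu = processor.getDefaultCPUProcessor()

# Convert one 18% gray pixel; real pipelines process whole buffers.
print(cpu.applyRGB([0.18, 0.18, 0.18]))
```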

What is the biggest value of VP? 
Some of our clients who regularly work with VP would say that one of the most important aspects of producing content in a virtual set is that the crew and the actors get a real feeling of what will be seen later on screens. However, the other big value of VP is that nothing needs to be set in stone, as the technology provides the content creators with extreme levels of control, flexibility and creative freedom to change almost anything on the go.

What is the biggest misconception?
Perhaps one of the most common ones is seeing virtual production as a direct substitute for a more traditional way of producing media content. We think of it as a creative alternative. Relatedly, VP is sometimes thought to be reserved for Hollywood-budget feature films only. But if planned wisely, it can even optimize shooting and post time and consequently reduce the overall budget.

Finally, how can those who don’t have access to these big stages still use your product or still get involved in VP?
Our products are available on the SGO website, so anyone can download them and try them out completely free of charge.

Hazimation’s Hasraf ‘HaZ’ Dulull

Dulull is a London-based VFX supervisor, technologist, producer and director who runs Hazimation, which produces animated feature films, video games and content for the metaverse.

HaZ Dulull

Do you feel VP workflows are fairly standardized these days, or is everyone trying to figure it out on their own?
If we are referring to VP LED volume shoots, then I think people are still trying to figure things out (which is great), but there is standardization happening with the setups. For example, you need a good LED wall, good tracking (like Ncam) and a system to get your Unreal Engine scenes managed onto the LED (like Disguise). There are also workshops taking place in big LED volume stages to train the next generation of VP supervisors, DPs and filmmakers.

Can you talk about pain points related to VP workflows?
I think we still have to combat things like moiré on the LED screen when shooting too close to it or focusing on it. Another one is that setup does take a while, and when you reset, it takes a while to recalibrate, so that can be a pain when you are doing aggressive shoots on a tight schedule.

What tools are paramount to virtual production workflows?
A good tracking solution to sync the physical camera with the virtual camera. Also, Unreal Engine scenes should always be optimized and tested constantly to ensure we hit the required frame rates.

What would surprise people about virtual production workflows?
That it takes time to set it up right; it’s not just putting up the LED wall, plugging in Unreal Engine and your camera and — boom — off you go! There is a lot of testing required as part of the workflow, and each VP project shoot is different in terms of scope and what is required both on the set and in the virtual world (the LED wall content).

What about color accuracy in VP? Difficult to achieve?
You know, when I was moderating a panel about VP recently, we had the awesome Paul Debevec on, and when I asked him that question, the first thing he did was whip out his light meter and measure the luminance and RGB values coming from the screen. So to answer your question, I think it’s about having the DP and the colorist work closely with the VP supervisor on the shoot to ensure the best color accuracy possible when shooting.

What is the biggest value of VP?
You can shoot contained and not worry about weather or about racing against sunlight to make your shoot day.

What is the biggest misconception?
I have seen some people shoot VP just by having an actor stand in front of the screen and that’s it… that’s the shot. It pains me to think they could have done much more than that. They should be using the screen not just as a visual background but also as part of lighting the scene. And they should use as many real props as possible to help the integration and create real parallax.

The other misconception is that this is cheap. It’s not cheap at all, but if you are smart with how you use VP and spread it across your movie or TV show, then it can be efficient both production- and budget-wise. But for the love of god, please don’t just shoot VP for one shot that could have been achieved with rear projection or greenscreen.

Pixotope’s David Dowling

Pixotope is a virtual production solution for live media production. Offering both 3D real-time graphics and camera tracking, Pixotope helps content creators and broadcasters to produce XR, AR, virtual set and mixed reality content for television, online streaming and film.

David Dowling

Do you feel VP workflows are fairly standardized these days, or is everyone trying to figure it out on their own?
Yes and no. Yes in the sense that the technology is mature, integration between components works, and there are people and teams out there that know what they’re doing. We spend a lot of time making sure our user experience is intuitive and stable. It’s not a science experiment.

No because users of VP are pushing boundaries with the technology to find new ways of storytelling and engaging audiences. In places, this breaks from the norms of production workflows and requires a new approach.

Can you talk about pain points related to VP workflows?
Listening to customers and the market in general, the top three pain points are almost always the same.

The lack of available talent is an industry-wide issue but is especially acute within virtual production, where a mixture of traditional production skills and a knowledge of 3D graphics workflows is required.

WePlay AniMajor

We’ve had success working with on-set media producers to help them learn and leverage virtual production tools and workflows as a companion to what they already do. For example, lighting designers mirroring their work in the physical world with virtual lights and reflections. In this way, virtual production simply becomes media production.

For the next generation, the Pixotope Education Program (PEP) is supporting universities and other educational establishments by giving them access to VP tools, experts and industry contacts.

When it comes to camera tracking, we’ve seen a lot of virtual productions struggle with getting the right data at the right time to make sure virtual and physical worlds align correctly. Here, the remedy is to make sure the tracking technology meets the requirements of the production and then ensure seamless operation with the graphics engine.

League of Legends

On the third point, a lot of complexity comes from trying to use tools in applications they were never really built for. We’ve seen how powerful Unreal Engine is, and as the underlying engine, it dominates the VP scene. However, on its own, it can lack the integrations and glue to make it reliable and easy to use in a studio and/or live environment. By building those integrations and glue around it and simplifying the UX, we can significantly reduce the complexity and increase the reliability of VP workflows, meaning less time and resources are needed to run productions.

What tools are paramount to virtual production workflows?
An implementation of Pixotope graphics and tracking.

What would surprise people about virtual production workflows?
We’ve often said that as adoption increases, virtual production will simply become media production. But as I’ve touched on, this is already the case with VP becoming just another element of production — whether you’re a set or lighting designer, a camera operator, a producer, etc.

Baltimore Ravens

For a recent production, we worked closely with a broadcaster to implement AR objects in a set. Once the various disciplines were familiar with the tools and workflows, it was very natural for them to be working with the virtual elements as they did the physical. Many on set were surprised how this became almost second nature, though none of them was more delighted than the health and safety advisor with the massive reduction in staff climbing ladders to adjust props!

What about color accuracy in VP?
Poor color accuracy, or perhaps poor color matching, can be a big “giveaway” in virtual production, especially in mixed reality productions that combine XR and AR set extensions or props. While the computer-generated AR elements can be created to precise color profiles, camera sensors can change the colors captured…and LED volumes displaying virtual backgrounds even more so.

Setting up a key

To overcome this, we developed an automated tool that enables users to compare and adjust the colors of the AR elements so that the end result matches perfectly. It’s these kinds of setup tools that can make the difference between a complex, time-consuming process and one that simply “just works,” enabling media producers to focus on excellent creative work.
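
Pixotope’s tool itself is proprietary, but the general idea behind this kind of color matching can be sketched: sample known patches (say, from a color chart) both as rendered and as captured through the camera, then fit a correction that maps one onto the other. A minimal least-squares version in Python, with placeholder sample values:

```python
import numpy as np

# Linear RGB of the same chart patches as rendered by the engine
# (reference) and as seen through the camera (captured). These are
# illustrative numbers; real setups sample many more patches.
reference = np.array([[0.18, 0.18, 0.18],
                      [0.90, 0.10, 0.08],
                      [0.10, 0.70, 0.12],
                      [0.08, 0.12, 0.85]])
captured = np.array([[0.17, 0.19, 0.20],
                     [0.82, 0.12, 0.10],
                     [0.12, 0.66, 0.15],
                     [0.09, 0.14, 0.78]])

# Solve for the 3x3 matrix M minimizing ||reference @ M - captured||:
# the correction that makes AR elements match the camera's look.
M, *_ = np.linalg.lstsq(reference, captured, rcond=None)

def match_to_camera(rgb: np.ndarray) -> np.ndarray:
    """Apply the fitted correction to an AR element's linear RGB."""
    return np.clip(rgb @ M, 0.0, 1.0)

print(match_to_camera(np.array([0.5, 0.3, 0.2])))
```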

What is the biggest value of VP? 
Unconstrained by the physical, the most powerful aspect of VP is that anything is possible — the limit is in the creativity. It could be argued that this is even limiting the adoption of VP; some potential users simply don’t know what to do with it.

VP can also significantly reduce costs. With faster turnaround times between sets and scenes, some users have been quoted saying it represents a 40-50% savings in time (and therefore money) compared to equivalent soundstage or backlot shoots. That’s exciting, but where VP is used to best effect is in driving audience engagement and enhancing storytelling.

What is the biggest misconception?
Probably that VP is hard and/or expensive. Sure, there will always be the big productions with huge, high-end LED volumes with headline-grabbing price tags. But getting into VP doesn’t necessarily require a massive budget. Subscription-based software running on commodity hardware can make VP affordable and provide an easy route into using virtual sets or adding photorealistic AR set extensions and props into existing productions.

The Family’s Steve Dabal

NYC’s The Family is a new virtual production film studio in Brooklyn with 3D animation, Unreal Engine, Disguise xR, Nvidia Omniverse and an LED wall.

Steve Dabal

Do you feel VP workflows are fairly standardized these days, or is everyone trying to figure it out on their own?
Every week feels different. Workflow standards are falling into place, with institutions like SMPTE, VES and ASC leading the charge. Hollywood is making it happen. However, our NYC productions tend to be with independent filmmakers, so we deal with bespoke workflows.

For example, we’re doing a feature that requires the virtual art department to have photogrammetry of miniatures in Unreal Engine 5, which is a whole digital/physical workflow of its own. So depending on how the above-the-line team works, we build digital systems to support.

Can you talk about pain points related to VP workflows?
The most significant pain point for us is overhead. The processing power and hardware costs are not cheap. Don’t even get me started on what it was like getting Nvidia cards. Another pain point is that virtual production changes the cash flow to be front-loaded to preproduction, so clients have a hard time understanding that they need to release funds earlier. Fortunately, we already see hardware that could allow for lower-cost VP productions, so these problems might solve themselves.

What tools are paramount to virtual production workflows?
How nerdy should I get? On the surface level, we want production to feel as standard as possible when using these tools. Some individuals are comfortable with Google Docs, while others use Movie Magic. Figuring out the workflow in which a person or team is most comfortable is instrumental in curating virtual production tools.

Often, the backbone of a workflow is a pen and paper, so we are working with technology startups to scan documentation, digitize it and automatically get it in 3D space. Those are coming soon and will be exciting to share. We’ve found great success in making sliders and buttons on an iPad to control and manipulate Unreal Engine scenes. Same with DMX tools for lighting systems. I think our list of tools would be too long to fit on a web page.
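
The Family’s exact tooling isn’t public, but one common way to drive Unreal Engine from a tablet is the engine’s Remote Control API plugin, which exposes live objects over HTTP. A hedged sketch in Python; the object path and property are hypothetical, and the plugin must be enabled in the project:

```python
import requests

# Unreal's Remote Control web server listens on port 30010 by default.
UE_HOST = "http://127.0.0.1:30010"

def set_property(object_path: str, prop: str, value) -> None:
    """Write a property on a live Unreal object, as a slider would."""
    body = {
        "objectPath": object_path,
        "access": "WRITE_ACCESS",
        "propertyName": prop,
        "propertyValue": {prop: value},
    }
    resp = requests.put(f"{UE_HOST}/remote/object/property",
                        json=body, timeout=2)
    resp.raise_for_status()

# Hypothetical: drive a key light's intensity from a UI slider.
set_property("/Game/Stage/VolumeMap.VolumeMap:PersistentLevel."
             "KeyLight.LightComponent0",
             "Intensity", 7.5)
```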

What would surprise people about virtual production workflows?
My goal is to preserve the most natural way an artist works. So if a VP workflow is designed right, it feels more like being on-location than it does a VFX shoot. Digital tools should make the process more accessible and streamlined. I don’t want to be someone who glorifies this trend by saying, “Virtual production makes filming easier” because that’s not the case for any of our crew behind the scenes, but for the storytellers, it is pretty accurate.

What about color accuracy in VP?
This is a big one that I wish there were more standards and practices for. Coming from the VFX world, this is one of those questions that is very complex to answer. For the most part, we’ve been taking an approach of letting DPs and colorists make a show LUT that they apply to camera feeds and then a content LUT that they use for the LED content. Again, let them work how they naturally work. But the honest answer is testing — lots of lighting tests, color charts, camera tests and beyond.
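
For readers unfamiliar with the format, a show LUT is typically a 3D lookup table stored in a plain-text format such as .cube. Here’s a bare-bones Python sketch that loads one and applies it with nearest-neighbor lookup; production tools interpolate (usually tetrahedrally), and the filename is a placeholder:

```python
def load_cube(path: str):
    """Parse a .cube 3D LUT into (size, flat list of RGB triples)."""
    size, table = 0, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            if line.startswith("LUT_3D_SIZE"):
                size = int(line.split()[1])
            elif line[0].isdigit() or line[0] in "+-.":
                table.append(tuple(float(v) for v in line.split()))
    return size, table

def apply_lut(rgb, size, table):
    """Nearest-neighbor lookup; .cube data varies red fastest."""
    r, g, b = (min(max(c, 0.0), 1.0) for c in rgb)
    ri, gi, bi = (round(c * (size - 1)) for c in (r, g, b))
    return table[ri + gi * size + bi * size * size]

size, table = load_cube("show_lut.cube")  # placeholder filename
print(apply_lut((0.18, 0.18, 0.18), size, table))
```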

What is the biggest value of VP?
Accessibility. We can now produce a rough cut of a movie in Unreal Engine before starting production. You can block out scenes with lighting in preproduction without needing to rent equipment. It doesn’t matter where in the world you are.

We just filmed some scenes outside a Malibu beach house, but we stayed in New York. Granted, filming in a studio isn’t a new concept, but the entire prepro process was remote leading up to one day. So many of these things were never accessible before.

What is the biggest misconception?
Giant companies are putting massive IPs on this technology, but it is still early. It’s so early. It is a feat to get all this technology to look right and design a workflow that can function for multiple varieties of projects. When we do commercial work, we’ll have clients who see the first iteration of a CG environment and say it looks “too much like a video game.” And it does, because it is made in a video game engine. It’s not until you add the cinematic skills from filmmakers that the scene will work. But they don’t even take the time to think about how absurd real-time animation is. They’re spoiled already! We’re not rendering a still frame here. Recently someone told me that yesterday’s miracle is tomorrow’s expectation.

Ncam’s Nic Hatch

Nic Hatch is co-founder of Ncam, which offers virtual production technology and solutions for film, TV and broadcast. The company’s solutions include Ncam Reality, which enables virtual production through real-time camera tracking.

Nic Hatch

 Do you feel VP workflows are fairly standardized these days, or is everyone trying to figure it out on their own?
The techniques have been around for a long time, but not many productions were using them. Some of the big pieces existed, like LED walls, game engines and the type of in-camera VFX that Ncam Reality enables. But other pieces are either brand-new or nonexistent.

Virtual production is now toward the top of the early adoption phase and heading into early majority. It’s been used successfully on a number of productions but is still far from plug-and-play. The off-the-shelf tools aren’t all there, so studios are creating custom solutions to make it work. The early pandemic stage created a need for wider adoption of virtual production, but not every studio had the means, team or tools to put it into practice.

Now we’re seeing some big leaps in terms of ease of use, affordability and interoperability. Solutions are starting to appear from various hardware and software companies, and the space is rapidly evolving. It’s certainly an intriguing time.

Can you talk about pain points related to VP workflows?
Arguably the biggest struggle is the idea of changing your entire workflow. It used to be that filmmakers would come to set with just an idea. In virtual production, you have to come with most of your 3D content already finished. This means a lot of the creation process has to be done weeks or even months earlier than people are used to. It takes a lot of planning. But in the end, it gets everyone on the same page much faster.

Then there’s the shortage in skillsets and talent. How do we train and/or translate current skills? There have been some excellent initiatives from the likes of Epic with the Unreal Fellowship, and we opened two new training spaces in Europe and Latin America in 2021. However, the industry needs more opportunities, and this is a global issue.

How are your tools used in virtual production?
Real-time camera tracking is a key component to virtual production, whatever your definition of that term. For any in-camera visualization, whether replacing greenscreen with a previz model or finished ICVFX on an LED volume, accurate and robust real-time camera tracking and lens information — including distortion and focus, iris and zoom information — is a vital part of the workflow.
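
Ncam’s own data formats are proprietary, but conceptually, each frame of tracking data bundles the camera pose with the lens state, stamped with timecode so the engine can line it up with the matching video frame. A generic, hypothetical sketch of what such a frame carries:

```python
from dataclasses import dataclass

@dataclass
class TrackingFrame:
    """One frame of camera tracking data (generic illustration; real
    systems such as Ncam Reality define their own richer formats)."""
    timecode: str                         # aligns data to the video frame
    x: float; y: float; z: float          # camera position (meters)
    pan: float; tilt: float; roll: float  # orientation (degrees)
    focus: float                          # focus distance (meters)
    iris: float                           # f-stop
    zoom: float                           # focal length (mm)
    k1: float; k2: float                  # radial distortion coefficients

frame = TrackingFrame("01:02:03:04",
                      x=1.20, y=0.35, z=1.75,
                      pan=12.0, tilt=-3.5, roll=0.1,
                      focus=2.4, iris=2.8, zoom=35.0,
                      k1=-0.012, k2=0.001)

# The engine uses k1/k2 to distort its CG output so it lines up with
# the (optically distorted) live-action image from the lens.
print(frame)
```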

What would surprise people about virtual production workflows?
There are so many VP workflows and different ways to visualize content depending on your budget. In terms of the technology, affordability might be a surprise. You don’t necessarily need an LED wall or the best camera and lens package to create in-camera visualization.

When Unreal Engine announced it would be free to use for linear (film and TV) content, we recognized this would be a catalyst for VP. We wanted the world’s most advanced and flexible camera tracking technology to be affordable for everyone, so we worked hard to lower the barrier to entry. We spent a lot of time ensuring that the hardware was completely decoupled from the software in order to offer mounting and budget flexibility. We also decided that software licensing should be project-driven in order to support the industry and how people are used to working. This was a lot of work, but ultimately it changes the game.

What about color accuracy in VP?
Color accuracy is increasingly important. As ease of use dictates how scalable this technology stack becomes, easy and accurate color and color matching are critical, especially when mixing the real and synthetic worlds; humans are very good at spotting what is CGI, and color is a real giveaway.

What is the biggest value of VP?
When you can see everything in-camera, you can make better decisions. From a filmmaker’s perspective, you’re aligning everything through the camera again, which restores some of the creative freedom filmmakers had before visual effects came along. And if we design the tools correctly, we won’t have to give up anything. You’ll have ultimate flexibility — so you can get the shots you want on-set but also be able to tweak them easily in post.

What is the biggest misconception about VP?
That it’s scary and difficult. That it takes too much time and costs too much money. That it’s only for the trailblazers. That it’s just a short-term fad.

VP is a massive step change, but there is an inevitability about it. As the tech stacks become more affordable, more usable and more integrated, it’s really a matter of skillsets and training. Part of me wonders why it’s taken so long to get this far, yet I also recognize that humans, as a whole, are resistant to change. But this isn’t a zero-to-finish in 2022. This will be an evolution now that the revolution has kicked in.

Finally, and you touched on this a bit earlier, but how can those who don’t have access to these big stages still use your product or get involved in VP?
Ncam’s core technology is not only for use on big stages. In fact, it was designed with complete flexibility in mind. We wanted to create a camera tracking system that would work anywhere on any camera. This means that ICVFX can be shot outdoors, without the need for a greenscreen or LED volume. We’ve also harnessed the power of Unreal Engine and include our lite version of the plugin on the Unreal Marketplace for free.

Additionally, our new pricing tiers allow anyone to enjoy real-time VFX on a project-driven basis, ensuring you only pay for what you use. This lowers the barrier to entry significantly and enables more access to the creative freedom of VP via Ncam.

Puget Systems’ Kelly Shipman

Puget Systems designs and builds high-performance, custom workstations with a specialization in multiple categories of content creation, including post production, virtual production, rendering and 3D design and animation.

Kelly Shipman

Do you feel VP workflows are fairly standardized these days, or is everyone trying to figure it out on their own?
There isn’t one standard virtual production workflow. Instead, a few are based on asset creation or real-time playback, with slight variations for the specific hardware in use. For example, motion capture artists mostly have the same workflow, but with slight differences depending on which motion capture suit they are using and whether they want to edit the animations in a different animation package. Shooting on an LED volume is mostly standardized, with slight tweaks for the specific controller, camera and motion tracking system in use.

Can you talk about pain points related to VP workflows?
One of the biggest pain points is still the lack of standardized workflows with external hardware, as well as the lack of documentation on how to set everything up. Many people have put in lots of work figuring out how to get their own workflows executed, but there is not yet a plug-and-play-style product that someone new to the space can install and begin working with.

What tools are paramount to virtual production workflows?
The one tool that is fairly standardized is the use of Unreal Engine as the core of VP. A variety of other software, plugins or specialized hardware may be used depending on the specific task at hand, but Unreal Engine brings it all together.

What would surprise people about virtual production workflows?
Exactly how close the various teams will be working together. The old mantra of “fix it in post” truly doesn’t work with virtual production. When filming on an LED volume, there needs to be at least one Unreal Engine expert on-set to make any on-the-fly adjustments. If the director wants to change the position of the sun, for example, the lighting will need to be adjusted in the Unreal Engine scene, and they will need to work with the crew to make sure the stage lights match. When trying to blend the physical set with the digital set, both art departments will need to work hand in hand from the beginning, instead of one team doing its part and then handing it off to the next team to do theirs.

What about color accuracy in VP? Difficult to achieve?
The color accuracy of LED walls has been improving rapidly, as has the peak brightness. There is still room for improvement, but it is important to find out what the panels are capable of and do some test shots with the chosen camera. Most walls allow for calibrations, and Unreal Engine offers considerable calibration options for its final output.

What is the biggest value of VP?
The biggest value of VP is the immediacy it provides along the entire pipeline. Once environments are created in Unreal Engine, there is no waiting for a render to finish. The director can move the virtual camera through the scene to set up the shot using final-quality graphics. If, on the day of shooting, something needs to be changed, those changes can be made, and the result appears immediately on the LED walls, ready for filming. This allows for greater flexibility and experimentation as well as a unified vision, since everyone on-set is seeing the same thing.

What is the biggest misconception?
Probably the biggest misconception is that the real-time engine is capable of the exact same quality as a traditional offline renderer, just faster. The truth is that to achieve real-time speeds, many sacrifices have been made. The trick is to understand how to properly optimize a project in Unreal Engine to get a balance of graphical quality and the desired frame rate. Using the LOD (Level of Detail) system and mipmaps, reducing texture sizes for nonessential objects, optimizing lighting, etc. can go a long way to improving the performance of a project without sacrificing the final output, which in turn allows you to put more effects on the screen.
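
To put numbers on one of those optimizations, here’s a quick back-of-the-envelope texture-memory calculation in Python (assuming uncompressed RGBA8; block-compressed formats shrink these figures further):

```python
def texture_mb(width: int, height: int, bytes_per_pixel: int = 4,
               mips: bool = True) -> float:
    """GPU memory for one texture; a full mip chain adds about 1/3."""
    base = width * height * bytes_per_pixel
    return base * (4 / 3 if mips else 1) / (1024 ** 2)

# A 4K texture on a background prop nobody sees up close...
print(f"4096x4096: {texture_mb(4096, 4096):.1f} MB")  # ~85.3 MB
# ...versus the same prop dropped to 1K: a 16x saving per texture.
print(f"1024x1024: {texture_mb(1024, 1024):.1f} MB")  # ~5.3 MB
```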

Meptik’s Nick Rivero

Meptik is an Atlanta-based full-service virtual and extended reality production studio.

Nick Rivero

Do you feel VP workflows are fairly standardized these days, or is everyone trying to figure it out on their own?
It’s still a bit of the Wild West overall. Workflows and systems are still being figured out, while the hardware underpinning it is evolving at light-speed. I can say that over the past two years in particular, there is more standardization coming to fruition, all the way down to naming methodologies.

However, the market as a whole still has a lot of disparity and people operating toward what works best for themselves and their specific on-set or studio needs. So overall, I believe it will take some time before we see any sort of rigid standards by which we all operate.

Can you talk about pain points related to VP workflows?
Complexity is still front and center. The technology behind VP is still complex and requires qualified technicians with deep experience in nuanced technologies. Within the Meptik staff, we have people that specialize in system design, system operation, camera tracking systems and more, just to name a few. We’ve worked with the Disguise platform for a while now because it simplifies a lot of the larger technical challenges for us, allowing us to get volumes up and running faster.

What tools are paramount to virtual production workflows?
Well-rounded knowledge of both video technology and 3D content creation workflows is paramount to understanding all aspects of this paradigm. Beyond that, it really depends on what you have an interest in. If on-set operation, programming and system technology are more interesting to you, then focusing on video engineering, LED technology, networking, camera tracking systems and similar items is the path to take. Otherwise, if you are interested in a more creative path — toward the virtual art department side — then study content creation, visual effects, 3D software such as Maya or Cinema 4D…and definitely understand Unreal Engine.

What would surprise people about virtual production workflows?
If you understand the basics of video technology, you can pick up the rest pretty quickly. While there are very deeply technical pieces, the high levels come together fairly quickly. Also, while VP is aimed at cinematography, it requires knowledge of more broadcast and live event-type workflows and technologies, such as LED, computers and GPUs, and video system signal flow that is typically found outside of the film ecosystem.

What about color accuracy in VP?
Color accuracy is one of the largest hurdles in the space. Whether you’re on a film set or shooting with a corporate client, and whether it’s a TV commercial or a music video, precise color representation is key. The industry is making strides in this direction, but right now different manufacturers and software platforms handle color differently, so there is a ways to go toward standardizing it across the industry.

What is the biggest value of VP? 
Virtual production allows for total control of the shooting environment. Time of day, weather, object positioning, locations — all can be adjusted as per the director’s requirements, mostly with the press of a button.

Instead of waiting to see what your shots look like after a lengthy post process, the process shifts to preproduction — meaning you can previsualize scenes and know exactly what to shoot before you step foot on-set. And on the set itself, you can see 95% of the final product. This results in not only enormous time savings and decreased travel costs but also more creative freedom — you are no longer bound by physical barriers. VP gives you the freedom to shoot anywhere at any given time. The only limitation is your imagination.

What is the biggest misconception?
Filmmakers think they need to really understand the technology behind VP to make use of it, but that’s not true. As a full-service virtual and extended reality production studio, our team takes care of everything beyond the idea. We have creative and technical teams that help with ideation, creation and execution of ideas.

Now that we are part of the Disguise family, we have even more access to a global team of immersive production experts and the latest in tech to deliver groundbreaking studios and virtual environments that transport audiences into new worlds.

How can those who don’t have access to these big stages still get involved in VP?
We divide our offerings into three main pillars: bespoke, facilities and installs.

If you have access to a virtual production studio with a crew, then we can provide content or our technical expertise.

If you don’t have access to a virtual production studio, we have a turnkey, production-ready facility with all the staff you need at Arc Studios in Nashville with our partners at Gear Seven.

And if you are looking for a permanent installation of a virtual or XR production stage at your own facility for large amounts of content production, we install XR stages with the Disguise workflow.

Vū Studio’s Daniel Mallek

With locations in Tampa Bay, Nashville and Las Vegas, Vū is a growing network of virtual production studios providing work for commercials, corporate live streams and events as well as long-format film and episodics.

Daniel Mallek

Do you feel VP workflows are fairly standardized these days, or is everyone trying to figure it out on their own?
There is a lot of work happening to standardize virtual production workflows. Since education resources are limited, many have had to figure things out on their own and piece together tools from different industries that weren’t created specifically for virtual production.

Organizations such as SMPTE and Realtime Society are actively bringing together innovators from across the industry. One of the biggest challenges to standardization is the quick pace of innovation and how rapidly the technology evolves. The more filmmakers use this tool, the more we learn about what works, what doesn’t and what standards we need to implement to make each shoot a success. There’s work to be done, but the future is very exciting.

Can you talk about pain points related to VP workflows?
The biggest pain point is a lack of education. Essentially, virtual production (ICVFX) combines several industries together (film and TV, gaming, live events, etc.). To this day, there is no clear path for someone who wants to enter the virtual production space. Stages are being built faster than we can find people to operate them, which is why at Vū, education is one of our primary objectives. Our goal is to lower the barrier of entry to this creative technology.

What tools are paramount to virtual production workflows?
Virtual production, specifically ICVFX, brings together tools from multiple industries. This includes real-time camera tracking, real-time rendering in a game engine, high-end computing, LED processing and physical LED walls. Within each of these tools, there are multiple additional technologies at play. What this means is that for a stage to operate smoothly, each of those items needs to be well-optimized and in good working order. If not, it can cause issues in the workflow that can be difficult to troubleshoot when something goes wrong. At Vū, creating and operating these systems on behalf of our clients is core to our business.

What would surprise people about virtual production workflows?
It’s complex to build out a system, but once a stage is optimized and online, operation is relatively straightforward, and the stage can be run by one person, depending on the production’s needs. Our vision is that anyone should be able to easily operate a stage and manipulate an environment, including directors and DPs. We still have work to do to accomplish this vision but are getting closer every day.

What about color accuracy in VP? Difficult to achieve?
This is one of the primary concerns that DPs and artists have. In the early days of ICVFX, it was definitely a barrier to entry, but the technology has come very far since then. By combining high-end LEDs with premium image processing, it’s now the standard to offer highly accurate color that can be manipulated and finely tuned depending on a production’s needs.

What is the biggest value of VP?
The biggest benefit we hear about often is the level of creative control that the technology unlocks. The limitations of time and location are gone. In practice, this means a production can shoot in a remote location without having to organize cost-prohibitive logistics to bring a large crew there. It also means that long and expensive post pipelines are drastically reduced since most large-scale effects are captured in-camera. This all allows creatives to extend their budget in ways that aren’t possible with traditional tools.

What is the biggest misconception?
That you need to have “the right project” or lots of VP experience to bring a story into one of our stages. While there are certainly aspects that are different from shooting on location, the learning curve is much less than people realize.

For example, if you know how to light a scene on-location, that knowledge will transfer to virtual production. It’s also great for all types of projects, from feature films to talking heads. It’s up to the filmmakers and creators to find the best way to use this tool to accomplish their vision.

Main Image: HaZ’s Rift


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 25 years. 

