
Virtual Production Workflows: House of the Dragon and Coalescence

By Alyssa Heater

Virtual production allows content creators to realize their creative visions earlier in the production process, opening up a new world of creative possibilities. While the process still feels like the Wild West to many, thought leaders in the space are working to make this method of production more seamless and usable for productions of any size and budget.

We spoke with Phil Galler, co-founder of Lux Machina and CTO of NEP Virtual Studios, about his work on HBO’s hit series House of the Dragon as well as the challenges and rewards of collaborating on a large-budget series with multiple visual effects and post vendors.

We then sat down with AJ Wedding, co-founder of Orbital Virtual Studios, to discuss his collaboration with Avatar and Star Trek creature designer Neville Page on the sci-fi short Coalescence. Together with other creative minds in the virtual production arena, they produced the piece as an exploration tool to determine ways to create a better pipeline.

Phil Galler: House of the Dragon

How early in the preproduction stage did Lux Machina become involved in House of the Dragon?
We were in early talks at the end of 2020, then things heated up in the beginning of 2021. By January, we had all agreed that we were going to build a stage in London and started designing a solution that fit into the ecosystem of the studio. This involved weekly meetings between us, production, Pixomondo, The Third Floor and other vendors, then shooting began in Q3/Q4 of 2021.

How do those early conversations usually go? And who are typically the key decision-makers that bring a project to life from the beginning?
In the very beginning, conversations involve production design, a VFX supervisor, a virtual production supervisor, the director and the cinematographer. They determine how best to execute the creative and help lay out parameters. How many environments are we going to have? What does the schedule look like? How are we going to creatively tackle larger problems, such as the use of flames inside of an LED volume? And then they talk about health, safety and security issues. These are the things that need to be addressed upfront.

From there, it goes into a couple different paths: one that is very technical and one that is very creative. From the technical, you start to have the involvement of color scientists to prepare for and help resolve any issues around making the content look correct in-camera, on-screen and in post. And on the creative side, you have previz artists and assets being handed off. Then there is this channel that sits in the middle between these two worlds: those who help bring the creative to life through the lens of what technology is available. How do we manage all these assets? How do we handle security onstage? How do we deal with building out the software tools that are needed for a specific production? There are almost always a couple different prongs, and I split them into the technical and the creative and then what bridges the gap between them.

Tell us about the LED volume (or volumes) that were built for the House of the Dragon set.
The stage was built from the ground up at Warner Bros. with state-of-the-art cooling, a dedicated machine room, dedicated power plus backup power (so if there was a failure at any time, we could switch over to an outport power option) and a fire-suppression system — pretty much everything you’d want in a stage.

It was approximately 24 feet tall, 80 feet long and about 70 feet wide. It comprised thousands of ROE LED tiles — Black Pearl 2 and Carbon 5, with 2mm and 5mm pixel pitches, respectively.

In terms of equipment, it was driven by 50 or so computers, all managing different components — some dedicated to motion capture, some dedicated to rendering, some dedicated to compositing and everything in between. We had a machine room full of video routing and infrastructure, around 90 motion capture cameras, and movable plugs that could slide in and out to enclose the volume. It was all provided by VSS and rigged by DLM. We had pretty much all the bells and whistles.

Will you share a bit about the software and the game engine technology used on this project?
While Lux doesn’t exclusively work with any one technology (we like to say we are agnostic), most of the stages that we build are powered by the Unreal Engine because that is the tool that is the farthest along in providing solutions for filmmakers. At the stage, we have two of what we call “render clusters,” which are systems that render imagery to the LED walls, powered by Unreal. We have a control layer of software that we developed as well as some of our own camera-tracking equipment. It is a tightly run ecosystem of supplied parts and created solutions, all running in real time.

Were greenscreens used at all on this series? Or has that gone away and you are just using LED volumes?
These things are all tools in a tool belt, and I don’t think any one of them ever goes away. We might offset the use of one with a better tool, and that is the case here. We used bluescreen — both physical bluescreen and bluescreen on the LED wall — to allow for compositing later. This combination of techniques, where you fill certain patches of the LED wall with a color that could be keyed out later, works really well.
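
To make the keying idea concrete, here is a minimal sketch of how a solid-color patch displayed on the LED wall could be pulled as a matte in post. It assumes a simple Euclidean color-distance key written in Python with NumPy; production compositing tools use far more sophisticated keyers, and the key color and threshold values below are purely illustrative.

```python
import numpy as np

def simple_blue_key(frame, key_color=(0.1, 0.2, 0.9), threshold=0.35):
    """Return an alpha matte (1.0 = keep foreground, 0.0 = replace) for a float RGB frame.

    Pixels whose color sits within `threshold` (Euclidean distance in RGB)
    of the blue patch color are treated as background to be composited over.
    """
    distance = np.linalg.norm(frame - np.asarray(key_color), axis=-1)
    return np.clip(distance / threshold, 0.0, 1.0)

def composite(fg, bg, alpha):
    """Standard over operation: keep the foreground where alpha is high."""
    return fg * alpha[..., None] + bg * (1.0 - alpha[..., None])

# Tiny synthetic 2x2 "frame": two pixels match the blue patch, two do not.
frame = np.array([[[0.10, 0.20, 0.90], [0.80, 0.60, 0.50]],
                  [[0.70, 0.70, 0.70], [0.12, 0.22, 0.88]]])
replacement = np.zeros_like(frame)      # stands in for a rendered background plate
matte = simple_blue_key(frame)
result = composite(frame, replacement, matte)
```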

Even when we aren’t using physical bluescreen and greenscreen, that doesn’t mean we don’t need to do compositing. The LED wall is not a perfect solution; it’s another tool in the tool belt. Lux’s tagline is “technology empowering creativity.” I like to think that encompasses all these solutions.

It must be a completely different experience for a cinematographer, shooting in a volume versus on-location. Please tell us a bit about the collaboration process with the DPs on the series.
Totally. From a very high level, everyone was really excited to be involved in the process. That is the first step — to make sure that people are interested in actually doing this work. That goes not just for House of the Dragon but for any show. You want to make sure that everyone feels comfortable and understands the same lingo, which requires sitting down and talking through things. We have multiple conversations about the tools at our disposal, and we conduct demos to understand how the tools work in different use cases.

We’ll sit in the volume together and work on prelight setups, sometimes weeks in advance — and the DP is really driving it. Virtual production sits in this weird place between visual effects and physical production. It seems so visual effects-intensive — rendering, computers and content — but at the end of the day, it’s been captured in-camera. We want the DP to be happy with the way it looks, and from there we need to determine what tools they need. Often, the tools don’t yet exist, and we need to build them.

From a process point of view, those are the things we’re trying to answer in these high-level conversations. That’s what that collaboration looks like — it’s a back-and-forth conversation about how color and light work onstage and how the DP wants to work with those tools.

What is the workflow process like between Lux, Pixomondo, MPC and any of the other visual effects or post vendors, having multiple companies working on the same project? How do you achieve the desired result by collaborating with each other?
That’s a good question and indicative of how complex some of these workflows can be. MPC creates beautiful dragons, but the dragons cannot be created in real time, so all of MPC’s work was done in the post process, and we didn’t interface with them directly. Their work funneled into Pixomondo, who created Dragonstone Bridge, the throne room, etc. Working very closely with Josh Kerekes at Pixomondo, we determined ways to collaborate and share data, manage assets and work together on the stage.

That’s really the first order of business: How do you work together with a VFX vendor or another vendor to determine who does what? Once you get into the rhythm of things, it becomes streamlined. A Lux supervisor oversaw the virtual production, while visual effects supervisors from Pixomondo and the studio oversaw the VFX. They collaborated to come up with the right solutions for any given problem onstage.

Pixomondo was in charge of making changes to the individual assets. Then on the stage, we’re responsible for integrating those assets. If they need to change the material on an asset, they do it on their computer, then we take that change and blend it into the environment so it integrates properly with the lighting that the DP has requested. It’s an interconnected and sometimes complicated web, but once everyone understands their goals, there’s not really any conflict. I would view it as working as one cohesive unit.

Were there any sequences that were particularly challenging or rewarding to create?
The throne room was uniquely challenging to create inside the LED volume because it’s rare to have that much fire, and it truly created a sense of depth. It was cool to be involved in the execution of that. Dragonstone Bridge was also really cool because we unlocked so many possibilities that were impossible to achieve the first time around. The original Game of Thrones version of that environment came with real limits: having to travel to a location, being outdoors in the sunlight, being near the ocean in a remote spot. Those are the limits that I think made this series of scenes perfect for volume work.

You want to put a Spidercam up and fly it around the volume? You can’t do that outside over the ocean without significant complexity and cost. You want to sit at a certain time of day and tweak the clouds until you’re happy with the way it looks? You can’t do that in the real world. So it was the perfect combination of things. It was one of the most iconic locations from the original series, so making it look the way people expected it to look was a huge challenge, and that was very rewarding.

House of the Dragon has a huge budget — I read it’s close to $20 million per episode. How do you feel virtual production technology can be used for smaller budget or indie productions?
I wouldn’t let the price deter anyone. Virtual production should be looked at as a spectrum — much of the work is actually done at significantly lower budget levels, especially if we’re including plate or projection work. Most of the work that exists is actually plate work. Every show has a vehicle in it, and every vehicle should be considered a potential use case for driving-plate work or 2D playback of environments. Or maybe you’re shooting a comedy inside a house and need to show different environments outside the back of the house, so you can put up a projection screen.

There are many ways to use the technology from a more affordable standpoint. I always tell people not to let the big House of the Dragon numbers get in the way of the fact that [this show] might not be saving money using the technique. Other productions can if they are willing to do a couple of simple things. For example, let’s say you’ve got 50 shots in a car. Doing that on greenscreen is about 16% more expensive than doing it in LED. You actually save money going to LED or projection when you’re looking to do process work. People generally don’t realize that, and I think a lot of people get caught in this hype of “I have to do House of the Dragon, or I have to do The Mandalorian.”

The reality is that it’s better to start thinking on the other end of the spectrum, which is, “How do I use this technology in the smallest, most affordable way possible to help me?” I always tell people to start small because it offers a couple of benefits: one is the cost, but more importantly, it introduces your crew and your creatives to the technology. Start introducing them to these ideas early on in small ways. “Oh, we’re going to do a little bit of sim-cam on this shot so that we know what that environment out that window looks like.” That’s relatively affordable. “Let’s just put up a projection screen outside the back of this car so that we have something to look at in the rear mirror.” When you start small and begin to explore, I think people understand that the world is their oyster.

That makes sense, and I’m sure it saves quite a bit of money in post as well?
Yes, and time too. If it can be captured in-camera, we don’t necessarily have to go to post. We just finished a feature that did a couple thousand shots in-camera. The budget was around $75 million, which is a considerable budget, but doing 2,000 VFX shots in post alone would be almost a third of that budget. By capturing more in-camera, they were able to complete the project with far more creative possibility than they could have afforded any other way, and I think it’s through that lens that people should be looking at saving money.
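
The arithmetic behind that claim is easy to check against the figures quoted above. In the sketch below, the $75 million budget and 2,000-shot count come from the interview, while the average cost of a traditional post VFX shot is a hypothetical round number chosen only to show how such a shot count could approach a third of the budget.

```python
# Back-of-envelope check of the "almost a third of the budget" claim.
total_budget = 75_000_000            # feature budget quoted in the interview
shot_count = 2_000                   # shots captured in-camera on the volume
assumed_cost_per_vfx_shot = 12_500   # hypothetical average for a traditional post VFX shot

post_vfx_cost = shot_count * assumed_cost_per_vfx_shot
share_of_budget = post_vfx_cost / total_budget

print(f"Post VFX cost: ${post_vfx_cost:,}")        # $25,000,000
print(f"Share of budget: {share_of_budget:.0%}")   # 33%
```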

AJ Wedding: Coalescence

Let’s dive into Coalescence. How did Orbital become involved in this project?
Coalescence director Neville Page [who is also a concept designer/illustrator] is one of our very close filmmaking friends. In this industry, you gather people around you who are like-minded and want to make something cool happen. I got this opportunity to build up a virtual production studio and asked Neville, “What do you want to do?” So he came up with the idea, and then we started brainstorming with Emir Cerman from ROTU Entertainment, our virtual art department.

One of the challenges in taking on virtual production as a business is the pipeline. Big visual effects houses have thousands of artists, but most often they are not trained in Unreal. Often, they are building in other software and then transferring into Unreal, and it can sometimes take five to 10 times as long. That pipeline is just not working, and that’s one of the things that we are working to change.

We work with people like Emir from ROTU and Felix [Jorge] from Narwhal Studios, who have a ton of experience and know what has worked and what hasn’t worked in the past. As a larger group, we are trying to figure out a better, more collaborative way that doesn’t allow assets to sit in a black box for too long. That seems to be where things go awry.

So that was the start: Let’s figure out a way to create a better pipeline, and we can use Coalescence as a base. Neville had the great idea for it — it’s a sci-fi film that takes place during World War I.

He started creating some initial designs and would send them to Emir, who would then create a digital version. Then he would physically build a wall, and they would scan it and put it in. Neville would see it and then provide his feedback. Between physical builds, digital builds and scanning, it culminated in this whole production design sandwich. I don’t know that we’ve ever done anything this successful in terms of blurring the line along the wall. Most people who see it cannot tell me where the wall is, and I love that, because you shouldn’t know where it is.

Tell me about the volume used for Coalescence.
When I decided to get into virtual production, I did a lot of research and was fortunate to have friends who worked on The Mandalorian, which was the only show using this technology at the time. They expressed their concerns, a lot of which had to do with the technology that was present at the time. My thought was, instead of continuing to use what they used, let’s figure out a better plan and figure out how to solve those problems.

For instance, you had to have the camera placed 15 to 18 feet away from the wall, or you’d get moiré. What’s the cause of that? It had to do with the pixel pitch. We found panels with a tighter pixel pitch, so we could get the camera within 3 feet of the wall and have much more usable space in the volume. The tighter pixel pitch also allows us to track focus to the wall, which is something you just can’t do with a pixel pitch above 2.5 or 2.6 millimeters. Ours is a 1.5-millimeter pixel pitch. We also drive the panels at twice the refresh rate of other screens, 120 hertz, which allows us to do high-speed photography.
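
As a rough illustration of why the higher refresh rate matters for high-speed work, here is a simplified sketch. It assumes the common rule of thumb that a genlocked camera avoids banding when the wall’s refresh rate is an integer multiple of the capture frame rate; real volumes also depend on shutter angle, panel scan timing and processor settings, none of which are modeled here.

```python
def flicker_safe_frame_rates(wall_refresh_hz, candidate_fps):
    """Return the capture frame rates that divide evenly into the wall refresh rate.

    Simplified model: with the camera genlocked to the wall, banding is avoided
    when the refresh rate is an integer multiple of the capture frame rate.
    """
    return [fps for fps in candidate_fps if wall_refresh_hz % fps == 0]

common_rates = [24, 25, 30, 48, 60, 96, 120]
print(flicker_safe_frame_rates(60, common_rates))    # [30, 60]
print(flicker_safe_frame_rates(120, common_rates))   # [24, 30, 60, 120] -- includes 120 fps high-speed work
```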

Everybody who works at Orbital is a filmmaker, DP, director, etc., first and foremost. We didn’t approach this from a technology perspective; we came at it and asked, “How do we make this better for us? How do we get rid of all these negatives and make it a great tool?” Everybody had their own unique perspective. One of our collaborators, Mark Poletti, is an actor and a stunt coordinator, and he wanted to be able to move the camera really fast, which is hard to do when you have latency. We’ve figured that out, and now we have the lowest latency in the industry. The goal was to create this tool that filmmakers can walk into and just be filmmakers rather than being given a list of all the things they can’t do.

The volume itself is about 50 feet in diameter and 25 feet deep, so it’s a half circle, and it’s 16 feet tall. When you look at this production, it looks like a much bigger space. It really comes down to how much usable space there is in the volume. If I have to be 15 feet away from the wall at all times, then I can only use a space of about 15 feet in the center with the camera. Because I can use the entire volume, it looks twice the size. Ours is probably half the size of The Mandalorian volume, but we have the same amount of working space. Ultimately, it saves money on the art department because, for instance, I only need 5,000 pounds of sand instead of 10,000 pounds of sand to cover my stage.
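
A quick bit of geometry shows why the camera-to-wall standoff dominates how big a volume feels. The sketch below treats the floor as a half-disc, uses the approximate dimensions quoted above, and ignores the open flat side of the stage, so the numbers are only illustrative.

```python
import math

def usable_half_disc_area(wall_radius_ft, camera_standoff_ft):
    """Approximate usable floor area of a half-circle volume.

    Simplified model: the camera must stay at least `camera_standoff_ft` away
    from the curved LED wall, so the usable region shrinks to a smaller half-disc.
    The open, flat side of the stage is ignored.
    """
    usable_radius = max(wall_radius_ft - camera_standoff_ft, 0.0)
    return math.pi * usable_radius ** 2 / 2.0

radius = 25.0  # roughly a 50-foot-diameter volume
for standoff in (15.0, 3.0):
    area = usable_half_disc_area(radius, standoff)
    print(f"standoff {standoff} ft -> usable area ~{area:,.0f} sq ft")
# standoff 15.0 ft -> ~157 sq ft; standoff 3.0 ft -> ~760 sq ft
```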

Neville’s past work includes Avatar and Star Trek, which take place in completely imagined worlds. Coalescence has more historical accuracy, guns and a bunker. How does shooting a project like this differ from shooting a project that’s completely from the imagination?
If you’re creating something in science fiction, everybody knows it’s not real, so there’s less of an expectation. But when you’re talking about wood and mud, people know what those things look like. We were really intricate with that process, especially because the particular version of Unreal that was available at the time was about half as good as what’s available now… just six months later. To get the fidelity out of it, we had to do a lot of interesting tricks.

People who have done virtual production with higher pixel-pitch screens will tell you to keep it out of focus or put as much smoke in front of it as you can — and they’re not wrong, considering what they were working with. Now we’re at a place where we’re as good as the artist is, so we can put things up on the wall that will trick your eye. Now we can actually track focus into the scene, and because we have that ability, it has to be perfect. So there was a lot of work that went into that. The nice thing was that because we were doing another show off-site, we had plenty of time here to test it and really get it down and make it look perfect.

How do you prepare for sound when shooting in an LED volume? Do you have to do anything differently on the volume?
Sound can be a big challenge on-set. We had a great sound mixer who not only helped with the basic issues that come with a volume, but he would also put little speakers in different places and play ambient noise so the actors had extra sounds to motivate them — gunshots and things like that. With an LED volume, especially a curved wall, you’re always going to get early reflections. In many cases, you’re pulling in rolling panels to help diffuse sound. In this case, because our set was made of wood walls, it blocked most of the sound and kept those bad reflections down, so we didn’t really have a problem.

At Orbital, one of the solutions we offer is two different kinds of stages. If, for whatever reason, we are not able to stop reflections, we have another stage that is more modular. Right now, we have a 20-foot by 60-foot flat wall, and it helps with some of the problems with sound. Additionally, sometimes if you’re shooting a very high-contrast scene, the light being generated by one of the LED panels will light the other panels. You start to see where the panels meet each other because it’s not an actual curve; they are all flat panels. You start seeing those lines, and very often that’s a tough thing to flag the light away from without ruining your scene. In early testing, we decide whether it will work better on a flat wall or curved wall.

Tell me about the cinematography and lighting setups for Coalescence.
If I can get one piece of knowledge out to the world, it’s this: When they say you can light with the wall, ask any DP the lowest-CRI light they will allow on-set, and I guarantee they’re going to say something like 85, while the CRI of an RGB LED panel is 13. This means you’ll have many gaps in the color spectrum, and you’re not going to get great skin tones. People in post are not going to be happy with what shows up if you lit everything with the LED panels.

We explain it like this: You’re trying to tell this camera sensor that’s pointed at a light that it’s not a light; it’s a universe, it’s a set, it’s a stage, it’s a place. Don’t use it as a light, use it as that, and then bring in lights the way you normally would shoot any scene. How we light properly is something that in-house DP Leo Jaramillo has been integral in figuring out. How do we get the best that we can out of it? One of those aspects is calibration.

You have a camera sensor (they’re all different), and the sensor is looking for specific information from anything in the scene to tell it what color it is. The sensor measures light by wavelength, in nanometers; a given wavelength equates to a certain color. But when you’re pointed at an LED screen, there are so many gaps in the color spectrum that sometimes the sensor picks up the wrong color. If you are lighting with the wall, it may affect the color of a real-world item. If you have a big LED ceiling, there are times when people might be wearing yellow, but on the monitor it looks orange.

It’s important to calibrate the wall to the actual camera sensor because it’s not going to give you perfect color information. But you can make it so that it’s closer and so that the sensor on the camera and the LEDs have a better handshake than they do straight out of the box. It’s the same thing with the LED lights. We use HS Scope to calibrate all three together — that’s the best tool right now. Within the next decade, we’re going to get a higher CRI, which we’re already starting to see for the LED walls. It’s just that they’re incredibly expensive, and the refresh rates are still too low, but eventually it will be a perfect system.
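
At a very high level, one piece of that calibration can be thought of as fitting a color transform between what the wall is asked to display and what the camera sensor actually reports, then pre-correcting the content so the captured image lands closer to the intended color. The sketch below fits a simple 3x3 matrix by least squares from a handful of hypothetical test patches; it illustrates the general idea only and is not how HS Scope or any specific calibration product works.

```python
import numpy as np

def fit_sensor_response(displayed, captured):
    """Least-squares fit of a 3x3 matrix M so that displayed @ M ~ captured.

    displayed: N x 3 RGB patch values sent to the LED wall.
    captured:  N x 3 RGB values the camera sensor reported for those patches.
    """
    M, *_ = np.linalg.lstsq(displayed, captured, rcond=None)
    return M

def precorrect(content, sensor_matrix):
    """Pre-distort wall content so the captured image lands closer to the intended color."""
    correction = np.linalg.inv(sensor_matrix)
    return np.clip(content @ correction, 0.0, 1.0)

# Hypothetical measurements: the sensor reads the wall's narrow-band primaries
# slightly "wrong" (note the yellow patch drifting toward orange).
displayed = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0],
                      [1.0, 1.0, 0.0], [1.0, 1.0, 1.0], [0.5, 0.5, 0.5]])
captured = np.array([[0.95, 0.05, 0.02], [0.10, 0.90, 0.05], [0.02, 0.08, 0.92],
                     [1.00, 0.80, 0.05], [1.00, 0.95, 0.97], [0.51, 0.48, 0.50]])

M = fit_sensor_response(displayed, captured)
corrected_content = precorrect(displayed, M)  # what to actually send to the wall
```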

I think a lot of creators have this expectation that virtual production is too expensive to use for their production. How would this technology benefit a lower-budget indie project?
Anytime you’re doing something indie, you have to be extremely prepared. Every day is a battle, and if something slows you down, it could mean the end of your production because you don’t have a buffer of money. By using all the previz and preproduction tools that are available, you can greatly increase the number of shots that you’re capable of on any given day.

As far as the LED part of it, I think for a lot of indies, there is this idea that “We’ll just do a greenscreen and figure it out later.” I’ve been involved in many productions with that mentality, and in the end, they realize their mistake. It’s going to take a lot of time to recoup, and it’s still not going to look right.

The best thing about virtual production with LED walls is that you’re getting the final shot in-camera, and the DP can see it and light it accordingly, and it just matches better. You don’t have to go into a big volume; you can rent a smaller screen that’s specific to whatever you’re doing. All of our screens are modular, so we’ve done productions where we just send out a 10×20 screen or even just something to go outside of a window. There are a lot of ways in which lower-budget productions can use it.

Snowfall is not a super-expensive TV show, and we’ve been quoting some $5 million features. Snowfall has a few scenes where virtual production makes the most sense. The other thing they always talk about is how the independent feature gets the biggest bang for the buck when you make it in one room. Well, now your one room can be anywhere. If we’re not moving, and I can change the set for you 20 times, it can look like a much bigger movie.


Alyssa Heater is a writer working in the entertainment industry. When not writing, you can find her front row at heavy metal shows or remodeling her cabin in the San Gabriel Mountains.

