
David Stump and Jim Geduldick: Shooting Virtual Productions

By Alyssa Heater

From a cinematography perspective, shooting a virtual production requires in-depth knowledge and a passion to explore the evolving real-time technology. We spoke with two cinematographers equipped with vast experience in the virtual production arena — David Stump, ASC, and James Geduldick — to learn their insights into shooting on an LED volume, the cameras and technology, visual effects, the lighting set-ups, and the importance of coming in prepared.

David Stump

You have vast experience in both cinematography and visual effects. How do you balance being a DP and a VFX supervisor? And how did your background lead to working on virtual productions?
Anything new and technical — workflows and technologies to accomplish shots that people haven’t seen before — is interesting to me. I have been doing virtual cinematography since around 2003. It was so far ahead of its time that it has taken this long for the industry to catch up. In 2003, I shot a movie for Randal Kleiser called Red Riding Hood, which was shot all blue- and greenscreen. It was all tracked by a motion tracking system that I hand-built so we could locate the camera and match CG moves. Because there was no virtual engine at that time, we used the output of the tracking system to drive Maya. While we weren’t doing full-on compositing on-stage, we were visualizing comp shots and recording what we would call a slap-comp on-stage for reference for editorial and for VFX framing.

David Stump

It’s been a natural progression into virtual technology for me. Eliot Mack had a technology called Lightcraft a few years ago, and I used it multiple times for very effective shooting just before extended reality and LED wall shooting broke out. This is not a new technology for me, and it’s a natural place to turn my energy. Regarding my two parallel careers, one as a cinematographer and the other as a visual effects cinematographer/supervisor, it’s hard to disguise that I do both things, but I try to, because Hollywood dearly loves a pigeonhole. So as much as I can, I try to deemphasize my history. Your resume, I think, can be a list of your disqualifications.

From a cinematography perspective, what types of projects do you feel would benefit most from shooting in virtual production, whether from time, resources or cost savings?
Think about how we write all of these science-fiction movies about a time machine that gives us the ability to travel anywhere in time. Virtual production technology is a bit of a time machine, but it’s really more what I call a “place machine,” because you can be any place without leaving the studio, and you can be in a different place in three hours, completely around the world from where you were, all on the same stage.

Are there certain types of projects that benefit more from virtual production, or do you think any kind of project would do well with it?
I wouldn’t say any kind of project would do well with it because there are still a lot of things that just look better, are more efficient and more economical to do just by going somewhere and plopping a camera down and shooting it.

At the same time, budgets and schedules are getting tighter and expectations are getting higher. It is incumbent on cinematographers and VFX artists to learn and understand how to use this technology. It’s another tool in the box, and it can get you things that would be really difficult to get any other way. You really need to know the technique and the technology because it’s not one-size-fits-all. It’s not for every shot, and it’s not for every movie.

Planning is critical in virtual production because expensive shoot days are not the time for making mistakes. Tell us a little bit about the planning process for a virtual production.
You must have it planned to a T. All of the scenes, artwork and backgrounds have to be ready to go on the day because if you show up on a virtual stage and start putting stuff up on the LED wall and it’s not ready, you’ve got nothing to point the camera at. Prep is extremely important in virtual shooting and not just previz.

I just did a show in Rome last year where we did storyboards, previz and techviz, we then redid the techviz on a virtual system, and we tested it six ways from Sunday for almost a year before we ever took anything on stage. Even then, we ended up not using an LED wall and just using the technique of LED walls to do live-action visual effects. We just didn’t have the LED wall behind the objects. It’s hard to wrap your head around, but instead of putting up an LED wall, we came up with a notion that we’ve now nicknamed “similar screen.” It’s not bluescreen and it’s not greenscreen, it’s shooting against a background that is similar to what is eventually going to be in the composited shot.

Comandante

This is going to increase in usage as AI starts to emerge in visual effects. One of the things that people are beginning to point AI at right now is rotoscoping, or hand-cutting people or things out of backgrounds. Rotoscoping has traditionally been extremely labor-intensive and unrewarding, and it’s difficult to get people who want to do it for a living just because it’s such a boring thing to do. AI holds the promise of turning that into a push-button task: press a button, come back in 40 minutes and see it completed.
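As a rough illustration of the push-button matte extraction Stump describes, here is a minimal Python sketch that pulls a person matte from a single frame with an off-the-shelf segmentation model (torchvision’s DeepLabV3). It is only a sketch of the idea; production roto tools are far more sophisticated, and the file names are placeholders.

```python
# Minimal sketch of AI-assisted rotoscoping: extract a soft "person" matte
# from one frame with an off-the-shelf segmentation model.
import torch
from torchvision import transforms
from torchvision.models.segmentation import deeplabv3_resnet50
from PIL import Image

PERSON_CLASS = 15  # Pascal VOC class index for "person"

model = deeplabv3_resnet50(weights="DEFAULT").eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def person_matte(frame: Image.Image) -> Image.Image:
    """Return an 8-bit matte of everything the model labels 'person'."""
    x = preprocess(frame).unsqueeze(0)               # 1 x 3 x H x W
    with torch.no_grad():
        logits = model(x)["out"]                     # 1 x 21 x H x W
    probs = logits.softmax(dim=1)[0, PERSON_CLASS]   # H x W, in [0, 1]
    return Image.fromarray((probs * 255).byte().numpy())

# Placeholder file names:
person_matte(Image.open("plate_frame.png").convert("RGB")).save("matte.png")
```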

When prepping for a virtual production, which people from the team do you typically have in those early conversations?
You need the production designer and art director, as well as the virtual art department (VAD) with their own designers and art directors. Your virtual supervisor and director need to be involved. The cinematographer should be there because, frankly, nobody else on a production is as good at synthesizing composition and telling the story with camera movement and framing. In order to get all of those people participating in the same process, there is a very important factor that I call buy-in. Especially at the producer level and the director level, and to a degree at the cinematographer level, people need to buy into the process. If you start off trying to win over those who are skeptical of the process, you will be rolling the rock up the hill for the rest of the show.

What does the typical lighting setup look like for a virtual production, and how do you navigate shooting with the addition of the light coming from the LED screen?
The first thing that I usually tell everyone is that you don’t count on the screen to light your subjects. You count on the screen to provide reflections and a background, but the light from the screen is not good at lighting human faces. So if you start the exercise thinking that you’re going to use nothing but the screen to light your foreground subjects, it is not a good idea. I tell people the same thing about LED and virtual production that I tell them about bluescreen and greenscreen, which is to treat what’s happening in the foreground as one lighting setup and what’s happening on the screen as another lighting setup. They just happen to be in the same place.

Comandante

You have to flag your foreground lighting off of the screen as much as possible, or you just wash out all of the blacks in your image and they turn gray. Similarly, if you light up the entire semicircle of an LED wall and you’re only using 20 feet of it in the middle and you’re not creating reflections with it, then you’re just turning your actors beet red, and you’re fighting that the whole day long.

The walls are improving now. We’re getting RGBW, so we’re getting walls with a white component, but by and large, most of the LED walls in use create color from narrow-spectrum RGB primaries. If you break down what’s coming from the wall, you may believe you’re seeing yellow on the screen, but when you analyze it, you are not actually seeing the color yellow; you are seeing a combination of red and green that tricks your eye into thinking you’re seeing yellow.

And if you think you’re seeing cyan coming from the screen and you analyze the spectrum, what you’ll discover is you’re not actually seeing cyan-colored light. You are seeing a combination of green and blue that tricks your brain into thinking you’re seeing cyan.

Someone said to me that whatever you’re seeing on the stage doesn’t really matter. What matters is what you’re seeing on the camera.
That’s exactly right. It won’t look the same both ways. It is a trick of narrow-spectrum RGB primaries on RGB cameras. Everything on the LED wall — including cyan, magenta and yellow — is created from red, green and blue. You can see how red, green and blue are created, but it’s less obvious how cyan, magenta or yellow are. If you break down what you’re seeing on the LED wall, the color yellow for instance, you’re really seeing an odd combination of green and red. The camera sees those colors and your eye sees those colors, but when that light reflects off a human face or other objects, it doesn’t come out looking the way true yellow would.
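Stump’s point about narrow-spectrum primaries is easy to check numerically. The sketch below models a wall’s red, green and blue emitters as narrow Gaussian bands (the center wavelengths and bandwidths are assumptions, not measurements of any real wall) and shows that a “yellow” patch contains almost no energy at yellow wavelengths around 580nm.

```python
# Toy spectral model of an LED wall's narrow-band primaries. A "yellow" patch
# is just red + green: there is almost no emission near 580 nm, where true
# yellow light lives. Peak wavelengths and bandwidths are invented.
import numpy as np

wavelengths = np.arange(380, 781)  # visible range, nm

def narrow_band(center_nm, width_nm=15.0):
    return np.exp(-0.5 * ((wavelengths - center_nm) / width_nm) ** 2)

red, green, blue = narrow_band(630), narrow_band(530), narrow_band(465)

yellow_patch = red + green                    # the wall's "yellow"
at_580 = yellow_patch[wavelengths == 580][0]  # energy at true yellow
print(f"energy at 580 nm: {at_580:.4f} of peak {yellow_patch.max():.2f}")
# -> roughly 0.008: the wall emits essentially no yellow light at all.
```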

Tell us about the cameras and lenses you would recommend using when shooting for virtual production.
There is really no restriction in terms of cameras and lenses for shooting in front of an LED wall. A major thing to watch out for is trying to shoot an LED wall with anamorphic lenses. If you are trying to match physical objects in the foreground to Unreal-generated objects on the wall, and you’re doing big camera moves, it is difficult to map anamorphic lenses to the engine because they are optically complex and have two different entrance pupils: one in the horizontal domain and one in the vertical domain.

When you align the lens for zero parallax on one axis, you might get zero parallax when you pan, but then when you tilt, you’ll get parallax, or vice versa: align it for the vertical axis and you can tilt with no parallax, but then you pan and you get parallax. That’s because the anamorphic lens has two entrance pupils. On the show I did last year in Rome, we went to great lengths to map those lenses, to understand them and to do dynamic correction to the Unreal engine in real time so that we could pan and tilt correctly across Unreal Engine objects. And it is not trivial.
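To make the dual-entrance-pupil problem concrete, here is a deliberately simplified geometry sketch with invented numbers (not a production lens model): if the tracking rig is nodally aligned for the horizontal-axis pupil, pans stay parallax-free while tilts drift, just as Stump describes.

```python
# Simplified sketch: an anamorphic lens has different entrance-pupil depths
# for the horizontal and vertical axes, so a single nodal offset can only
# zero out parallax on one axis. All numbers are invented.
import numpy as np

PUPIL_PAN = 0.120   # m behind the rig pivot, horizontal-axis pupil (assumed)
PUPIL_TILT = 0.180  # m behind the rig pivot, vertical-axis pupil (assumed)

def pupil_drift_mm(pupil_m, rig_offset_m, rotation_deg):
    """How far the true entrance pupil translates when the rig rotates about
    a point that misses it by (pupil - rig_offset) metres."""
    return (pupil_m - rig_offset_m) * np.sin(np.radians(rotation_deg)) * 1000.0

rig_offset = PUPIL_PAN  # align the rig for the pan axis...
print(f"pan 20 deg:  {pupil_drift_mm(PUPIL_PAN, rig_offset, 20.0):.1f} mm")
print(f"tilt 20 deg: {pupil_drift_mm(PUPIL_TILT, rig_offset, 20.0):.1f} mm")
# -> 0.0 mm on pans, ~20.5 mm on tilts: the foreground slides against the CG.
```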

Comandante

Are there any techniques or technology you would recommend to a beginner in this space?
There are so many things to encounter in a virtual production. Understanding that you need to light the foreground subjects separately from the wall is very important. Understanding that you need to prepare everything that is going to go on the wall well in advance of going onto an LED wall stage is vital.

There is another factor that’s widely underappreciated, and that is when you do VFX work, you iterate. So you’ll shoot a greenscreen and you’ll see a test comp and you’ll comment and you’ll do another iteration, and the director will comment and you’ll do another iteration. You just sort of sneak up on the final effect by virtue of multiple iterations. When you are on stage shooting LED wall production, you get one iteration. The cinematographer then also becomes the lead compositor, and you have to play the highly critical professional game of “what’s wrong with this picture?”

How do you solve that issue?
By being aware of it and planning for it. In our business, and any business really, it’s the unknown that bites you: the things that you didn’t know about. What’s amazing now is that the internet, especially YouTube, is such an amazing resource. Just by looking things up online, someone who’s starting out in our business today can get a leg up.

After you have finished shooting a project and you’re moving into the post process, how do you collaborate with the colorist or VFX artist to achieve the desired final look?
Whether or not extended reality production streamlines the post process depends on how successful you were on-stage. If you have to go back and do something over, you haven’t really saved anything; in fact, in some cases you’ve cost yourself more. The craft of visual effects supervision sort of dictates that: the process of iterating versions of shots with the CG artists, the reviews and sessions where you draw circles and arrows and explain what to do differently, physically or pictorially, in a composite. That process of iterating is now tried and true, and it generally goes on as long as you can afford and as long as the schedule allows; it’s usually one of those two that cuts the process short. As da Vinci said, a work of art is never finished, only abandoned.

Comandante

Have you worked on any recent or upcoming virtual productions that we can keep an eye out for?
Yes. While it started off to be an LED wall production, it ended up being a virtual screen production called Comandante. It’s an Italian submarine movie, and we used all of the techniques of virtual production and evolved new techniques inside of them, including not shooting with an LED wall, but rather shooting with a background similar to what was eventually going to be the background in the finished composites.

We used a tool in Nuke for compositing called CopyCat, and we did both traditional rotoscoping and CopyCat-assisted rotoscoping, and we learned a lot about compositing without using bluescreen, greenscreen or an LED wall behind the actors. We also engineered an optical workflow that allowed us to shoot with anamorphic lenses on a camera driving the Unreal engine, then distort the output of the Unreal Engine with the same lens distortion and double-expose it over the live feed of the camera. This allowed us to visualize a slap comp for the director in real time or near real time on set. We called the technique near-real-time production (NRT).
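Here is a minimal sketch of the distort-and-overlay idea behind that slap comp, assuming an OpenCV-style lens model. The intrinsics and distortion coefficients are placeholders; the production’s real-time anamorphic mapping was far more involved.

```python
# Warp the engine's (undistorted) render with the measured lens distortion so
# it sits over the live camera frame, then mix the two for an on-set slap comp.
import cv2
import numpy as np

K = np.array([[2200.0, 0.0, 960.0],   # placeholder intrinsics, 1920x1080 feed
              [0.0, 2200.0, 540.0],
              [0.0, 0.0, 1.0]])
dist = np.array([-0.18, 0.06, 0.0, 0.0, 0.0])  # placeholder k1, k2, p1, p2, k3

def slap_comp(live_bgr, cg_bgr, cg_alpha, mix=1.0):
    """Distort the CG to match the lens, then blend it over the live frame."""
    h, w = live_bgr.shape[:2]
    # For each output pixel (in distorted, as-shot coordinates), find where it
    # lives in the ideal (undistorted) image, and sample the CG there.
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    pts = np.stack([xs.ravel(), ys.ravel()], axis=-1)[:, None, :]
    ideal = cv2.undistortPoints(pts, K, dist, P=K).reshape(h, w, 2)
    cg = cv2.remap(cg_bgr, ideal[..., 0], ideal[..., 1], cv2.INTER_LINEAR)
    a = cv2.remap(cg_alpha, ideal[..., 0], ideal[..., 1], cv2.INTER_LINEAR)
    a = (a.astype(np.float32) / 255.0 * mix)[..., None]
    return (cg * a + live_bgr * (1.0 - a)).astype(np.uint8)
```

In the actual workflow this warp would run per frame against the tracked camera; the sketch handles a single frame.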

Jim Geduldick 

Your career has spanned many different roles, including cinematographer, virtual production supervisor, VFX supervisor and beyond. How did your diverse work background lead to an interest in working in virtual production?
I come from camera, post and VFX, so I have a good worldview of entire pipelines. In digital and real time, we are able to see our images right away; we’re not waiting to develop footage, unless we’re shooting IMAX, 35mm or Super 16. Virtual production is not new, but what we’ve seen is that the technology has matured to where we can get real-time images from the camera and the computer up onto an LED volume or projection.

Jim Geduldick

You could say one of the pioneers of early virtual production is Alfred Hitchcock. If you look at The Birds, or any of his rear-projection projects, that was early virtual production. Virtual production is really a blanket term for creative and technology disciplines coming together, but it is still production at its core, using many of the same roles: director, DP, grips, gaffers and production designers. We are just now bringing these real-time tools into play.

Unreal Engine has matured enough that it allows us to use this technology all together. I’m at a point in my career where I’m finding new technology and discovering other applications outside of the entertainment industry, whether it be machine learning tools or robotics or computer vision. It all has a place within storytelling. These tools are just paintbrushes for creatives to use no matter what their role is in the content creation process.

How critical is it to bring the DP into prepro conversations?
I try to get the camera team in as early as possible, because a big thing in virtual production is having a shared language and good correspondence between teams. The camera, lighting and art teams are key because so much of the technology we rely on goes onto the camera. Things like FIZ boxes (focus, iris, zoom), tracking rovers and Sputniks (IMU/passive markers) for tracking have to go on the camera. There is a dance that happens between the virtual production team and the camera team, so having that language makes everything easier and more fluid. I don’t want to say there are restrictions when it comes to shooting in-camera visual effects, but so much is based on camera placement that the effect may break if a camera is placed in a certain way or moved too fast.
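For a sense of what rides along with the camera in that dance, here is a toy example of a per-frame payload combining FIZ values with a tracked pose, streamed to the engine over UDP. The JSON format and address are invented for illustration; real systems use established tracking protocols and vendor-specific streams.

```python
# Toy per-frame camera payload: tracked pose plus lens FIZ values, sent to
# the render-engine host. Format and address are made up for illustration.
import json
import socket

ENGINE_ADDR = ("10.0.0.42", 9998)  # assumed render-engine host

def frame_packet(frame, pose, fiz):
    return json.dumps({
        "frame": frame,                    # frame counter / timecode stand-in
        "position_m": pose["position"],    # tracked camera position (x, y, z)
        "rotation_deg": pose["rotation"],  # pan, tilt, roll
        "focus_m": fiz["focus"],           # from the FIZ encoders
        "iris_tstop": fiz["iris"],
        "zoom_mm": fiz["zoom"],
    }).encode("utf-8")

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
pose = {"position": [1.2, 1.5, -3.0], "rotation": [12.0, -4.0, 0.0]}
fiz = {"focus": 3.2, "iris": 2.8, "zoom": 35.0}
sock.sendto(frame_packet(1001, pose, fiz), ENGINE_ADDR)
```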

A big part of my role working with cinematographers and the other traditional production departments is to help develop that language and technology breakdown. Virtual production is very technical, and sometimes the crew has varying experience shooting this way, so it is important to have that language to ensure that communication is understood by all departments.

Pinocchio

The virtual production supervisor or VFX supervisor role typically means spending a great deal of time with the DP, so whatever familiarity they have with VFX or virtual production, from script breakdowns to the virtual art department to pre-light to on-set and even post, is key. It’s important to include the cinematographer and the other department heads in these early conversations because decisions now move earlier, upfront. They always say in virtual production that the buckets of budget money move up because the decision-making moves up too, which impacts the shoot days.

 

Planning is imperative in virtual production because you don’t want to experiment or make mistakes on a very expensive shoot day. Can you talk about that process?
Every little thing takes prep, prep and prep. You also need strong communication between the virtual production team, the cinematographer and their camera team, because the camera is the main capture source and what’s pointing at the LED wall. There is a great deal of technology that goes into making sure that what the camera is seeing is right. When it comes to things like color, calibration and camera tracking, these are all things that inherently require strong communication between the virtual production team and the camera teams so everything is properly synced. Communication is probably even more important when there is both virtual production and traditional VFX because the shot on the LED wall could turn into something that also has an implication on dailies, post, editorial and VFX. Just like anything, if you pre-plan it, you’re going to be prepared for some of the issues that may come up.

Jim Geduldick on Pinocchio

As far as experimentation, there are some misconceptions that I want to help dispel. The myth is that virtual production is a box, and you can’t go outside that box creatively. This is not true at all. Some directors and DPs who haven’t used it or haven’t worked on a VFX-heavy show might think virtual production is restrictive if they are being told they can’t do this or that. You can get that creative and experimental time on-set; it just needs to be factored into the early planning rounds: the previz, techviz and virtual art department rounds. If your assets for the LED volume are computer-generated rather than captured as 2D or 2.5D plates, the whole benefit is that your virtual props and environments are created in Unreal Engine. That pliability allows you to change the time of day or virtual props such as trees, the ground or buildings. If you plan ahead, you can have the time to make those changes very quickly.

From a cinematography perspective, what types of projects do you feel would benefit the most from shooting in virtual production, whether from time, resources or cost savings?
If it’s a feature, TV show or music video where you have scene changes and different environments, then virtual production is ideal. Being able to shoot multiple locations virtually at different times of day is a benefit. Weather is another, because we are always at the mercy of Mother Nature. If you are scheduled to shoot during the winter in the South, you often have to break for rain or thunderstorms. But if you are shooting on an LED volume instead, you could scan your exterior location and shoot without interruption.

It also helps with the uniformity of lighting, regardless of time of day. There are countless scenarios where virtual production helps tremendously between talent availability and location restrictions. Maybe there is a place that you can’t shoot, but you can recreate it digitally either from a stylistic or a photoreal aspect. Scanning using photogrammetry and LiDAR techniques is a common practice in both VFX and virtual production. We create a digital twin of a location and can shoot on that longer than we would at an actual location. We can’t stop tourists and crowds, so this is how we can control the environment. Virtual production hits those various tiers — budget and travel restrictions, weather and time of day, and talent and their availability.

The Muppets Mayhem

What cameras and lenses would you recommend using when shooting in virtual production?
I would say that there are preferred cameras and lenses for features and preferred ones for episodics: RED’s V-Raptor and Komodo-X, the Sony Venice 2, and the ARRI Alexa LF and Alexa 35. You want to make sure that your camera choice supports good dynamic range, genlock and timecode, multiple SDI ports and the other professional features that cinematographers are going to use.

On the lens side, the choice is between spherical and anamorphic lenses, and between analog and digital “smart” lenses, which differ in how they communicate lens data and characteristics to Unreal Engine or other engines. The benefit of analog lenses is that there is a wider selection to choose from, and most cinematographers’ preferred lenses are analog. With digital-capable lenses, you have direct access to the lens data, which is communicated to the camera via the lens mount, with metadata injected into the RAW stream recorded to the card. Manufacturers like Zeiss, Cooke, Panavision, Angénieux and Fujinon all offer digital lens choices. When shooting analog lenses, you need an intermediary solution like a FIZ box (focus, iris, zoom) to translate the lens data to your real-time engine of choice. Certain camera tracking solutions also integrate lens data through hardware and software. Lens calibration is still a time-consuming process.
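One concrete piece of that calibration work is mapping raw encoder counts from the FIZ unit to physical values. Below is a minimal sketch, with invented encoder numbers, of the kind of lookup table a focus calibration pass might produce and how it gets interpolated at runtime.

```python
# Focus-encoder calibration sketch: bench-measured (encoder count, distance)
# pairs, interpolated live to feed the engine. All values are invented.
import numpy as np

FOCUS_LUT = np.array([
    [0,     0.45],   # close-focus end of travel
    [12000, 0.75],
    [26000, 1.50],
    [41000, 3.00],
    [55000, 8.00],
    [65535, 1e4],    # far end of travel, effectively infinity
])

def focus_distance(encoder_count: int) -> float:
    """Interpolate the calibrated focus distance (m) for a raw encoder value."""
    return float(np.interp(encoder_count, FOCUS_LUT[:, 0], FOCUS_LUT[:, 1]))

print(f"{focus_distance(30000):.2f} m")  # ~1.90 m, between the 1.5 and 3 m marks
```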

FIZ box

Many DPs have their favorite lenses for a look they are trying to target, so what we have to do on the backend is calibrate the cameras, lenses and camera tracking all together so the digital matches up to the physical. Anamorphic lenses are trickier than spherical lenses because they have different characteristics. A lot of people love the anamorphic look and the way light reacts through those lenses for a given look and feel.

We’ll typically have a conversation very early on in preproduction where the DP and director tell me what lenses and cameras they are interested in using, and I tell them the implications of choosing these products and what they need to look out for. If you have three camera bodies and a 12-lens set per camera body, that’s quite a lot to calibrate ahead of time. It’s important to fit calibrating the lenses, camera and camera tracking into the budget and prep time.

What does the typical lighting setup look like for a virtual production, and how do you navigate shooting with the addition of the light coming off the LED volume?
That is always tricky because this comes into play during what we call the “pre-light” or “blending days.” The DP and their gaffer would start by looking at the technical layout of an early design of the LED wall. By knowing what shots are planned for the volume days, they can build a lighting schematic to decide what practical lights should be used alongside the LED tiles. They may want to have a ceiling for various reasons in their lighting approach; it could be a prop, an actor’s wardrobe or a vehicle that has reflective properties. Using either a soft grid or LED tiles as a ceiling might be an option to achieve the full light wrap of reflections that would normally happen in the real world. That’s the nature of how light works — it wraps.

DP Craig Kief on The Muppets Mayhem

You are going to get different lighting output and color rendition from cinema lights than from LED tiles. You need to consider how lighting will react on human skin, a prop or a set, or reflect off a silver helmet or a car, so there are all these decisions in terms of what type of light is going to be used. That’s why it’s key to work with the DP, gaffer and grip team on where lights will be placed or need to be rigged. There are many considerations for lighting, including what you can and can’t use to light the LED wall. A mix of practical lights and blending is key because they offer different technologies and lighting outputs. When the actors and the set design are on-stage, you’ll need to blend everything so your foreground elements don’t look like they’re popping off the wall. So that’s a mixture of camera, color and lighting that you have to balance in those blending and pre-light days.

After you’ve finished shooting a project and you’re moving on to post, how do you collaborate with the colorist or VFX team to achieve the desired final look?
If you’re shooting in-camera VFX, the hope is that everything is captured in-camera. Of course, we know that images are always touched up in some way, either in dailies, post, visual effects, editorial or final color grade. There may be several people that touch the images after they’re captured on the stage. You need to have those conversations, especially if you’re working on a show that has multiple vendors. It’s a matter of getting all the department heads together, looking at the virtual production deliverables and how virtual production may impact any of these other aspects.

The same thing goes for any of the visual effects vendors, because they all have different pipelines. It’s important to understand what the virtual production team is delivering to these different departments. It may even go back to the studio side; there might be a master document provided to all the vendors covering aspect ratio, deliverables, delivery instructions, LUTs and review tools. If we were delivering to other vendors, I would discuss what the production pipeline looks like for post, VFX and virtual production and work out what deliverables we have to give to the main VFX house. It’s often a rolling delivery of content: virtual art department iterations or works in progress handed over to a visual effects vendor, because there might be shots that are composited and need touch-up by the visual effects team. It is a lot to do, and I think that’s why some people feel it is daunting. But again, if we have these conversations as early as possible and come in prepared, we can either avoid issues or come up with a game plan, so we have options to pivot or be reactive without a huge hit to budget, personnel or timeline.
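As a hypothetical illustration of the kind of master document Geduldick mentions, here is a small machine-readable delivery spec. Every show name, value and path below is invented.

```python
# Hypothetical "master document" shared with all vendors so color, format and
# review expectations line up. All names and values are invented.
import json

MASTER_SPEC = {
    "show": "example_show",
    "aspect_ratio": "2.39:1",
    "working_resolution": [4448, 1856],
    "color": {
        "camera_log": "ARRI LogC4",
        "working_space": "ACEScct",
        "show_lut": "luts/example_show_v03.cube",
    },
    "deliverables": {
        "editorial": {"codec": "ProRes 422 Proxy", "burnins": True},
        "vfx_plates": {"codec": "OpenEXR", "bit_depth": 16, "burnins": False},
        "vad_assets": {"engine": "Unreal Engine 5", "format": "uasset"},
    },
    "review_tool": "example_review_platform",
}

with open("master_delivery_spec.json", "w") as f:
    json.dump(MASTER_SPEC, f, indent=2)
```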

Pinocchio

What are some recent virtual production projects you’ve worked on?
I just finished up a few great projects as virtual production supervisor. I was able to team up again with VFX supervisor Kevin Baillie, DP Don Burgess, ASC, and director Bob Zemeckis on his feature film Here, starring Tom Hanks and Robin Wright and based on Richard McGuire’s graphic novel. This is the same team I worked with on Bob’s version of Disney’s Pinocchio.

I also worked on The Muppets Mayhem series on Disney+ with DP Craig Kief and the team at The Muppets Studio. And then a fun series update of Yo Gabba Gabbaland with directors Christian Jacobs and Scott Schultz, and producer Ritamarie Peruggi. This will be coming to Apple TV+ in 2024. Also, I worked on Amazon MGM Studios’ You’re Cordially Invited, starring Will Ferrell and Reese Witherspoon. I worked with director Nicholas Stoller and DP John Guleserian.


Alyssa Heater is a writer and marketer in the entertainment industry. When not writing, you can find her front row at heavy metal shows or remodeling her cabin in the San Gabriel Mountains.

