By Steven Calcote
If you have the time, budget and access to spin up a production on an LED volume, then go for it! It’s amazing technology, and there’s a reason shows like The Mandalorian take our breath away. But if you’re like 99% of the production world, greenscreen is probably still the right answer for virtual production (VP). And it’s worth mentioning that VP may not always be the best solution. Sometimes a location shoot is still the best way to tell your story.
To be clear, when we say, “virtual production,” we mean real-time integration of live-tracked CG environments with a platform like Unreal Engine. As with LED-driven VP, we too aim to end our shoot day with the final product — often referred to as final pixels — in the can and ready for editorial. But for Butcher Bird shoots, we’re focused on real-time compositing with a greenscreen rather than a moving frustum on an LED wall.
Greenscreen
Over the past three years, across a wide range of shows — from Netflix’s weeklong live series Geeked Week to narrative and commercial projects — our creative content company has chosen greenscreen virtual production for reasons that include infrastructure, flexibility, time, budget and multi-cam production.
Our stage was constructed seven years ago with a three-wall cyc, so we already have a great setup for a greenscreen volume. But even if we had started with video walls in mind, outfitting an equivalent amount of imaging space using LED tech would have required a massive infrastructure upgrade to accommodate power draw, truss support and maintenance. This doesn’t even take into account the LED processors and multiple panel replacements and upgrades, given that display technology operates on the same Moore’s Law schedule that drives computing.
Also, with how often Butcher Bird needs to turn over our stage from virtual production shows to full-set builds and back again, permanent LED walls simply wouldn’t give us the flexibility we need. They would, at the very least, significantly encroach on our shooting space for non-VP shows and make it impossible to pull around the sliding black drapes that turn our stage into a black box when needed. Finally, we find that it’s faster to jump into a show on greenscreen because we can skip the color calibration, sync, moire and other troubleshooting associated with LED walls.
But perhaps most important for us, we need to be able to shoot with multiple cameras — as many as six or more for many of our live shows — which greenscreen can handle as long as we dedicate a game engine and tracking setup for each camera. At this point, most LED walls max out at two cameras with one frustum for each.
LED
There are core advantages to embracing an LED workflow. To name just a few: You won’t have to worry about correcting for green spill; avoiding shiny furniture, costumes and props; eradicating green from your real-world color palette; or maintaining a minimum lighting level to generate good keys.
With an LED volume, you can add atmospherics like real-world haze, rain or dust as well as colorful interactive lighting that would ruin a greenscreen composite. For example, if you’re shooting large reflective objects like cars, then plan to use LED walls from the start. And the big, beautiful virtual oceans of Netflix’s 1899 and HBO’s Our Flag Means Death wouldn’t have been economically or aesthetically feasible on greenscreen.
But whichever capture volume setup you choose, you’ll still need to overcome virtual production’s key challenges: (A) making sure scenes are performing at a high enough frame rate to make real-time capture possible, (B) preparing both real and virtual environments well enough in advance to make sure they’re ready to shoot, (C) supporting a brand-new discipline of highly skilled technicians and artists, and (D) making sure that your VP choices don’t wag the dog when it comes to the story you want to tell.
Lines Blurring?
I’m happy to share a little production secret with you: We can still achieve a number of perceived LED wall advantages on greenscreen by applying some additional hardware and software solutions. With the latest series of high-end Nvidia graphics cards and Unreal Engine 5 advances, we’ve started to add real-time atmospheric effects digitally that previously could only be added practically.
Image-based LED lighting solutions from companies like Astera, Aputure and Quasar Science — using the same Unreal Engine files as the main show — enable us to add edge and foreground lighting interactions that further enhance the realism of our composites. And for situations with tricky reflective surfaces, we can deploy resources like an 82-inch consumer 4K LED screen to capture detailed close-up image interaction, a technique we recently used on a sci-fi short featuring an astronaut’s helmet visor reflecting an alien landscape.
What’s Ahead
Looking to the future, we couldn’t be more excited by the incredible pace of new storytelling technologies that will make virtual production even easier. Count on AI add-on apps to appear at every link in the chain — from Unreal Engine to automated image correction, compositing cleanup, real-time motion-capture smoothing and streaming software — making virtual production faster, cheaper and more beautiful than ever before.
Steven Calcote is a partner and director at Butcher Bird Studios in Los Angeles.