A hybrid live/post technique is how writer/director Steven Calcote and showrunner Lillian Diaz-Przybyl approached Orbital Redux, an eight-episode adventure that was originally performed and broadcast live, featuring studio musicians, real-time special effects, multiple cameras, live switching, audience interactivity and more. The story follows a former astronaut who is tasked with teaching a new pilot the ropes of the space program.
Now, following a year of post production, Butcher Bird Studios has released the final version of this series on the sci-fi channel Dust. We recently caught up with the creators to find out why they took their unusual production/post approach.
Why shoot a sci-fi film live?
Lillian Diaz-Przybyl: To some extent, because we could! We’d been doing a lot of experimenting in the live space and worked heavily with our friends at Blackmagic to integrate everything from our cameras to our switchers and recording setup so it all worked together seamlessly.
Steven Calcote: There’s nothing like the adrenaline of real-time storytelling, but even though we filmed it live, we used a post mindset from the start. We always planned to release the project in two stages, so we needed clean ISO recordings of everything.
With as many as 10 cameras operating at any time (including Blackmagic’s URSA Mini Pro and Micro Studio Camera 4K), it was crucial that we preserved every frame of the interactive live show (think “rough cut”) for the editing needed to create our definitive version.
While the show was initially live-edited with the ATEM Television Studio 4K, we recorded clean feeds of all cameras through a stack of HyperDeck Studios.
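(A technical aside for the curious: HyperDecks can be controlled over a simple text protocol on TCP port 9993, so a whole stack can be rolled together from one script. The sketch below is a hypothetical illustration; the IP addresses and clip-naming scheme are invented, and it is not necessarily how the Orbital Redux team triggered their ISO recordings.)

```python
# Hypothetical sketch: roll a stack of HyperDeck Studios together over their
# Ethernet control protocol (plain-text commands on TCP port 9993).
# The IP addresses and clip names below are invented for illustration.
import socket

HYPERDECKS = ["192.168.10.51", "192.168.10.52", "192.168.10.53"]  # one per ISO feed
EPISODE = "ep03"

def send_command(ip: str, command: str, port: int = 9993, timeout: float = 3.0) -> str:
    """Open a connection, read the greeting banner, send one command, return the reply."""
    with socket.create_connection((ip, port), timeout=timeout) as sock:
        sock.recv(4096)                          # discard the "500 connection info:" banner
        sock.sendall((command + "\r\n").encode("ascii"))
        return sock.recv(4096).decode("ascii")   # expect something like "200 ok"

if __name__ == "__main__":
    # Start a named recording on every deck so the ISO files sort together in post.
    for i, ip in enumerate(HYPERDECKS, start=1):
        reply = send_command(ip, f"record: name: {EPISODE}_cam{i:02d}")
        print(f"{ip} -> {reply.strip()}")
```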
How does prep change when filming for a live audience?
Diaz-Przybyl: Our motto as an organization has always been “fix it in pre,” but you have to double down on that for live, from previsualization to prepping all of our VFX for the projection screens and shipboard monitors ahead of time.
Calcote: I find that live narrative requires a deeper understanding of the world, characters and plot from everyone on both sides of the camera. But more fundamentally, all the on-set elements need to be functional enough for the actors to interact with them while filming.
For instance, our interface designer Jason Milligan used the interactive digital prototyping tool ProtoPie to create functional touch interfaces for the spaceship using NASA UI as a reference. These files were all preserved for post as well in case any compositing touch-ups were needed.
What changed between the live cut and the final cut?
Diaz-Przybyl: We wanted to honor and celebrate the original live cut, so the changes between that and the final version are extensive but generally subtle. We are an Adobe shop, so we pulled all our ISOs and the live line-cut into Premiere (averaging 12 to 15 video tracks), which let us easily tweak our camera choices or the timing of individual cuts.
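(A quick technical aside: when stacking a dozen or more ISOs against a live line-cut, it helps to confirm that every recording reports the same start timecode before conforming. The sketch below is a hypothetical way to run that check with ffprobe; the folder name is invented, and this is not a step the Orbital Redux team describes.)

```python
# Hypothetical sketch: sanity-check that a folder of camera ISOs all report the
# same start timecode before stacking them into a multitrack Premiere sequence.
# ffprobe must be on the PATH; the folder layout is invented for illustration.
import json
import subprocess
from pathlib import Path

def start_timecode(clip: Path) -> str | None:
    """Return the embedded start timecode of a clip, if ffprobe can find one."""
    result = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_format", "-show_streams", str(clip)],
        capture_output=True, text=True, check=True,
    )
    info = json.loads(result.stdout)
    tc = info.get("format", {}).get("tags", {}).get("timecode")
    if tc:
        return tc
    for stream in info.get("streams", []):
        tc = stream.get("tags", {}).get("timecode")
        if tc:
            return tc
    return None

if __name__ == "__main__":
    clips = sorted(Path("ep03_isos").glob("*.mov"))
    codes = {clip.name: start_timecode(clip) for clip in clips}
    for name, tc in codes.items():
        print(f"{name}: {tc or 'no timecode found'}")
    if len({tc for tc in codes.values() if tc}) > 1:
        print("Warning: ISOs report different start timecodes; check sync before conforming.")
```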
Calcote: The one area we haven’t been able to conquer for live narrative is post-style color grading with multiple power windows, moving mattes and more. I was excited to create a new workflow with our very patient colorist, Nick Novotny, since we wanted to keep our editing choices flexible even after sending the show to color using Resolve. Rather than collapsing the edit to a single track like a traditional turnover, we preserved eight active camera tracks per episode and transferred editorial sequences from Premiere to Resolve via XML.
Since we significantly updated the look of the show, with a color palette, grain and falloff that evoked 1970s Soviet science-fiction films, keeping those camera tracks let us reexamine some of our edit choices without requiring a new round trip from Premiere.
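(For editors who would rather script that hand-off than import the XML by hand, Resolve also exposes a Python scripting API. The sketch below is a hypothetical illustration of pulling a Premiere-exported XML into the current Resolve project; the file path and project name are invented, and it is not a description of Butcher Bird’s actual turnover.)

```python
# Hypothetical sketch: import a Premiere-exported XML sequence into Resolve
# using the DaVinci Resolve scripting API (run from Resolve's console, or with
# the scripting environment variables set). Names and paths are invented.
import DaVinciResolveScript as dvr_script

resolve = dvr_script.scriptapp("Resolve")
project_manager = resolve.GetProjectManager()
project = project_manager.GetCurrentProject()  # or CreateProject("Orbital Redux EP03")

media_pool = project.GetMediaPool()

# ImportTimelineFromFile builds a timeline from the XML and links to media on disk.
timeline = media_pool.ImportTimelineFromFile("/turnovers/ep03_color_turnover.xml")

if timeline:
    print(f"Imported: {timeline.GetName()} "
          f"({timeline.GetTrackCount('video')} video tracks)")
else:
    print("Import failed; check the XML and the media paths.")
```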
What about the mix?
Diaz-Przybyl: The live sound mix was another critical area. After sound designer Alex Choonoo played thousands of sound effects live in Ableton Live for the initial broadcast, he output all of those files as new audio ISOs and then expanded the series’ SFX bed with hundreds of new sounds.
From there, post sound mixer Ben Chan took the 32 dialogue and music tracks recorded off the Behringer X32 during the live show and blew us away with a new Avid Pro Tools mix of the dozens of tracks we brought back online for the final cut.
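(One more hypothetical aside: before dozens of stems go into a mix session, it’s worth confirming they all share a sample rate and length. The sketch below runs that check with the soundfile library; the folder layout is invented, and this is not a description of Chan’s actual prep.)

```python
# Hypothetical sketch: verify that a folder of live-recorded stems (e.g. the 32
# board tracks plus new SFX ISOs) all share one sample rate and length before
# building a mix session. Uses the soundfile library; paths are invented.
from pathlib import Path
import soundfile as sf

def describe(stem: Path) -> tuple[int, int]:
    """Return (sample_rate, frame_count) for a WAV stem."""
    info = sf.info(str(stem))
    return info.samplerate, info.frames

if __name__ == "__main__":
    stems = sorted(Path("ep03_stems").glob("*.wav"))
    stats = {stem.name: describe(stem) for stem in stems}

    for name, (rate, frames) in stats.items():
        print(f"{name}: {rate} Hz, {frames / rate:.2f} s")

    if len({rate for rate, _ in stats.values()}) > 1:
        print("Warning: mixed sample rates; resample before conforming.")
    if len({frames for _, frames in stats.values()}) > 1:
        print("Note: stems differ in length; confirm they share a start point.")
```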
Is this two-stage release format here to stay?
Diaz-Przybyl: Especially with the rise of virtual production, I think it’s likely that this process will continue, and expand. The tools are there, even for smaller organizations like ours. Live performance gives the audience a reason to tune in and gives “appointment viewing” a lot more appeal. But it’s great to take the time to polish and refine what is essentially a “live rough cut” to get it to that next level, with real staying power.
Calcote: I totally agree with Lillian that “virtual production” is driving a filmmaking renaissance, where we’re striving to get final pixels during shooting. (That’s a focus of our next narrative project using Unreal Engine.) But audiences also want to be involved with storytelling like never before, given the rise of platforms like Twitch and TikTok.
A two-stage release, where you shape a live rough cut with your biggest fans and then put out a final version with a full theatrical-level post process, satisfies audiences in a whole new way. Isn’t that what the evolution of storytelling is all about?