The latest season of BuzzFeed’s food show Making It Big required an innovative approach to satisfy both current fans of the 2D show on Tasty’s YouTube channel and new viewers of the VR edition in the Meta Horizon Worlds space “Mega Tasty” and on Meta Quest TV. Butcher Bird solved the challenge by adding Canon’s stereoscopic VR lens to a multi-camera broadcast television workflow.
The two versions of the show demanded different styles and pacing, but the production schedule and massive food creations, such as a 50-pound gummy bear or a 30-pound cinnamon roll, required everything to be filmed simultaneously, without changing setups to accommodate the separate needs of 2D and VR.
Butcher Bird’s Steven Calcote, this season’s director, explains how their framing of host Twaydabae created an additional challenge: “We wanted VR viewers to feel like they’re just hanging with Tway in her kitchen as she speaks directly to them. But that meant the VR camera also had to supply the 2D wide shot, since we couldn’t shoot multiple takes.”
The production team developed a unique workflow based on their experience creating several VR shows with Canon over the last year and filming with the Canon RF 5.2mm F2.8 L Dual Fisheye lens.
“The combination of that lens with a Canon EOS R5 was the first time we’d seen VR footage with enough resolution and quality that we could extract a beautiful 16×9 image at the same time,” explains Butcher Bird executive producer MeeRa Kim. “Therefore, we knew this setup would also create a perfect wide shot for our 2D show. We were then able to cut and color it seamlessly with the other Canon EOS cinema cameras on the show: two C300 Mark IIIs for coverage and a C70 for overhead inserts.”
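To give a rough sense of what that extraction step involves (this is an illustrative sketch, not the team’s actual pipeline), the snippet below re-projects the center of one eye of a 180-degree equirectangular VR frame, the format dual-fisheye footage is typically converted to, into a flat 16×9 image. The file names, output size and field of view are assumptions made for the example.

```python
import cv2
import numpy as np

def equirect_to_rectilinear(equi, out_w=1920, out_h=1080, fov_deg=90.0):
    """Re-project the center of a 180-degree equirectangular image onto a
    flat (pinhole) 16x9 view via an inverse mapping and cv2.remap."""
    h, w = equi.shape[:2]
    f = (out_w / 2.0) / np.tan(np.radians(fov_deg) / 2.0)  # focal length in pixels

    # Pixel grid of the target 16x9 image, centered on the optical axis.
    xs = np.arange(out_w) - out_w / 2.0
    ys = np.arange(out_h) - out_h / 2.0
    xv, yv = np.meshgrid(xs, ys)

    # Viewing direction for each output pixel, expressed as yaw/pitch angles.
    lon = np.arctan2(xv, f)
    lat = np.arctan2(yv, np.sqrt(xv**2 + f**2))

    # Map the angles back to source pixels. A 180-degree equirectangular
    # frame spans -90..+90 degrees both horizontally and vertically.
    map_x = ((lon / (np.pi / 2.0)) * 0.5 + 0.5) * (w - 1)
    map_y = ((lat / (np.pi / 2.0)) * 0.5 + 0.5) * (h - 1)

    return cv2.remap(equi, map_x.astype(np.float32), map_y.astype(np.float32),
                     interpolation=cv2.INTER_LINEAR)

# Hypothetical usage: one eye of a side-by-side equirectangular still.
frame = cv2.imread("vr_frame_left_eye.png")
wide_shot = equirect_to_rectilinear(frame)
cv2.imwrite("wide_shot_16x9.png", wide_shot)
```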
While the hybrid shooting format supplied editorial with ample 2D and VR footage, lead 3D editor Mason Ross says the additional challenge was matching styles between the two mediums.
“VR generally favors longer shot lengths to keep viewers comfortable and oriented. But the previous 2D seasons of Making It Big had already established a fun, kinetic cutting style with shot lengths averaging around 2.5 seconds. So we matched that energy in VR by adding floating 2D windows of alternate video angles, locked-off jump cuts to speed up time, and dynamic VR graphics layers.”
Viewers who watch both versions of an episode will experience a different perspective on making giant food with Tway, but Calcote admits the VR version, watched in a headset (they used the Meta Quest Pro 2), delivers the most visceral thrills. “When she pours gallons of hot cherry liquid into a giant gummy bear mold right in front of you, you’re going to instinctively lean back to give her more room!”
For post, Butcher Bird used Adobe Premiere Pro with the Canon EOS VR plugin and DaVinci Resolve 17, running on PCs equipped with Nvidia GeForce RTX 3090 GPUs.
We reached out to Calcote to find out more…
How was this project different from others that you’ve done?
Our past projects with dual VR and 2D delivery always required separate shooting setups and scheduling, which increased both cost and production time. With a simultaneous hybrid workflow, we were able to shoot both formats at once, which is great all around, from budget considerations to talent energy and performance.
This is also the first time we were able to seamlessly cut VR footage into a 2D show (i.e., 16×9 extracts of the VR frame) at the same time we cut 2D footage into a VR show (as floating picture-in-picture windows). Because we shot everything using the Cinema EOS broadcast workflow, all our footage displayed the same cinematic color and quality despite radically different formats.
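For a sense of how a flat 2D clip can sit inside a 180-degree frame as a floating window, here is a minimal sketch that gnomonically projects a 16×9 frame into the center of an equirectangular image; this is one illustrative approach, not the show’s actual graphics pipeline, and the panel size, placement and file handling are assumed values.

```python
import cv2
import numpy as np

def composite_floating_window(equi, window, fov_deg=40.0):
    """Blend a flat 16x9 frame into the center of a 180-degree
    equirectangular frame so it reads as a floating panel."""
    h, w = equi.shape[:2]
    wh, ww = window.shape[:2]

    # Yaw/pitch of every equirectangular pixel (one eye, 180x180 degrees).
    lon = (np.arange(w) / (w - 1) - 0.5) * np.pi
    lat = (np.arange(h) / (h - 1) - 0.5) * np.pi
    lon_v, lat_v = np.meshgrid(lon, lat)

    # Gnomonic projection: where each viewing ray hits a virtual screen
    # plane one unit in front of the viewer.
    x = np.tan(lon_v)
    y = np.tan(lat_v) / np.cos(lon_v)

    # The panel spans +/- tan(fov/2) horizontally, with a 16x9 aspect ratio.
    half_w = np.tan(np.radians(fov_deg) / 2.0)
    half_h = half_w * 9.0 / 16.0

    # Map plane coordinates to window pixels; mask everything off-panel.
    u = (x / half_w * 0.5 + 0.5) * (ww - 1)
    v = (y / half_h * 0.5 + 0.5) * (wh - 1)
    inside = np.isfinite(u) & np.isfinite(v) & \
             (u >= 0) & (u <= ww - 1) & (v >= 0) & (v <= wh - 1)
    u_safe = np.clip(np.nan_to_num(u), -1, ww).astype(np.float32)
    v_safe = np.clip(np.nan_to_num(v), -1, wh).astype(np.float32)

    sampled = cv2.remap(window, u_safe, v_safe,
                        interpolation=cv2.INTER_LINEAR)
    out = equi.copy()
    out[inside] = sampled[inside]
    return out
```

Projecting the panel gnomonically, rather than pasting it into a rectangle of longitude/latitude, keeps the clip looking like a flat screen floating in front of the viewer instead of bowing with the sphere.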
Did the technology make your job as director easier on this one?
Although I love using the latest tech as a director, the only thing that really matters is how the audience feels while they’re watching the finished show. I knew we were on the right track when our first viewers in the headset kept saying things like, “Wow, this is beautiful!” I’m heartened that cinematic quality means as much in VR as it does on the big screen.