
VFX and Post for Netflix’s Our Universe

UK-based Lux Aeterna played a significant role in the making of the Netflix series Our Universe, bringing its extensive visual effects expertise to the project, alongside the production team at the BBC Science Unit.


We reached out to Lux Aeterna’s VFX director Paul Silcox and CEO/creative director Rob Hifle, Our Universe director Stephen Cooter and the team at Halo Post to discuss the challenges they faced and how they tackled them.

How did Our Universe stand out against similar productions you’ve worked on? From both a VFX perspective and the overall look and direction of the series?
Stephen Cooter: It was a pretty unique proposition — combining natural history with VFX to tell the animals’ stories, not just in the context of the planet’s history, but in how they connect with the story of the universe itself, was something that I don’t think had ever been attempted before. We chose to shoot in a 2.39:1 aspect ratio to give the series the cinematic look that the epic nature of the storytelling required. Taking the science fiction films of Steven Spielberg and JJ Abrams as reference, we shot with anamorphic lenses where possible — using the characteristic lens flare to tie the natural history footage together with the VFX space sequences.



From the outset, we knew that linking the wildlife stories to the universe narrative was a vital part of the series. We worked closely with Lux Aeterna to develop transitions between the two strands and sequences, where we would integrate VFX within the natural history footage to connect our hero creatures to the cosmos.

How early on did Lux Aeterna get involved in the project? Had you collaborated with the BBC before?
Rob Hifle: We have a long-standing relationship with the BBC Science Unit. We’ve worked on a multitude of Prof. Brian Cox series over the last 15 years, including Wonders of the Universe, Human Universe and Wonders of Life. We also collaborated on the award-winning 8 Days: To the Moon and Back, a documentary drama featuring real declassified astronaut cockpit audio from the Apollo 11 mission, with actors “lipsyncing” the words. We used immersive VFX techniques to establish a first-person astronaut POV on the moon.



The BBC Science Unit has always wanted creative cutting-edge techniques in order to showcase its latest science revelations, so it’s always been a good fit with us, especially with our R&D department.

Were there any particularly complex scenes and if so, how did you navigate them?
Paul Silcox: Our Universe challenged us in many ways, both technically and creatively. One day we might be thinking big, visualizing the invisible forces that protect our atmosphere or erode black holes; the next day we might have to demonstrate fusion at an atomic scale. We were working with extremely large data sets, so while we were art directing the collision of planets or the destruction of moons, we also had to solve those shots technically.

Aside from the directors, how closely did you work with the DPs?
Silcox: We worked closely with Mike Davis (the showrunner) and all of the directors to craft the vision for the show. Our brief was to create a cinematic, immersive, entertaining and scientifically accurate depiction of the forces that shaped the lives of the animals featured in the show. By collaborating closely with the team throughout the process, we were able to keep this goal alive and deliver a cohesive vision.

Tell us about the experience on the virtual shoot. What was your input from a VFX perspective here? Did director Stephen Cooter provide a brief beforehand?
Hifle: I’ve worked with virtual studios on numerous occasions but never with a wild “habituated” bear. It was an amazing experience to work on this shoot. The crew in Hungary was highly skilled. There was a huge amount of planning from Stephen Cooter and the BBC in making sure that everything was considered and covered… as much as it’s possible to plan to work with an unpredictable brown bear! That meant we needed to work quickly in order to keep the bear’s time on-set to a minimum.


We planned and artworked all the backplates beforehand, but there was still a need to work alongside the director and DP on-set to get the desired lighting and perspective as well as any last-minute amendments. With the virtual studio backplates, I was able to move the elements around on the screen, such as the moon. This meant we could frame up really quickly using the Technocrane and make changes to the backplates while working with the foreground bear on a practical rock. This flexibility meant that it worked really well for all departments.

Did you come up against any challenges during the shoot? If so, how did you resolve them?
Cooter: There were some sequences that were really important to the narrative of the films but that would’ve been very difficult or dangerous to shoot in the wild, so we needed to take a different approach. For example, to illustrate the connection between the Alaskan brown bear and the moon, we used a virtual studio and backdrops provided by Lux Aeterna to achieve the shot.

How closely did you work with Halo Post to post the final project? Was it full post — including edit, grade and finish? What about audio?
Cooter: We were with Halo for the offline edit and finishing. That meant we were able to call on their sound team — led by dubbing mixer Sam Castleton — throughout the editing process not only to provide a library of sounds, but also to design specific sequences where the audio was crucial to the impact and drama of the universe VFX shots. The planetary-scale collision that created the moon in Episode 3, “Turning Seasons,” is a good example of this. We used Avid Media Composer.

For the natural history sequences, the sound design was done by Wounded Buffalo.

Sam, can you talk about the audio post on this one?
Sam Castleton: The scale of Our Universe was gigantic and presented us with some amazing opportunities and interesting challenges. This led us to create some incredible sound design moments. Mixing it in Dolby Atmos enabled us to achieve the scale and definition required. It also enabled us to bring the stories to life in interesting ways, such as water shooting out of the earth’s core, stardust falling from the sky, photons bursting out of the sun and protostars propelling themselves into space. The sound of the series is very brave, bright and full. We are very proud of what we achieved.

What about the color grade?
Cooter: The series was graded by Halo senior colorist Duncan Russell. Working together, we approached the films on a scene-by-scene basis, giving each one its own look depending on the mood and atmosphere the scene demanded. While we wanted to preserve the naturalistic look of the natural history sections, grading the universe VFX in HDR and delivering in Dolby Vision allowed us to push those sequences much further — they feel like exactly the kind of thing HDR was invented for.

Duncan, what was the challenge for you?
Duncan Russell: The challenge was to push it as far as it would go but still be able to match it all up. When I saw the first rushes coming back from Australia and Southern Africa, I knew we were onto something special. I had never seen natural history made with such visual flamboyance, and the use of anamorphic lenses for large parts was a masterstroke.

The directors encouraged me to take things into the cinematic realm, not to be restrained by existing styles and to find a visual language that pushed at the edges of what a “nature show” could look like. I am more than proud to be involved.

What tools were used for the VFX?
Silcox: We used SideFX Houdini and Foundry Nuke. The power and flexibility of these tools make delivering cutting-edge visuals possible. We manage and render our VFX with ShotGrid and Deadline, which are both essential components of our pipeline.
