
Creating Titles for Netflix’s Avatar: The Last Airbender

Method Studios collaborated with Netflix on the recently released live-action adaptation of the series, Avatar: The Last Airbender. The series, developed by Albert Kim, follows the adventures of a young Airbender named Aang, and his friends, as they fight to end the Fire Nation’s war and bring balance to the world. Director and executive producer Jabbar Raisani approached Method Studios to create visually striking title cards for each episode — titles that not only nodded to the original animated series, but also lived up to the visuals of the new adaptation.

The team at Method Studios, led by creative director Wes Ebelhar, concepted and pitched several different directions for the title before deciding to move forward with one called Martial Arts.

“We loved the idea of abstracting the movements and ‘bending’ forms of the characters through three-dimensional brushstrokes,” says Ebelhar. “We also wanted to create separate animations to really highlight the differences between the elements of air, earth, fire and water. For example, with ‘Air,’ we created this swirling vortex, while ‘Earth’ was very angular and rigid. The 3D brushstrokes were also a perfect way to incorporate the different elemental glyphs from the opening of the original series.”

Giving life to the different elemental brushstrokes was no easy task. “We created a custom procedural setup in Houdini to generate the brushstrokes, which was vital for giving us the detail and level of control we needed. Once we had that system built, we were able to pipe in our original previz, and they matched the timing and layouts perfectly. The animations were then rendered with Redshift and brought into After Effects for compositing. The compositing ended up being a huge task as well,” explains Ebelhar. “It wasn’t enough to just have different brush animations for each element; we wanted the whole environment to feel unique for each — the Fire title should feel like it’s hanging above a raging bonfire, while Water should feel submerged with caustics playing across its surface.”
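
For readers curious what this kind of setup might look like in code, below is a minimal, standalone Python/NumPy sketch of procedural brushstroke generation along a guide curve (uniform resampling, a tapered width profile and layered noise). It is purely illustrative: it is not Method Studios’ Houdini tool, and every function and parameter name here is invented for the example.

```python
# Minimal, standalone sketch of procedural brushstroke generation: sample a
# guide curve, taper the width toward the tip, and push points around with
# layered noise so each element (air, earth, fire, water) could swap in its
# own motion style. Illustrative only; not Method Studios' production tool.
import numpy as np

def brushstroke(guide_pts, samples=200, max_width=0.3, noise_amp=0.05, seed=0):
    """Return a (samples, 2, 3) ribbon of left/right vertices along a guide curve."""
    rng = np.random.default_rng(seed)
    guide = np.asarray(guide_pts, dtype=float)

    # Resample the guide curve uniformly by arc length.
    seg = np.linalg.norm(np.diff(guide, axis=0), axis=1)
    u = np.concatenate([[0.0], np.cumsum(seg)]) / seg.sum()
    t = np.linspace(0.0, 1.0, samples)
    center = np.stack([np.interp(t, u, guide[:, i]) for i in range(3)], axis=1)

    # Tangents and a simple side vector to orient the ribbon.
    tangent = np.gradient(center, axis=0)
    tangent /= np.linalg.norm(tangent, axis=1, keepdims=True)
    side = np.cross(tangent, np.array([0.0, 1.0, 0.0]))
    side /= np.linalg.norm(side, axis=1, keepdims=True)

    # Brush profile: thick at the start, tapering to a point at the tip.
    width = max_width * (1.0 - t) ** 0.5

    # Cheap layered random-walk noise to break up the stroke; a real setup
    # would use curl noise and element-specific animation instead.
    noise = sum(rng.standard_normal((samples, 3)) / (2 ** o) for o in range(3))
    center = center + np.cumsum(noise, axis=0) * noise_amp / samples

    left = center - side * width[:, None] * 0.5
    right = center + side * width[:, None] * 0.5
    return np.stack([left, right], axis=1)

# Example guide curve roughly tracing an "Air"-style swirl in the XZ plane.
theta = np.linspace(0.0, 4.0 * np.pi, 50)
guide = 0.1 * np.stack([np.cos(theta) * theta,
                        np.zeros_like(theta),
                        np.sin(theta) * theta], axis=1)
print(brushstroke(guide).shape)  # (200, 2, 3)
```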

Ebelhar says many people were involved in bringing these titles to life and gives “a special shout out to Johnny Likens, David Derwin, Max Strizich, Alejandro Robledo Mejia, Michael Decaprio and our producer Claire Dorwart.”

VFX Supervisor Sam O’Hare on Craig Gillespie’s Dumb Money

By Randi Altman

Remember when GameStop, the aging brick-and-mortar video game retailer, caused a stir on Wall Street thanks to a stock price run-up that essentially resulted from a pump-and-dump scheme?

Director Craig Gillespie took on this crazy but true story in Dumb Money, which follows Keith Gill (Paul Dano), a normal guy with a wife and baby who starts it all by sinking his life savings into GameStop stock. His social media posts start blowing up, and he makes millions, angering the tried-and-true Wall Street money guys who begin to fight back. Needless to say, things get ugly for both sides.

Sam O’Hare

While this type of film, which has an all-star cast, doesn’t scream visual effects movie, there were 500 shots, many of which involved putting things on computer and phone screens and changing seasons. To manage this effort, Gillespie and team called on New York City-based visual effects supervisor Sam O’Hare.

We reached out to O’Hare to talk about his process on the film.

When did you first get involved on Dumb Money?
I had just finished a meeting at the Paramount lot in LA and was sitting on the Forrest Gump bench waiting for an Uber when I got a call about the project. I came back to New York and joined the crew when they started tech scouting.

So, early on in the project?
It wasn’t too early, but just early enough that I could get a grip on what we’d need to achieve for the film, VFX-wise. I had to get up to speed with everything before the shoot started.

Talk about your role as VFX supervisor on the film. What were you asked to do?
The production folks understood that there was enough VFX on the film that it needed a dedicated supervisor. I was on-set for the majority of the movie, advising and gathering data, and then, after the edit came together, I continued through post. Being on-set means you can communicate with all the other departments to devise the best shoot strategy. It also means you can ensure that the footage you are getting will work as well as possible and minimize costs in post.

I also acted as VFX producer for the show, so I got the bids from vendors and worked out the budgets with director Craig Gillespie and producer Aaron Ryder. I then distributed and oversaw the shots, aided by my coordinator, Sara Rosenthal. I selected and booked the vendors.

Who were they, and what did they each supply?
Chicken Bone tackled the majority of the bluescreen work, along with some screens and other sequences. Powerhouse covered a lot of the screens, Pete Davidson’s car sequence, the pool in Florida and other elements. Basilic Fly handled the split screens and the majority of the paint and cleanup. HiFi 3D took on the sequences with the trees outside Keith Gill’s house.

I also worked closely with the graphics vendors since much of their work had to be run through a screen look that I designed. Since the budget was tight, I ended up executing around 100 shots myself, mostly the screen looks on the graphics.

There were 500 VFX shots? What was the variety of the VFX work?
The editor, Kirk Baxter, is amazing at timing out scenes to get the most impact from them. To that end we had a lot of split screens to adjust timing on the performances. We shot primarily in New Jersey, with a short stint in LA, but the film was set in Massachusetts and Miami, so there was also a fair amount of paint and environmental work to make that happen. In particular, there was a pool scene that needed some extensive work to make it feel like Florida.

The film took place mostly over the winter, but we shot in the fall, so we had a couple of scenes where we had to replace all of the leafy trees with bare ones. HiFi handled these, with CG trees placed using photogrammetry I shot on-set as a layout reference.

There was a fair amount of bluescreen, both in car and plane sequences and to work around actors’ schedules when we couldn’t get them in the right locations at the right times. We shot background plates and then captured the actors later with matched lighting to be assembled afterward.

Screens were a big part of the job. Can you walk us through dealing with those?
We had a variety of approaches to the screens, depending on what we needed to do. The Robinhood app features heavily in the film, and we had to ensure that the actors’ interaction with it was accurate. To that end, I built green layouts with buttons and tap/swipe sequences for them to follow, which mimicked the app accurately at the time.

For the texting sequence, we set up users on the phones, let the actors text one another and used as much of it as possible. Their natural movements and responses to texts were great. All we did was replace the bubbles at the top of the screen to make the text consistent.

For Roaring Kitty, art department graphics artists built his portfolio and the various website layouts, which were on the screens on the shoot. We used these when we could and replaced some for continuity. We also inserted footage that was shot with a GoPro on-set. This footage was then treated with a rough depth matte built in Resolve to give a lo-fi cut-out feel and then laid over the top of the graphics for the YouTube section.

The screen look for the close-ups was built using close-up imagery of LED screens, with different amounts of down-rez and re-up-rez to get the right amount of grid look for different screens and levels of zoom. Artists also added aberration, focus falloff, etc.
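
As an illustration of that down-rez/re-up-rez idea, here is a small, self-contained NumPy sketch that averages a frame down to a coarse LED resolution, blows it back up with nearest-neighbour repetition so the pixel grid reads, darkens the gaps between the virtual LEDs and nudges the red and blue channels apart for a touch of aberration. It is a toy approximation, not the Fusion setup used on the film, and the function and parameter names are invented.

```python
# Toy version of the "down-rez / re-up-rez" screen treatment described above.
# Not the production comp; just the core idea in a few lines of NumPy.
import numpy as np

def led_screen_look(img, cell=8, gap=1, aberration_px=1, grid_dim=0.35):
    """img: float HxWx3 array in 0..1; cell: LED size in output pixels."""
    h, w, _ = img.shape
    h, w = (h // cell) * cell, (w // cell) * cell   # crop to a whole grid
    img = img[:h, :w]

    # Down-rez: average each cell down to a single "LED" sample.
    led = img.reshape(h // cell, cell, w // cell, cell, 3).mean(axis=(1, 3))

    # Re-up-rez with nearest neighbour so the grid structure becomes visible.
    out = np.repeat(np.repeat(led, cell, axis=0), cell, axis=1)

    # Darken thin gaps between LEDs to sell the physical panel.
    mask = np.ones((h, w, 1))
    for g in range(gap):
        mask[g::cell] *= grid_dim
        mask[:, g::cell] *= grid_dim
    out = out * mask

    # Simple chromatic aberration: nudge red and blue in opposite directions.
    out[..., 0] = np.roll(out[..., 0], aberration_px, axis=1)
    out[..., 2] = np.roll(out[..., 2], -aberration_px, axis=1)
    return out

# Usage on a synthetic gradient frame:
frame = np.linspace(0.0, 1.0, 256 * 256 * 3).reshape(256, 256, 3)
print(led_screen_look(frame, cell=8).shape)  # (256, 256, 3)
```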

Any other challenging sequences?
We had very limited background plates for the car sequences that were shot. Many had sun when we needed overcast light, so getting those to feel consistent and without repeating took a fair bit of editing and juggling. Seamlessly merging the leafless CG trees into the real ones for the scene outside Keith Gill’s house was probably the most time-consuming section, but it came out looking great.

What tools did you use, and how did they help?
On-set, I rely on my Nikon D750 and Z6 for reference, HDRI and photogrammetry work.

I used Blackmagic Resolve for all my reviews. I wrote some Python pipeline scripts to automatically populate the timeline with trimmed plates, renders and references all in the correct color spaces from ShotGrid playlists. This sped up the review process a great deal and left me time enough to wrangle the shots I needed to work on.
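
As a rough idea of what such a script can look like, here is a minimal sketch that pulls a playlist’s versions from ShotGrid with the standard shotgun_api3 module and appends them to a fresh timeline through the DaVinci Resolve scripting API. The site URL, credentials, playlist ID and the hard-coded color-space value are placeholders, and O’Hare’s actual pipeline scripts were considerably more involved.

```python
# Minimal sketch: build a Resolve review timeline from a ShotGrid playlist.
# Assumes shotgun_api3 is installed and Resolve's scripting module is on the
# Python path; all credentials and field values below are placeholders.
import shotgun_api3
import DaVinciResolveScript as dvr_script

PLAYLIST_ID = 123  # hypothetical playlist

sg = shotgun_api3.Shotgun("https://yourstudio.shotgrid.autodesk.com",
                          script_name="resolve_review", api_key="xxxx")

# Find every Version linked to the playlist, with its rendered movie path.
versions = sg.find(
    "Version",
    [["playlists", "is", {"type": "Playlist", "id": PLAYLIST_ID}]],
    ["code", "sg_path_to_movie"],
)

resolve = dvr_script.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
media_pool = project.GetMediaPool()

# Import each version's movie into the media pool.
paths = [v["sg_path_to_movie"] for v in versions if v.get("sg_path_to_movie")]
clips = resolve.GetMediaStorage().AddItemListToMediaPool(paths)

# Tag an input color space per clip (only meaningful with Resolve color
# management; a real pipeline would map this from metadata, not hard-code it).
for clip in clips:
    clip.SetClipProperty("Input Color Space", "ACEScg")

# Build the review timeline and append the clips.
timeline = media_pool.CreateEmptyTimeline(f"review_playlist_{PLAYLIST_ID}")
media_pool.AppendToTimeline(clips)
print(f"Added {len(clips)} versions to {timeline.GetName()}")
```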

I did all my compositing in Blackmagic Fusion Studio, but I believe all the vendors worked in Foundry Nuke.

Felix Urquiza

AFX Creative Adds Felix Urquiza as ECD

Creative studio AFX Creative has beefed up its VFX team with the addition of executive creative director Felix Urquiza. He joins with nearly 20 years of experience in the field, having worked at companies like Method Studios, The Mill and Team One, heading up the latter’s VFX/CG division, TiltShift, under the Team One/Publicis Groupe USA umbrella.

In his new role at AFX, Urquiza will lead the creative team and develop new strategies. In addition, he will work closely with the studio’s managing director, Nicole Fina, to introduce new clients to AFX and expand its services beyond what it currently offers.

“My goal is to bring a fresh perspective, something more personal and meaningful that will resonate not only with our internal teams but also our clients,” Urquiza notes. “Our work and capabilities are already there, and I am here to help take it to the next level. However, what’s more important to me is bringing an outside perspective to AFX. This will push our team and clients to a higher level of excitement and commitment, elevating our passion and vision of creativity.”

Throughout his career, having an outside perspective is what has propelled Urquiza from being a go-to VFX artist to a creative director and studio director. “I would describe my visual style as modern, clean-cut and pristine,” he explains. “Throughout my career, I have developed both technical and creative skills, and as a result, have become proficient in several areas, including building decks and treatments, writing and designing my own treatments for pitches, and leading the team.”

Early on, Urquiza was inspired to pursue VFX after seeing two James Cameron films. “When I was around 10 to 12 years old, there were two movies that blew my mind,” he recalls: “The Abyss and Terminator 2: Judgment Day. In The Abyss, there is a moment when a ‘water’ creature appears and forms into a girl’s face. I couldn’t understand how they did that. Ever since then, I have been fascinated by movies and how they bring amazing things to life using computers. In my sophomore year of high school, I took an elective for 3D graphics, and on the very first day of that class, I knew this is what I wanted to do. I started researching and connecting the dots, laid out my plan and moved to California. The rest is history.”

Urquiza has used that inspiration while working on projects for Activision, Nike, Bacardi, Samsung, Apple, Lexus, GM, Toyota and many more. In addition, he’s collaborated with agencies such as Team One, Saatchi & Saatchi, Leo Burnett, BBDO, McCann, Omnicom and Argonaut.

What he considers to be his primary career highlights include working on his first-ever film, Pirates of the Caribbean: At World’s End; doing a shoot with Zack Snyder during the opening weekend of 300; working on the game XCOM: The Bureau; and being nominated twice for a VES award.

“During my time working at places like The Mill and Method, I gained a lot of experience in understanding what it takes to achieve high-quality work and striving to be the best in the industry,” he says. “I also learned the importance of committing to providing a personalized experience for our clients. At TiltShift, I gained valuable insights into the business side of things, such as navigating holding companies and how the decision-making process impacts the overall success of a business. Drawing from these experiences, I am confident in my ability to set high standards for creative output, collaborate effectively with clients and bring strategic ideas to the table on the business end of things.”


Digital Domain’s VFX for Chafa and More for Marvel’s Echo

For the premiere season of Marvel Studios’ Echo, the visual effects team at Digital Domain helped visualize the origin story of the Choctaw people, a Native American tribe from the southeastern United States. To bring the story and the opening scene to the screen, Digital Domain worked closely with production and the Choctaw Nation to ensure the visuals and storytelling held true to the Choctaw culture.

The series opens in a dark cave located inside the Earth where we see Chafa, the first Choctaw, covered with clay, emerge from a glowing, swirling pool of magical blue water. Chafa exits the pool and drinks the water. As she stares down at her hands, tattoos glow and swirl onto them, and more clay people begin to appear. A biskinik bird then lands in her palm before flying away as an earthquake begins and the cave starts to collapse.

Chafa holds up the cave as the clay people seek safety. The cave collapses and in the next scene, Chafa and her people are seen in a field with trees alongside a grass mound. The clay begins to crack and shed from their bodies, and they are revealed in human form. The scene closes as the mighty Chafa leads her people.

The Cave
To bring this sequence to life, Digital Domain’s team of VFX artists focused on several key areas, including the cave, the pool, the clay people and the bishinik bird. The team tackled the cave environment first. Artists digitally built the cave asset, and because of lighting and the ambient atmosphere in which the cave was shot, they replaced the majority of the set, including the columns. This sequence also required a fair amount of roto work, as the clay people needed to be roto’d out to recreate the cave.

The Pool
To create the pool in the show, the team had to replace and simulate the practical pool. Artists created many iterations and gave the pool a celestial galaxy-like, swirly design. The sequence of Chafa emerging from the water was shot practically, so the Digital Domain team replaced the actor’s body, excluding her face. The actor could be seen waiting for a cue to emerge, so the team painted her out. Artists also simulated the actor’s body exiting the pool and the water that dripped from her.

Clay Transformations
The VFX team created two digidoubles showing the transformation as the characters shed the clay, revealing their human form. The Digital Domain team worked closely with the production team for this scene because the way the clay dried, cracked and peeled from the skin was art-directed. The close-up shots of the hands were complex due to the layers and lines within the skin. Additionally, the team created the mound, the grass and the trees for the background of the environment.

The VFX team at Digital Domain used Autodesk Maya for animation and layout; Maxon ZBrush and Foundry Mari for texturing and modeling; SideFX Solaris for rendering and simulation inside Houdini (also from SideFX); and Foundry Nuke for compositing.

The Digital Domain team also collaborated with another VFX vendor, ILM, sharing the asset for the biskinik bird. Although the bird was only in about five shots, creating the bird’s feathers was extremely intricate. For Episode 5, “Maya,” Digital Domain’s VFX team animated the last moment of the pow-wow scene.

Behind the Title: BlueBolt VFX Supervisor David Scott

David Scott is a visual effects supervisor at London-based BlueBolt, an independent studio that provides VFX for television and film.

“It’s run by a great bunch of industry pros, a lot of whom I’d worked with before in previous companies, like MPC,” explains Scott. “What is nice about being in a smaller company is the scope of work you get to do and the types of films and projects you work on. Your involvement in it is much more than in bigger studios, where things are much more departmentalized. Plus, you get to know almost everyone in the company, which is definitely not the case in bigger ones.”

Let’s hear more from Scott…

What does the role of VFX supervisor entail?
My primary responsibility is to ensure that the director’s vision and expectations are brought to fruition. The process can start during preproduction, where we break down the script, discuss the approach to shooting and identify where VFX may be required. Collaborating closely with the production team, we plan the shoot to capture the necessary elements for the shots.

David Scott

The Great

Once the shoot concludes, my focus shifts to the post phase at BlueBolt. Here, we discuss the specific requirements for each shot and plan our approach. Throughout the VFX process, we maintain regular reviews with the director. Our involvement extends into the digital intermediate stage, ensuring our contribution until the final shot is graded and officially locked. It’s a comprehensive journey from initial concepts to final shots, with constant collaboration to achieve the desired look.

What would surprise people the most about what falls under that title?
The number of meetings and reviews each shot has before it’s presented as final.

How long have you been working in VFX, and in what kind of roles?
I have been working in VFX for 20 years. I’ve worked in different companies throughout my career, mainly in London but also for a number of years in New Zealand. I started in the rotoscoping department, moving into prep and then compositing. Within compositing, I’ve been a lead and a comp supervisor, and for the past three years I’ve been VFX supervising.

The Great

How has the VFX industry changed in the time you’ve been working? The good and the bad.
So many aspects have changed, but the first thing that comes to mind is that the scale and complexity of projects has grown massively throughout my career in VFX. Before, a 300-shot show would book out a whole facility, whereas now the larger VFX houses can handle multiple shows, each with thousands of shots.

The upside is that we’re tackling more ambitious projects, pushing the boundaries of what’s visually possible. However, the downside is that timeframes haven’t kept pace with this expansion. The challenge lies in delivering high-quality work within the same, if not tighter, schedules.

Do you like being on-set for shots? What are the benefits?
There’s a unique energy and immediacy to the on-set environment. Being there allows for instant problem-solving, better collaboration with the production team and an intuitive understanding of the director’s vision. It’s all about soaking it up and ensuring the VFX fits seamlessly into the shots.

What do you see as a big trend that is happening now or maybe is on the verge of happening? Is it AI? If so, what are your thoughts on how it could be used for the good and not the bad in VFX?
Absolutely, AI and machine learning are undeniably making a significant impact on the world of VFX. While headline-grabbing applications like deepfakes and de-aging are understandably in the spotlight, the benefit of AI across the whole VFX workflow will bring massive gains.

David Scott

The Great

As these technologies develop, there’s immense potential for efficiency enhancement, optimizing the day-to-day processes. When integrated thoughtfully, AI has the power to become a valuable ally, boosting productivity and increasing creativity in the VFX industry.

Did a particular film inspire you along this path in entertainment?
There are so many from my childhood, but the standout is Who Framed Roger Rabbit. I remember they promoted it with a lot of behind-the-scenes information about the technology and techniques used, which I found so fascinating.

Where do you find inspiration?
My inspiration comes from everywhere. Reference is key when tackling shots, so I enjoy delving into stock footage sites, exploring YouTube and referencing other movies.

What’s your favorite part of the job?
I love that every show comes with its own set of challenges to solve, both technical and creative. Working with so many talented people, sharing ideas and developing them together is my favorite part.

If you didn’t have this job, what would you be doing instead?
Definitely graphic design. I studied graphic design at college and worked doing that for four years before making the jump into VFX.

David Scott

The Great

Can you name some recent work?
I’m currently working on Nosferatu. Previous work includes The Northman, The Great (Season 3), Avengers: Endgame and James Bond’s No Time to Die.

What tools do you use day to day?
Most of my day is spent in RV reviewing shots and in ShotGrid for everything else show-related. And if I need to work on specific shots, I’ll use Nuke for compositing.

Finally, what do you do to de-stress from it all?
When I’m mid-project, I find it hard to fully switch off, so exercise becomes key to relieve the stress. And if I have free time, the weather is good and the stars align, then I’ll play some golf.

Foundry Flix 7.0

Foundry Releases Flix 7.0 for Streamlined Preproduction

Foundry has launched Flix 7.0, an update to its preproduction software that helps studios develop stories by managing editorial round-tripping, storyboard revisions, file versioning and more.

Now offering integration with Autodesk Maya, Flix 7.0 enables both 2D and 3D artists to collaborate from anywhere globally using Flix as a central story hub. Snapshots and playblasts can be imported from Maya into Flix 7.0 as panels, then round-tripped to and from editorial. Flix manages naming, storing and organizing all files, and allows teams to provide feedback or revisit older ideas as the story is refined.

Foundry Flix 7.0

While Flix also connects to Adobe Photoshop and Toon Boom Storyboard Pro, the Maya integration provides the ability for layout and storyboard teams to work in tandem. These teams can now collaborate concurrently to identify areas for improvement in the story such as timing issues before they become too complicated and expensive to change later in production. 2D artists can bring Flix’s Maya panels into their drawing tool of choice so that they can trace over the viewport for faster storyboarding. 3D artists can reference 2D storyboard panels from Flix directly in Maya when building complex scenes or character models, providing additional time savings.

Flix 7.0 simplifies building new extensions with a new Remote Client API. This API allows studios to create custom tools that integrate with Flix using the same API as the built-in extensions for Maya and Photoshop. Documentation and example code for the Remote Client API are provided to help studios build custom integrations with their choice tools or to create entirely custom workflows. Flix 7.0’s new extension management system enables studio supervisors to test, update and audit all extensions, with the added ability to deploy them across production from a single place.

Flix 7.0 offers single sign-on (SSO), so IT teams can authenticate Flix users through their studio’s existing SSO platform to centrally manage secure access to story development assets for both staff and freelancers. Flix also supports multi-factor authentication to provide an added layer of security.

Other new features in Flix 7.0 include:

  • New metadata system — Scene data is now stored directly on each Flix panel. For example, for Maya users, global cameras, locators and file path data will be recorded for assets selected in the viewer.
  • Enhanced Adobe Premiere plugin — A multitude of updates and a new UI for the Flix Premiere Adapter eliminates limitations of previous versions, providing an efficient editorial workflow.
  • Photoshop plugin redesign — The Photoshop extension has been rebuilt, bringing users new UI customization options.
  • Updated notification preferences — The ability to turn off automatic email updates each time a panel is published or changed.


Poor Things

Union Provides VFX for Oscar-Nominated Poor Things

Union VFX provided 177 visual effects shots for Best Picture nominee Poor Things, which was directed by Yorgos Lanthimos through Element Pictures for Searchlight. Poor Things, written by Tony McNamara, is based on the 1992 novel of the same name by Alasdair Gray. The plot focuses on Bella Baxter (Emma Stone), a young Victorian woman brought back to life by the unorthodox Dr. Godwin Baxter (Willem Dafoe). Hungry to see and experience the world, Bella runs off on cross-continent adventures with debauched lawyer Duncan Wedderburn (Mark Ruffalo), and in time she grows steadfast in her stand for equality and liberation. Both Stone and Ruffalo got Oscar nods for their work on Poor Things, which picked up 11 nominations in total, including for Best Director.

Poor Things embraces authentic artificiality, blending a classical sensibility with a fantastical and science fiction-driven aesthetic. These surreal settings were crafted with Lanthimos’ vision in partnership with production designers James Price and Shona Heath, cinematographer Robbie Ryan (BSC, ISC) and the Union VFX team, led by creative director Simon Hughes and VFX producer Tallulah Baker, who were involved in the film from the earliest stages of preproduction.

Creating this fanciful place required a wide variety of complex and technical VFX that, whether subtle or obvious, fell seamlessly into the weird and wonderful world of Poor Things. To facilitate the numerous techniques involved, Union designed a variety of bespoke workflows specifically for the creative and diverse VFX required throughout the feature.

The film begins in a largely self-contained mansion set and grows in size, scope and breadth to increasingly gargantuan scales as Bella furthers her journey of discovery, both internally and externally.

This journey brings Bella to London, Paris, Lisbon and Alexandria, which look like works of art in their own right but also function within the story and fabric of the film. The decision to shoot on film — in a combination of both color and black-and-white, with periodic use of fisheye lenses — added layers of complexity and challenges for VFX, particularly when creating and extending the environments.

LED Screens
Tim Barter was Union’s on-set VFX supervisor during the shoot in Budapest, where 11 giant (70m by 90m) wraparound LED screens were used to project some of the film’s fantastical environments virtually during filming.

These “inky” sky and ocean environments were created referencing the work of artist Chris Parks. The team created CG ocean simulations and renders designed to work as 50-second clips at 24K; 11 accompanying digital matte painted skies with added cloud movement; and additional stylistic, moving atmospheric effects.

Poor Things

LED backdrops gave the actors something to act against that wasn’t a greenscreen. This approach also provided beautiful reflections and a more impressive final result.

Miniatures and Environments
The decision to use miniatures was deliberate in terms of the look of the film, so the VFX environments had to be sensitive to this. Ensuring that the scale of the CG water worked in relation to the scale of the miniature was particularly challenging — especially when combined with live-action footage shot on-set.

The Alexandria environment involved vast establishing shots that pull out as wide as possible. The Union team used lidar scans of the miniatures as a starting point for this fully CG environment, which includes CG water, sky and palm trees as well as a fully CG boat, a CG cable car and a lot of FX simulation to enhance the atmosphere, including dust and chimney smoke.

The London environment was shot with an 8mm lens, so it was particularly stylized. The movement in the sky had to reflect the ocean, with water displacement and undulation. Tower Bridge was created as a miniature, and the London rooftops referenced 1950s filmmaking, so the team added various period signs of life to match the look of the film, including chimney smoke elements and fireworks. This environment also required creating CG zeppelins.

The Paris environment was shot as part of the studio build in Budapest and then enhanced in VFX by adding a more surreal CG sky and associated elements.

The Lisbon environment was also shot with an 8mm lens. The set was extended in CG, again with very stylized skies and the film’s signature surreal look. The character Alfie’s mansion was also a miniature within a fully CG environment. It required a huge amount of garden detail, such as covering shrubs and greenery, to create the gritty outside space.

Hybrid Animals
Another less than ordinary element of the Poor Things world is the hybrid animals, created by the doctor’s experimentation. These quirky creatures are present throughout the film, wandering around the grounds like barnyard creatures while reflecting the 1920s look of the film’s movement and cinematography.

There are seven different hybrids in the final film, but the VFX process involved creating many more before the final seven were selected. The director wanted as much of an in-camera, 2D-based solution as possible to embrace the random physical nuances of animal movements that are inherently difficult to capture in CG. Union’s solution was to overshoot, coming back with multiple takes and multiple animals and then testing different combinations to see which animals and moments worked well when combined.

Poor Things

“It started with a series of test shoots with an animal trainer, which narrowed down the selection prior to the second unit shoot, as some animals just didn’t want to behave at all,” says Hughes. “When it came to combining them, some proved more difficult than others due to a combination of their independent movements, the camera moves and distorted lenses.

“There was a significant degree of rebuild, and some CG was used to help with the joins. And 3D scans of the animals were used to help us align the different elements and create the textures and scarring where they join together,” he continues. “The scar designs were based on paint-over concepts to preserve the naturalistic movement of the real animals while still creating a more fantastical layer of ‘strangeness’ in keeping with the film’s tone.”

There’s also a Frankenstein moment in the film, when Bella is brought back to life by the doctor. It required a large amount of power, electricity, lightning treatments and sparks. These were complex and involved a lengthy process to ensure they were sympathetic with the look of the film.

Union used Foundry Nuke for 2D work (compositing and more) and Autodesk Maya and SideFX Houdini for 3D-based work on the film.

The Marvels

Rising Sun Pictures’ VFX Destroy and Rebuild for The Marvels

For The Marvels, a new superhero offering from Marvel Studios, Australia’s Rising Sun Pictures was tasked with creating a vast city on the planet Hala, home to the Kree Empire and its artificial intelligence ruler, the Supreme Intelligence.

Artists created representations of the futuristic city both at the height of its power and as a devastated ruin. The studio also produced the film’s opening and closing sequences showing the implosion and rebirth of Hala’s sun.

The sequel to the 2019 blockbuster Captain Marvel, The Marvels, directed by Nia DaCosta, marks the return of Carol Danvers’ Captain Marvel (Brie Larson), who has reclaimed her identity from the tyrannical Kree and taken revenge on the Supreme Intelligence. Rising Sun Pictures worked under production VFX supervisor Tara DeMarco.

First appearing in a flashback, Hala’s capital is a densely packed urban environment of towering skyscrapers and elegant homes arrayed along a crystal-blue ocean. “The skyline suggests Manhattan as it might look in some distant future,” says Rising Sun Pictures VFX supervisor Jamie Macdougall. “It’s filled with beautiful architecture and surrounded by lush forest.”

The studio developed the look of the city from concept art provided by the production. “We extrapolated on the drawings to produce a cityscape of thousands of buildings,” notes CG supervisor Prema Paetsch. “Our first task was to define the city’s superstructure and then fill it with a logical distribution of office buildings, elevated roadways, residential structures and landmarks. We also added human-scale detail such as doors and windows, rooftop gardens and trees. The challenge was to sell the size and scope of the city and to make it stylistically consistent without repeating patterns.”

“It’s one of the largest environments we’ve ever built,” notes Macdougall. “And it has enough detail to be viewed from any camera angle or perspective. It’s seen in flyovers. The camera also drops down to street level so that you can look into individual homes and offices and see things like lighting fixtures and furniture.”

Artists used particle effects to further bring the city to life. “We designed systems that could be attached to tiny vehicles to make them move along roads and skyways in a logical manner,” notes comp supervisor Neill Barrack. “A similar technique was used to make birds fly gracefully past camera.”

The Marvels

The flashback ends with Captain Marvel destroying the city, including an immense green building housing the Supreme Intelligence. “We see her attacking the Supreme Intelligence, which appears as a giant, anthropomorphic computer,” Macdougall explains. “It explodes, with the blast spreading across the city. The next time we see the city, it’s a smoldering ruin. There is no water. The atmosphere has turned to poisonous smog. Its sun is dying.”

Artists added subtleties to suggest that the city has decayed over several decades. “Buildings are weathered, grimy and dirty,” says Paetsch. “What used to be clear glass is smudged. Metal objects have rusted. We added skeleton trees to rooftop gardens, withered plants to balconies. Smoke lingers in the air. Everything is dark and gloomy.”

Paetsch adds that working with an asset so large was challenging. They addressed that issue by making the city modular. “We had a large team working on the environment together,” he explains. “We managed the load by distributing subassets and substructures to individual artists across several departments. The environments team focused on the procedural, rule-based design, while the assets team focused on bespoke hero structures that are seen close-up and needed very specific designs. We ultimately had hundreds of subassets that could be checked out as modules and checked back in to the master system. The core of it all was a distribution logic that placed individual structures into the larger expanse in a defined order.”
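
To make that distribution logic concrete, here is a toy Python sketch that places a small library of sub-assets onto city blocks in a defined order (landmarks first, then offices, then housing), with a seeded shuffle to avoid obvious repetition. The asset names, quotas and rules are invented for illustration and are not Rising Sun Pictures’ system.

```python
# Toy sketch of rule-based city distribution: checked-in sub-assets are placed
# onto blocks in a defined order, hero landmarks first, then fill categories.
# Entirely illustrative; names and rules are made up for the example.
import random
from dataclasses import dataclass

@dataclass
class SubAsset:
    name: str
    kind: str       # "landmark", "office" or "residential"
    footprint: int  # number of blocks the asset occupies

LIBRARY = [
    SubAsset("hero_tower", "landmark", 4),
    SubAsset("office_cluster_a", "office", 2),
    SubAsset("office_cluster_b", "office", 2),
    SubAsset("residential_row_a", "residential", 1),
    SubAsset("residential_row_b", "residential", 1),
]

def distribute(blocks=64, seed=7):
    """Fill city blocks category by category, in a defined placement order."""
    rng = random.Random(seed)
    # Placement order and the share of the city each category receives.
    quotas = [("landmark", 0.10), ("office", 0.40), ("residential", 0.50)]
    placements, cursor = [], 0
    for kind, share in quotas:
        pool = [a for a in LIBRARY if a.kind == kind]
        budget = int(blocks * share)
        while budget > 0 and pool and cursor < blocks:
            asset = rng.choice(pool)
            if asset.footprint > budget:
                break
            placements.append((cursor, asset.name))
            cursor += asset.footprint
            budget -= asset.footprint
            if kind == "landmark":   # hero structures are placed only once
                pool.remove(asset)
    return placements

for block_index, asset_name in distribute():
    print(f"block {block_index:02d}: {asset_name}")
```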

Near the end of the film, Captain Marvel uses her powers to restore the city to its former splendor. “Massive winds blow through, bringing fresh air and pushing out the smog. Water is pumped in,” states Macdougall. “The buildings are still destroyed, but it’s evident the planet is on the mend.”

Equally impressive are the solar collapse and restoration that bookend the film. Macdougall notes that the production had a science advisor who provided insight into how stars die. “Quite a lot of thought went into how it should happen,” he recalls. “As the star dies and loses its fuel, it grows bigger and bigger before gravity kicks in, and it implodes. This process occurs over vast time frames, but since the imminent death of the sun is an important story point, our task was to take this concept and imagine it in a way that conveyed urgency. It provides weight and drama to Captain Marvel’s mission.”

At the end of the film, the process is reversed. Captain Marvel uses her expanded powers to restart the sun. Paetsch says that it was important that this spectacular transformation also appear convincing. “It took a lot of conceptual work and experimentation with different approaches to three-dimensional simulations,” he recalls. “There were multiple layers, complex details and structures within structures, all of which are moving.”

The Marvels

He adds that they also had to integrate Captain Marvel into the scene. “She disappears into the sun, which is heated to millions of degrees, and our job was to make the audience believe that she is causing its regeneration to happen,” notes Barrack. “You see her energy beams emanating through the gaps and rippling across the solar surface as the crumbling structure fixes itself. It becomes smooth and beautifully bright. The team did a marvelous job in creating something that has never been seen before.”

The restoration of the sun gives way to a climactic view of the Hala capital once again bathed in light. “Hero shots like these are a wonderful opportunity for our team to shine,” concludes Macdougall. “Both the cityscape and the solar sequences were massive in size and scope. They challenged our ability to solve problems and gave us a chance to flex our creative muscles. The results look fantastic.”

Doctor Who

Untold Creates CG Meep for Doctor Who Anniversary Special

BAFTA- and Emmy-nominated VFX and production company Untold Studios has worked with Bad Wolf to help bring BBC’s 60th anniversary series of Doctor Who to life.

Doctor Who’s enduring popularity over six decades has been sustained by its ability to reinvent itself while remaining true to its core values of adventure, imagination and timeless storytelling. The 60th anniversary series continues this tradition with three special episodes that will leave Whovians of all generations on the edge of their seats.

Untold’s VFX team contributed 330 shots for the episode “The Star Beast,” with a focus on bringing to life the famous and enigmatic Meep character.

Doctor Who

Untold Studios VFX supervisor Tom Raynor explains, “It was a total privilege to take on the first of the Doctor Who specials, kickstarting a new season of the iconic TV franchise and the eagerly anticipated return of David Tennant. The brief was all-encompassing, calling for a plethora of visual effects and SFX, from complex character work, CG environment builds and set extensions to massive battle scenes and highly complex FX sequences. This special is an example of costume and SFX being seamlessly integrated with high-end VFX in a sensitive and impactful way.

“The Meep is an iconic character, well-known by fans of the early comic books. Because this was the character’s first screen appearance, it was extremely important that we got this right. The character posed a unique set of challenges, as it required both a full-CG approach for some shots and an augmented 2D approach to an actor in a costume for other shots.”

For the augmented 2D approach, Untold used the tracked geometry of its CG asset to drive a 2D spline warp/ST map-driven facial rig in Nuke. The rig had sliders to dial in or enhance a range of different facial expressions and emotional states and could modify the mouth performance to accurately match lipsync. Untold’s compositors used this rig in almost every Meep shot in the first half of the episode. Once The Meep turns evil, the team used a CG digital replica of the character, complete with fully simulated muscle/fat and hair systems.

Untold does most of its asset-building and rigging in Maya, with ZBrush and Substance for sculpting and texturing. Look development and asset rendering then happen in Houdini. Untold considered using USD and Karma for rendering at the start of this project but settled on Arnold due to time constraints. The studio has developed a collection of Arnold shading nodes and utilities for fur over the past few years, which made it particularly appealing for creating The Meep.

The studio thought it was important to respect the aesthetic and charm of the Doctor Who franchise along with the artistry of the costume-makers and SFX artists. Untold’s creators took great care to match The Meep’s range of movement to what was achievable for the costume to ensure a consistent look and a seamless, invisible transition between full-CG shots and augmented costume shots. Using a CG digidouble made it possible to greatly increase the emotional range of The Meep during the later sequences. It also allowed the Untold team to do things like dilate pupils, articulate finger and toe movements, and make The Meep run, which wasn’t practical with an actor crouched down in a costume. Artists paid careful attention to the flow of every part of the groom, the subtle pigmentation changes of the fur and how it bent and flexed as it moved.

Untold Studios uses an ACES color pipeline, and the team worked at and delivered the final picture at 4K. All of the plates were shot on an ARRI Alexa Mini except for a handful of drone plates.


VFX Supervisor Glen Pratt on Barbie’s Visual Effects Workflow

By Iain Blair

To be Barbie in Barbie Land is to be a perfect being in a perfect place. Unless you have a full-on existential crisis and suddenly develop flat feet and bad breath and end up traveling to the real world to find some answers.

That’s the clever setup for the biggest blockbuster of the summer — and now the biggest movie on the awards circuit thanks to its nine Golden Globe noms. Helmed by Oscar-nominated writer/director Greta Gerwig (Little Women, Lady Bird) and starring Margot Robbie and Ryan Gosling, Barbie is a joyful celebration of girl power that showcases truly awesome visual effects overseen by production VFX supervisor Glen Pratt at Framestore.

Barbie

Glen Pratt

The film’s virtual production, previz, postviz and techviz were all done by Framestore’s London-based preproduction services team, led by Kaya Jabar. Final VFX were courtesy of VFX supervisor Francois Dumoulin and his Framestore team in Montreal.

I spoke with Pratt (Paddington 2, Beauty and the Beast) about the challenges and workflow.

How many VFX shots are there overall?
Around 1,600 shots were worked on during post. The final film had 1,300 in the end, which is a lot.

What were the big VFX challenges?
The biggest was dealing with Barbie Land, and making sure we kept the strong aesthetic that had been designed. We had many conversations with Greta about her desire for this to look as if Barbie Land is contained on a soundstage. For instance, the old classics like Singin’ in the Rain or The Wizard of Oz… their production values were a bit other-worldly, and they had a magic to them, and Greta wanted to make sure that feeling ran through the whole film. So working with her, DP Rodrigo Prieto (ASC, AMC) and production designer Sarah Greenwood, we decided to create the sets with painted backings to them.

Some of those sets were never going to be big enough for what Greta had in mind, so we had to come up with ways to use visual effects to help realize that world in tandem with the production design. A big part of that was to do some virtual scouting with Greta and Rodrigo early on to determine how large these spaces would be, as well as doing bits of visual development work. All that was to help her understand how big the stage should be before it became too big, at which point you’d lose that feeling and quality of it being on a stage.

The other big challenge was taking all that onboard with Greta’s idea of working with miniatures. Any time we were extending the world, we had to make sure it echoed and matched the sets we’d built so people wouldn’t question the visual effects. So we built a language that had consistency all the way through, that matched the production design sets and equally had a miniature feel to it. It always had to reflect the toylike aesthetic, so it was really a big balancing act and a matter of discovering the best language to use to tell the story.

Barbie

Barbie miniatures

You must have done some tests?
Yes, once we had decided the size of the sets and shot some exteriors, we started testing how it would all look. In the tests you could actually see the four walls of the stage, and it was a painted backdrop with the scenery in it — a bit like The Wizard of Oz or The Red Shoes. All that evolved once we got into post, and Greta felt it would look more charming if, rather than it being a painted cyc, it was actually a 3D CGI placement of buildings.

So once all the miniatures were built, they were scanned and captured with photogrammetry, and we then began to build assets from those miniatures and from the actual set itself so that the miniatures and sets married up. There’s both a miniature and a build of the Dreamhouse, so they’re essentially the same. We could take that and expand Barbie Land and make it a bigger world.

How closely did you work with Greta?
Very closely, along with Rodrigo Prieto and Sarah Greenwood as well as the producers… we would all bring ideas to the table. Greta would come to me with ideas, and I’d go off and work on them and then present her with options. It was a learning curve for her, but she was amazing. We’d show her stuff we had developed from what we’d shot, and she was bowled over. It was a real pleasure to work with someone who could see the potential of where it could go. She was so collaborative and positive all the time.

Barbie

Tell us about all the previz, postviz and techviz.
On each Barbie Land set, there’s always the question of “What’s over there?” If we put bluescreen there, what’s going to happen behind it? Before we started shooting, we began all the virtual scouting with Greta using Framestore’s virtual scouting system, Farsight Go, which is an on-set visualization tool that [allows us to preview a live composite of CG set extensions, objects, characters and animations within the physical set].

So, for example, if we’re looking in a certain direction, it very quickly falls off because that’s the end of the stage, but we were able to extend beyond that. And Greta was very mindful of not extending as if it was just the real world but instead keeping it in the style of filmmaking we had discussed. Even though there are VFX there, she wanted it to have that very heightened look and aesthetic of Barbie Land.

From the virtual scouting, we did shots on and around the beach for when Barbie first sees her Dreamhouse and drives downtown; we designed all the shots. Rodrigo chose the lenses so we knew exactly what we would see given the art department’s work on the designs. We figured all that out, and that formed the basis of what we used in the volume stage.

Barbie

Using the volume would give us the complexity of lighting we needed, but it meant there would never be finished pixels. The miniatures are the finished versions of a lot of those buildings, but because of the schedule, we wouldn’t have the finished material because the miniatures were still being built. Thankfully, we could plan the shots in virtual production and then see what areas we really needed to work on so we could be ready in time for the actual shoot in the volume. In post, we did this for Greta’s first director’s screening since pretty much every sequence contained postviz that we’d completed for the screening.

There are over 700 shots of Barbie Land, which are either entirely CG or required a lot of additional CG work to finish the picture.

What did VFX supervisor Francois Dumoulin and his team in Montreal provide?
All of the Barbie Land work. We’d show them key areas of film we’d identified and pick key shots that needed look development. Those shots filtered into other shots around them, which meant we could bring the level of the work up bit by bit and keep showing the shots to Greta.

You also had FuseFX, Chicken Bone and UPP do some VFX work. Break it down for us.
Chicken Bone mainly did a lot of clean-ups that we identified in post, and Fuse did all the Mattel HQ scenes. This included the establishing shots outside, the interiors when Barbie is trying to escape the executives and the boardroom sequence.

UPP did a lot of work on the Venice Beach scenes, adding a lot of sky replacements since it was very overcast on the shoot. They opened up the skies, punched up the blues and added more broken-up clouds. They also did the whole car chase, which used a new electric car that UPP replaced with a CG vehicle — all the shots where you see the blue Chevy — and all the interiors. We shot all the interior car scenes at Leavesden [in the UK]. The actors were in the actual car onstage, and we shot driving plates for the chase route, which we could use in a small-volume LED setup. Then we corrected the perspective and redid the whole environment backgrounds. UPP did all of that. Clear Angle did all the lidar and scanning work.

You’ve worked on a lot of huge projects. Where does this rate in terms of complexity and challenges?
It’s right at the top. Being able to be in New York with Greta at Gloss for all the post was really key for all the VFX, as I could contribute ideas and help craft the final film.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

V-Ray 6 for 3ds Max, Update 2, Adds MaterialX Support for USD

Chaos has launched V-Ray 6 for 3ds Max, Update 2, offering new and enhanced support for some of VFX’s most popular file standards. With new support for MaterialX, artists can now complete their USD pipeline, enabling smooth rendering and shader exchanges across teams and tools. Support for the latest USD for 3ds Max version has also been added, bringing users up to date with the tool’s full capabilities.

This continues a USD/V-Ray development process that began with the release of V-Ray 5 for SideFX Houdini and Autodesk Maya in 2021. Since then, Chaos has embraced the standard, so artists can easily share and update assets while using its raytracing technology.

Speed Updates Include:

  • Chaos Cloud One-Click Submit – Users can now access Chaos Cloud rendering in a click, all without leaving 3ds Max or waiting for a scene to export.
  • Faster .vrscene Exports – Users can get up to 20x faster exports and far smaller file sizes on scenes with immense scattering data. This improvement also accelerates V-Ray GPU’s time to the first pixel.
  • Unified V-Ray Lister for 3ds Max – All lights and cameras can now be managed from a single location with this new Lister update. An enhanced user interface will also help users explore new features for bulk adjustments, filtering, searching and more.

V-Ray GPU Updates:

  • V-Ray Enmesh – Complex geometric patterns can be quickly created over object surfaces to make detailed panels, fences, fabrics and more in an automated and memory-efficient way.
  • Faster Animation Rendering – Image sequences can now be rendered up to 4x faster, accelerating everything from AEC marketing to high-end visual effects, as V-Ray GPU now caches bitmaps between frames.

More Realistic Scattering:

  • Groups Hierarchy – In Chaos Scatter, you can now distribute multiple objects in a group or hierarchy with just a couple of clicks.
  • Altitude Variance – Users can now control object distribution based on surface height, helping them achieve natural altitude variances for their vegetation.
  • Scatter Orientation – The orientation of scattered items can be adjusted by rotating away or toward a selected object for added realism and control.

V-Ray Frame Buffer (VFB) Improvements:

  • Chromatic Aberration – This new post effect helps artists explore distinctive looks and color fringing directly within the VFB, without a third-party application.
  • Custom Background Color – Color corrections now include a solid background option, removing the need to load an image.

Lighting:

  • IES Light Models in Chaos Cosmos – Chaos’ ever-expanding library of ready-to-render assets now helps artists import 3D light models, including IES profiles, in a few clicks. Interiors can now be illuminated faster than ever before.

Pricing and Availability

V-Ray 6 for 3ds Max, Update 2 is available for 3ds Max versions 2019 – 2024. All V-Ray subscription plans work for all supported host applications, including 3ds Max, Cinema 4D, Houdini, Maya, Nuke, Revit, Rhino, SketchUp and Unreal.


Maxon’s Redshift and Cinebench Optimized for Apple’s M3 Chips

Maxon has updated Redshift and Cinebench 2024 to take advantage of the latest developments in Apple’s M3 family of chips. Thanks to these updates, artists can see greater performance when generating photorealistic imagery using Maxon’s Redshift render engine. Maxon’s Cinebench 2024 benchmarking tool, built on Redshift and Maxon’s Cinema 4D, is also now fully optimized for M3 and available as a free download.

Apple’s new GPU architecture features hardware-accelerated raytracing and offers substantial speed increases when rendering with Redshift.

Cinebench 2024.1 is available to download immediately from Maxon.net for anyone to benchmark performance on macOS and Windows hardware.

Optimizations will be available as part of regular Redshift updates for Maxon One and Redshift subscribers in the coming months.

In addition to Redshift, Cinebench and Cinema 4D, Maxon also makes Forger, Red Giant and ZBrush.


Foundry Ships Nuke 15.0, Intros Katana 7.0 and Mari 7.0 

Foundry has released Nuke 15.0 and will be releasing Katana 7.0 and Mari 7.0. This coordinated approach, says the company, offers better support for upgrading to current production standards and brings enhancements for artists, including faster workflows and increased performance.

According to Foundry, updates to Nuke result in faster creative iteration thanks to native Apple silicon, offering up to 20% faster processing speeds. In addition, training speeds in Nuke’s CopyCat machine learning tool have been boosted by up to 2x.

Mari 7.0’s new baking tools will help artists create geometry-based maps at speed without the need for a separate application. USD updates in Katana 7.0 will minimize the friction and disruption of switching between applications, enabling a more intuitive and efficient creative experience.

Foundry’s new releases support standards across the industry, including compliance with VFX Reference Platform 2023. Foundry is currently testing its upcoming releases on Rocky 9.1 and on matching versions of Alma and RHEL.

Foundry is offering dual releases of Nuke and Katana, enabling clients to use the latest features in production immediately, while testing their pipelines against the latest Linux releases. Nuke 15.0 is shipping with Nuke 14.1, and Katana 7.0 will release along with Katana 6.5. These dual releases offer nearly identical feature sets but with different VFX Reference Platform support.

Foundry is also introducing a tech preview of OpenAssetIO in Nuke 15.0 and 14.1 to support pipeline integration efforts and streamline workflows. Managed by the Academy Software Foundation, OpenAssetIO is an open-source interoperability standard for tools and content management systems that will simplify asset and version management, making it easier for artists to locate and identify the assets they require.

Summary of New Nuke Features:

  • Native Apple silicon support — Up to 20% faster general processing speeds and GPU-enabled ML tools, including CopyCat, in Nuke 15.0.
  • Faster CopyCat training — With new distributed training, it’s faster to share the load across multiple machines using standard render farm applications, and image resolution can be compressed to reduce file sizes, for up to 2x faster training.
  • USD-based 3D system improvements (beta) — Improvements include a completely new viewer selection experience with dedicated 3D toolbar and two-tier selections, a newly updated GeoMerge node, updated ScanlineRender2, a new Scene Graph pop-up in the mask knob, plus USD updated to version 23.05.
  • Multi-pixel Blink effects in the timeline — Only in Nuke Studio and Hiero. Users can apply and view Blink effects, such as LensDistortion and Denoise, at the timeline level, so there’s no need to go back and forth between the timeline and comp environments.
  • OCIO version 2.2 — Adds support for OCIO configs to be used directly in a project in Nuke 15.0.

What’s Coming in Katana:

  • USD scene manipulation — Building on the same underlying architecture as Nuke’s new 3D system, Katana will have the pipeline flexibility that comes with USD 23.05.
  • Multi-threaded Live Rendering — With Live Rendering now multi-threaded and compatible with Foresight+, artists can benefit from improved performance and user experience.
  • Optimized Geolib3-MT Runtime — New caching strategies prevent memory bloat and minimize downtime, ensuring the render will fit on the farm.

What’s Coming in Mari:

  • New baking tools — They cut out the need for a separate application or plugin, so users can create geometry-based maps including curvatures and occlusions with ease and speed.
  • Texturing content — With new Python Examples and more procedural nodes, users can access an additional 60 grunge maps, courtesy of Mari expert Johnny Fehr.
  • Automatic project backups — With regular autosaving, users can revert to any previously saved state, either locally or across a network.
  • Upgraded USD workflows — Reducing pipeline friction, the USD importer is now more artist-friendly, plus Mari now supports USD 23.05.
  • Shader updates — Shaders for both Chaos Group’s V-Ray 6 and Autodesk’s Arnold Standard Surface have been updated, ensuring what users see in Mari is reflected in the final render.
  • Licensing improvements — Team licensing is now available, enabling organization admins to manage the usage of licenses for Mari.

Nuke Trial Extension
With slates and projects being paused across the industry, Foundry is extending its free Nuke 15.0 trial from 30 to 90 days for a limited period.

OPPENHEIMER

Oppenheimer VFX Supervisors Talk Explosive Visual Effects

By Iain Blair

Directed by Christopher Nolan and based on the book “American Prometheus,” Oppenheimer tells the story of theoretical physicist J. Robert Oppenheimer and follows the work of his team of scientists during the Manhattan Project, leading to the development of the atomic bomb. It stars Cillian Murphy as Oppenheimer.

It also features VFX depicting everything from the first atomic test in the New Mexico desert to physical phenomena ranging from subatomic particles to exploding stars and black holes. As the film’s sole VFX partner, Dneg delivered over 100 shots, crafted from more than 400 practically shot elements, to help create some of the film’s most important and explosive sequences. Oscar-winning production VFX supervisor Andrew Jackson and Dneg VFX supervisor Giacomo Mineo led the team. Here they talk about creating the VFX and how they did it.

Oppenheimer

Andrew Jackson

What were the big challenges of creating the VFX for this?
Andrew Jackson: One of the biggest challenges, which was also one of the most rewarding aspects of the work, was the set of creative rules that we imposed on the project. We wanted all of the images on the screen to be generated from real photography, shot on film and preferably in IMAX. The process involved shooting an extensive library of elements. The final shots ranged from using the raw elements as shot, through to complex composites of multiple filmed elements. This process of constraining the creative process forces you to dig deeper to find solutions that are often more interesting than if there were no limits.

OPPENHEIMER

Giacomo Mineo

Giacomo Mineo: The movie presented two significant challenges. First, the recreation of the Trinity test and second, the fascinating task of immersing ourselves in Oppenheimer’s mind and figuring out how to capture his ideas and imagination, considering the limited knowledge and visual references available during that era. Concepts like modern physics or the Earth seen from space were relatively new at the time. To truly portray Oppenheimer’s mindset, we had to let go of our modern understanding and delve into his world.

How closely did you work with Chris Nolan? What guidance and input did you get?
Jackson: This is my third film with Chris, and during that time, I have developed a strong understanding of his filmmaking philosophy. His approach to effects is very similar to mine in that we don’t see a clear divide between VFX and SFX and believe that if something can be filmed, it will always bring more richness and depth to the work.

I feel he has a level of trust in my approach to the work, and I really appreciate the freedom he gives me to experiment with ideas and the collaborative approach we take as we refine solutions for the individual shots. As well as having the creative experience from years of working with Chris, Dneg also has a huge benefit when it comes to solving the technical challenges of working with IMAX resolution in a largely optical production pipeline.

Mineo: Chris Nolan was well aware that, without the use of CG, we had a limited set of options. He was flexible and receptive throughout the entire process of exploration and image creation. Whenever he discovered intriguing elements within our tests, he was swift to integrate them into the film and see if he could make them work. This was a really positive and rewarding part of the collaboration.

Also, Andrew played a pivotal role by working closely alongside us, providing the right framework and great creative guidance and bringing his experience and vision to the team.

Is it true Nolan didn’t want to use any CGI? Was that liberating or constraining?
Mineo: This was the challenge set forth by Christopher Nolan, and we embraced it. His vision was that every on-screen image should originate from authentic photography captured on film, preferably in the IMAX format. In pursuit of this goal, we employed traditional techniques such as miniatures, a range of explosions (from massive to micro), thermite fire, long-exposure shots and many more.

The majority of our VFX work revolved around these tangible elements, intentionally avoiding CGI and primarily relying on compositional treatments. This unconventional approach, characterized by self-imposed limitations, had a profound influence on the image-creation process. These constraints compelled us to think innovatively, leading us to creative outcomes that were both distinct and captivating while remaining undeniably rooted in reality.

How many VFX shots are there?
Mineo: There are around 100 VFX shots in the film, plus around another 100 shots that were directly extracted from the vast IMAX library of elements created by Andrew.

How did you recreate the nuclear tests and show the scale of an atomic blast? Break it down for us.
Mineo: For the Trinity test sequence, the goal was to craft an authentically real and awe-inspiring depiction. To achieve this, Andrew and SFX supervisor Scott Fisher embarked on an extensive shoot, capturing a wide spectrum of explosions using IMAX technology. The range included grand-scale detonations featuring various lenses as well as smaller-scale and even underwater detonations. Notably, the billowing dust from the ground and the shockwaves were achieved using small or macro-scale elements.

At Dneg, fully aware of the significance of the task, we began exploring various options right from the first day. We maintained an ongoing dialogue, frequently presenting our preliminary tests to Andrew and Chris Nolan. While archived footage served as inspiration, we allowed for a degree of interpretation, focusing on capturing the essence of the event rather than an exact recreation.

One example is the plasma ball atomic test featured in the high-speed archive footage. To achieve that, we used underwater micro explosions combined with a massive explosion. Subsequently, extensive compositing work was undertaken to seamlessly integrate the elements. Special credit goes to Jay Murray for recreating this iconic moment.

How did you create the crackly rings of fire that Oppenheimer kept visualizing?
Jackson: I built a contraption with multiple spheres, each spinning on different vibrating arcs. These were shot with very long exposures to create the curved, wavy lines.

How did you create elements such as subatomic particles, exploding stars and black holes forming?
Mineo: In the preliminary phase, Andrew dedicated months solely to experimentation at Scott Fisher’s workshop in LA. Armed with his digital camera, he captured a range of tests and presented them to Chris Nolan for review. These tests encompassed a mix of old-style techniques, including miniatures, minute explosions, thermite fire, spinning beads and much more.

Upon Chris Nolan’s approval, the production transitioned to filming in IMAX format. The outcome was a compilation of hundreds of distinct elements. While some seamlessly aligned with the script’s narrative and found their way into the edit, many others contributed to a vast library of elements. Subsequently, for portions of the script still awaiting attention, we embarked on exploring these recorded elements, aiming to complete the work exclusively with this material.

Throughout this process, we discovered that simplicity often yielded the most effective results. However, for instances like the chain reaction or implosion/explosion shots, we employed a diverse assortment of elements, always mindful of preserving the raw authenticity of the footage. Our goal was to maintain the sensation of genuine photography captured on film.

What was the toughest sequence to deal with and why?
Mineo: Without a doubt, the Trinity test was one of the most significant challenges. This was arguably the most complex task of the show, demanding great attention to detail in terms of compositing. It encompassed a range of elements, from massive explosions made in collaboration with Scott Fisher to relatively small and macro elements shot at the highest frame rate IMAX allows.

Examples include tiny, sandy shockwaves and underwater churning dust, to name just a couple. It is worth noting that in the film, the depiction of the explosion from various perspectives was realized through a combination of techniques, including the retiming of practical large-scale explosions and the intricate process of compositing extensive practical elements together. The “wall of fire” shot, designed by Peter Howlett, is an example of that brilliant type of work.

What gear did you use? Any new or cutting-edge systems and methods?
Jackson: We used old-style practical effects methods and techniques, perhaps “cutting-edge” in their era.

You’ve both worked on a lot of huge projects. Where does this rate in terms of complexity and challenges?
Jackson: The approach for this film was so unique that it’s difficult to compare with other projects. Some of the biggest challenges for me were during the shoot. We had a very small IMAX film unit working alongside the main unit in New Mexico, in the winter, in a tent. We needed to move the whole setup every few days, and the weather conditions were less than ideal — we were dealing with snow, freezing water tanks, mud, wind and rain.

Mineo: Undoubtedly, this project stands as the most distinct and extraordinary endeavor I’ve been a part of. We were in full creative mode from the beginning to the end. All the work relied on us embracing the set of rules imposed from the beginning and thinking out of the box.

Our days were really spent looking at the elements, in constant exploration, trying to find something interesting that could be utilized in the movie. The process of continually experimenting with ideas involved creating numerous images, with some eventually making the final cut and others not. A notable illustration of this creative approach is the “birth of the star” shot, designed by Ashley Mohabir. In this instance, we combined a thermite fire element, slowed it down and merged it with a starfield look element coming from the underwater metal particles shoot. The outcome was a striking image that resembled stars and the cosmos.

Images Courtesy of Dneg © Universal Pictures. 


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Emmys: Wednesday’s VFX Supervisor and Producer

The Netflix series Wednesday is a modern take on the classic TV show The Addams Family. It stars Jenna Ortega as Wednesday Addams, who, while attending Nevermore Academy, attempts to master her emerging psychic ability, stop a killing spree and solve the mystery that embroiled her parents 25 years ago. Setting the series’ tone, the first four episodes were directed by filmmaker Tim Burton.

Wednesday

Tom Turnbull

Wednesday was nominated for 12 Emmy Awards, including one for Burton for directing and one for visual effects. VFX supervisor Tom Turnbull and VFX producer Kent Johnson are two of the team members recognized for that work. We spoke to them about the show and its visual effects.

How many shots did each episode have, typically? 
Kent Johnson: They ranged from as many as 311 to as few as 97, with an average of 184 shots per episode.

Tom Turnbull: Yes. Roughly 300 shots per episode, which is not super high, but working within a budget and schedule, we deliberately focused our resources on shots that count rather than shot count. If a shot did not move the story forward, support the characters or provide emotional impact, we did not do it.

What are some of the key VFX in the series?
Turnbull: Thing was the effect that was most important and that most interested me going in. I knew, done right, Thing would be incredibly popular and critical to the character of Wednesday. He needed to be as perfect as we could make him.

Kent Johnson

Nevermore Academy was also critical. Only partially existing as a location, it needed to be created digitally such that it would be accepted as fully real. The creature work, of course, was key and very tricky to pull off on a television budget and schedule.

Johnson: I agree. The most prominent VFX in the series was likely Thing, the disembodied hand, who was usually a 2D effect removing the actor but often a completely 3D CG character. Other effects included set extensions of a castle in the Carpathian Mountains of Romania to turn it into Nevermore Academy; the CG creatures, including the Hyde monster, Enid’s werewolf, piranhas and Kent the siren/merman; and spectacular particle effects in the appearance and destruction of the villain Crackstone.

Did you use virtual production or real-time VFX?
Turnbull: We did not entertain real-time VFX on the show. There was a strong feeling that we needed to ground the look of the show in real-world locations and filmmaking, and that virtual production would not be in keeping with our aesthetic. We did consider virtual production for some driving sequences, but logistically during the height of the pandemic, it was very difficult to orchestrate from Romania. We did use real-time rendering for Nevermore previs with Unreal, allowing us to explore its layout and design and to quickly design shots.

How many different vendors do you use, and what is the turnaround time like? Is it like a traditional television schedule or do you have more time?
Turnbull: There were four main vendors who worked on the complex hero effects and about six or seven secondary vendors who provided support. One of the great things about working with the Netflix model of releasing an entire season on one day is that it allows time to really work the material for early episodes. There is less pressure on hitting a date and some flexibility of dropping in upgraded effects after the mix and color are complete.

Episodes 101 and 102 effectively had eight months to complete, which you would never get on a traditional TV schedule.  This pays dividends in developing looks and procedures that can be applied to later episodes that have a much shorter delivery.   I hope never to see a TV schedule again.  The streaming model provides better creative opportunities.

Johnson: In the end, we relied on 11 different VFX houses. The delivery schedule varied wildly. Some of the more complex sequences took as much as five months from turnover to final delivery while simpler effects were knocked out in days or weeks. Although the turnovers and deliveries were highly fluid, we gave each episode a VFX production schedule of about 100 days in our planning of post.

What about the pipeline? Can you describe it?
Turnbull: On the production side, we managed our workflow with FileMaker databases and spreadsheets. We took the approach that we needed to provide post with as much data and reference as possible, scanning sets, performers and props along with a vast number of digital stills. Managing that much data is a task unto itself. During post we kept a shadow edit on Resolve to manage and assess shot work in context.

Production shot on an ARRI Alexa LF with Signature primes.

Johnson: When we had a locked edit, the editorial department would provide the vendor a QuickTime of the VFX shots in context as a reference. The vendor would then order EXRs of the relevant frames, plus 24-frame handles, from the post facility using an automated process. The EXRs were posted to the specific vendor’s Aspera account for download. When the visual effects were approved as final by all of the stakeholders, the final EXRs were sent from the vendors to the colorist for final grading.
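For readers curious what that kind of automated pull looks like in practice, here is a minimal Python sketch of the idea: take a shot’s cut range from editorial, pad it by 24-frame handles on each side and list the per-frame EXR paths a vendor would request. The shot name, plate path and helper functions are hypothetical illustrations, not the production’s actual tooling.

    # Hypothetical sketch of an automated EXR pull with 24-frame handles.
    # Shot name, plate root and function names are illustrative only.
    HANDLES = 24  # frames added to each side of the editorial cut

    def frame_range_with_handles(cut_in, cut_out, handles=HANDLES):
        """Return the first and last frame to pull, including handles."""
        return cut_in - handles, cut_out + handles

    def exr_paths(shot, cut_in, cut_out, plate_root="/plates"):
        """Yield per-frame EXR paths for a shot, padded by handles."""
        start, end = frame_range_with_handles(cut_in, cut_out)
        for frame in range(start, end + 1):
            yield f"{plate_root}/{shot}/{shot}.{frame:04d}.exr"

    # Example: a shot cut at frames 1001-1060 becomes a 977-1084 pull.
    for path in list(exr_paths("WED_108_0420", 1001, 1060))[:3]:
        print(path)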

What were the biggest challenges this season? 
Turnbull: To me, the biggest challenge of the season was the sheer volume and variety of the VFX work required. I had worked with Miles and Al before and was familiar with the density and scope of what they put down on the page. It was common to get to page 10 of a script and already be well over what would be considered normal for episodic, both for plot and visual effects. There was no singular effect that I did not have confidence in delivering; it was the number of different effects necessary to tell the story.

There were very few days where VFX was not on-set doing some kind of major effect. It made for a very high-energy, dynamic filming situation, which fortunately, I enjoy. We put a lot of effort into managing the volume of work and in collaboration with Tim and the showrunners, Miles and Al, managed to refine it to its essence. If we had not done that we would have been significantly over budget and the show would have suffered for it.  Less is more as they say.

Johnson: Nevermore Academy was a complex CG asset that required a great deal of time to design, redesign, adapt, model and tweak from shot to shot. The 3D CG Hyde monster was a new creature from the mind of Tim Burton. It required a few different concept artists approaching it from different artistic sensibilities to land on Tim’s vision and then a great deal of time to make such an outlandish creature appear photoreal in both appearance and movement.

What were the tools that you used, and why did you choose them for this project? 
Johnson: As the VFX producer, the tools that I personally used were Adobe Acrobat for scripts, Adobe Photoshop to sketch on and annotate tech scout stills, Microsoft Excel and FileMaker Pro for budgeting, DaVinci Resolve to edit Thing’s rehearsals and Adobe’s Frame.io to view previsualizations of virtual drone shots around Nevermore Academy. I’m very pleased that Tim Burton chose to use a physical miniature of the Addams Family house for a flashback where Wednesday buries her deceased pet scorpion in the family’s pet cemetery. Our vendors used Nuke, Maya, ZBrush, Houdini, Adobe After Effects and other software tools.

What was it about this particular episode that made it worthy of Emmy consideration?
Turnbull: Variety. Episode 108 has a bit of everything in it: Thing, Nevermore, Hyde and the Enid werewolf, along with a host of new effects surrounding Crackstone’s incarnation and demise. It also features complex creature work, including transformations, much more than any of the previous episodes.

We were also under considerable time pressure to wrap principal photography, and it was a minor miracle that we managed to get it in the can. A lot of the visual effects work was, as a result, created entirely in post. No one within the Academy voting membership will ever know or appreciate the team effort that went into that, but I do. The biggest achievements are often the ones that are not noticeable.

Johnson: The episode submitted was the Season 1 finale. With almost 300 shots, it showcased all the best VFX of Wednesday, including Thing, Nevermore Academy and the fight between two fully CG creatures, the Hyde monster and Enid’s werewolf. It also included the battle between Wednesday and her nemesis, the pilgrim Crackstone, who is brought back from the dead and ultimately destroyed with complex dynamic particle effects, and finally the poetic defeat of Christina Ricci’s character by a swarm of animated bees controlled by telekinesis.

Haymaker Uses Unreal on CG Spot for Yaamava’ Casino

Viewers are 20 seconds into the 30-second Nature of Discovery before it is revealed that it is an ad for the Yaamava’ Resort & Casino in San Bernardino, California. The commercial begins in a mystical-looking yet photoreal forest that gives way to the Badlands. The flowers are hearts and clovers, and then a redwood becomes a tower of poker chips without fanfare. A stream leads to a waterfall, and viewers see the reels of a slot machine spinning behind the water. Hints of a casino are everywhere.

Created by Vitro, with production and post by Haymaker VFX, Nature of Discovery was created with a combination of Unreal Engine and traditional render methods.

Haymaker creative director Magnus Engsfors says the obvious challenge was to find material that matched the very specific criteria the client had, such as specific geographic locations, plants and animal species. “We knew right from the start that we would not be able to find these shots, especially not with the right camera angles, lighting and mood. The solution was to create these shots entirely in CG instead. That gave us total control of all the parameters — like camera angles, lighting, reveal effects and more. It also allowed us to involve the agency a lot more, giving them the ability to art-direct not just environments like forests and deserts but individual elements such as individual trees, plants and flowers in a way that would not have been possible if we had relied solely on live-action footage.”

Beyond the creative solutions, Haymaker tried something new in terms of workflow. “Using Unreal, we could make quick changes to lighting and layout and have almost instant updates,” says lead Unreal artist Henrik Skymne. “The quick update speed allowed for a lot more experimentation and fine-tuning. We are not a big studio, which allows us to have artists try out new roles, workflows and software, making sure we are always up to date on the latest techniques as well as very flexible so we can jump on many different types of challenges. The artists are almost all seniors with a great knowledge base, and that makes us efficient in what we do.”

With many shots both created and rendered in Unreal, Engsfors says, “We really challenged ourselves when choosing this path, as the end results had to look photorealistic, and the work created in Unreal also had to work together and blend seamlessly with simulations and other effects created in other software.”

In addition to Unreal, the team also used Autodesk Maya for the animation and rendering of flowers and butterflies, and SideFX Houdini for particle and fluid simulations.

Reflecting on the finished product, Engsfors adds, “It is obvious that tools like Unreal Engine are part of the future of our business, and we will continue to integrate real-time tools in the future. It is already amazing to be able to provide services such as real-time environment creation among other things. We are excited to be at the forefront of technology when it comes to combining ‘traditional’ VFX work with real-time-based solutions. It opens up a whole world of possibilities for working smarter and more effectively together with agencies and clients.”