DP Isaac Bauman on Going Old-School for Loki Season 2

By Randi Altman

The Disney+ series Loki, based on the Marvel character, is back with six new episodes. In this season, Loki and the Time Variance Authority are searching for Sylvie, Ravonna Renslayer and Miss Minutes. It once again stars Tom Hiddleston as the god of mischief.

DP Isaac Bauman joined the Loki team in Season 2, following up on the work of Season 1 director Kate Herron and DP Autumn Durald Arkapaw, ASC. The cinematographer, whose credits span television, film and commercials, shot five of the six episodes.

We reached out to him to find out more…

What was it like coming onto a successful series in its second season? Did you follow the look of the first or develop your own? A bit of both?
We reinvented the look of Loki for its second season. Fortunately, we had Season 1 production designer Kasra Farahani and costume designer Christine Wada returning to continue their incredible work, which established a much-needed degree of aesthetic continuity for us to shake things up around.

Season 1 director Kate Herron and cinematographer Autumn Durald Arkapaw knocked it out of the park, so we knew we had a lot to live up to. The thing is, we realized what made their work on the first season so special was how much of their own voices they brought to it.

For long-time admirers of Autumn’s work (such as myself), it was immediately visible in the cinematography that this was her voice — no one else’s. And that’s what made the cinematography feel so fresh and exciting. Autumn did her thing, and we were all the better for it. But to continue with the rightfully acclaimed approach from the first season would have done a disservice to the show.

I’m not Autumn, and in fact, I’d say our individual bodies of work display remarkably little stylistic overlap. We are very, very different artists with entirely different interests and preferences.

Same deal with Justin Benson’s and Aaron Moorhead’s philosophy as directors. They’ve developed their own voice, their own approach to the craft. It’s theirs, and it has very little overlap with any other filmmakers I’m aware of.

To step into Loki and imitate the work of our predecessors felt like it would be a mistake. When we decided that certain things had to change, we realized that everything had to change. We had to develop an entirely new approach from the ground up.

At every step of the process, we had the support and feedback of our open-minded, brilliant executive producer Kevin Wright, as well as the whole gang at Marvel HQ. We felt 100% supported, and we are so deeply grateful for that.

How would you describe the look of this season, and how did the showrunners tell you what they wanted?
Tasteful, mature, elegant, organic and immersive. I developed the look alongside lead directors and executive producers Justin Benson and Aaron Moorhead. They’ve made half a dozen indie films and have developed a very refined style together. They came in knowing a lot of what they wanted to do this season — specifically in regard to camera movement and framing — and I brought a lot of the lighting ideas into play. It was a true collaboration.

We switched from “studio mode” filmmaking (dolly, crane, remote head, Steadicam, etc.) to doc-style hand-held photography. I’ve traditionally been a studio-mode guy myself, but Justin and Aaron love hand-held and wanted the sense of immediacy, naturalism and immersiveness that you can only ever get from hand-held. They were 100% right, and I love what the raw, energetic, hand-held approach brings to the often polished world of the MCU. We also did a lot of zooms for a ’60s/’70s thriller aesthetic and to break up the hand-held work.

We changed the aspect ratio from 2.39 to 2.20. In my mind, 2.39 is not wide — it’s actually narrow — and those thick black bars on the top and bottom of the screen feel like wasted canvas. 2.20 maintains that feeling of “wide” cinematic scope while allowing for 10% more vertical compositional space.
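
For a fixed image width, that vertical gain is easy to check with a couple of lines of arithmetic (a quick sketch; the example frame width is arbitrary):

    # Vertical canvas gained by moving from 2.39:1 to 2.20:1 at a fixed width
    width = 4096                # pixels; any width gives the same ratio
    h_239 = width / 2.39
    h_220 = width / 2.20
    print(f"{h_220 / h_239 - 1:.1%} more vertical space")   # ~8.6%, roughly Bauman's 10%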

Another thing we changed was eliminating the use of colorful lighting. You’ll notice in the first season that there are many sequences — usually at least one in each episode — that are lit with saturated, colorful sources. In our season, there are none. The idea was to homogenize the palette — really limit the range of looks and the tools used — to create more of a feeling of aesthetic cohesion and discipline.

Did you shoot using the same kit as Season 1?
We switched from Sony Venice to the ARRI Alexa Mini LF. I like the Venice a lot, but the Alexa is still the champ, in my opinion. There is something about how it renders movement and motion blur.

We switched from anamorphic to spherical lenses. This is a personal preference. I find that anamorphics look stunningly gorgeous but less immersive and immediate than the more matter-of-fact spherical optics.

We switched from top-of-the-line Panavision lenses to what are essentially prosumer-oriented budget lenses: the Tokina Cinema Vistas. We tested just about every set of large-format spherical lenses Panavision and ARRI Rental London possessed, and the Tokinas best fit the look we were after, even after a lot of incredulous double-checking. The proof is in the pudding.

We switched from the longer focal lengths that are inherent to anamorphic cinematography to using (very) wide lenses almost exclusively. These cameras got right up in our actors’ faces all day, every day. Cheers to the cast for using matte box/tape eyelines and never complaining about it.

Did you have a DIT? If so, how did that help?
Jay Patel was our DIT, and he was an enormous help. Because of the complicated VFX pipeline, it was necessary to limit ourselves to a single LUT. That would’ve been tricky without Jay. Using CDLs, we made little tweaks to scenes and individual shots all the time. It’s important to go into editorial with the most final-looking image possible, and Jay and I worked as best we could to deliver that.
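
The CDL itself is a published, deliberately simple formula, which is what makes those on-set trims portable into finishing. A minimal sketch of the standard ASC CDL math (the example slope/offset values are invented):

    import numpy as np

    def apply_cdl(rgb, slope, offset, power, sat=1.0):
        # Standard ASC CDL: out = (in * slope + offset) ** power, then saturation
        out = np.clip(rgb * slope + offset, 0.0, None) ** power
        luma = out @ np.array([0.2126, 0.7152, 0.0722])      # Rec.709 luma weights
        return luma[..., None] + sat * (out - luma[..., None])

    shot = np.random.rand(4, 4, 3)                           # stand-in for a frame
    trimmed = apply_cdl(shot, slope=np.array([1.02, 1.0, 0.97]),
                        offset=np.array([0.0, 0.0, -0.005]), power=1.0)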

Did you work with the colorist on the look? What is an example of a note you gave to the colorist?
We worked with Matt Watson, who works full-time as a colorist at Marvel Finishing. He played a huge role in developing the look of the season. First, working together, he and I developed a film emulation LUT. In the first season, they embraced a fresh, cutting-edge aesthetic, whereas we were more interested in a vintage, throwback look — as if Loki was shot on the same film stock as 2001: A Space Odyssey.

We also added pretty heavy in-camera and digital/post-filtration (in addition to the heavy level of haze present on the sets) to make the image feel as soft and smoked-out as we could. Matt contributed to these efforts as well, developing a proprietary diffusion filter for us in Resolve, and it really made the images sing. And we added a hell of a lot of 16mm grain, which Matt massaged into the image, often on a shot-by-shot basis.
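
Watson’s Resolve diffusion filter is proprietary, but the two basic moves described here, blending a blurred copy back over the frame and riding a layer of grain on top, can be sketched like this (all parameters invented for illustration):

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def diffuse(img, radius=6.0, mix=0.25):
        # Crude digital diffusion: mix a blurred copy back over the image
        soft = gaussian_filter(img, sigma=(radius, radius, 0))
        return img * (1 - mix) + soft * mix

    def add_grain(img, strength=0.04, seed=0):
        # Additive monochrome grain, scaled by image level so it rides the picture
        rng = np.random.default_rng(seed)
        g = rng.normal(0.0, strength, img.shape[:2])[..., None]
        return np.clip(img + g * np.sqrt(np.clip(img, 0, 1)), 0, 1)

    frame = np.random.rand(8, 8, 3)      # stand-in for a graded frame
    out = add_grain(diffuse(frame))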

You touched on this a bit earlier, but you chose to go hand-held when Loki goes through timeslips. Why was that the way to go? What view does that give the audience?
The idea behind the hand-held, in general, was to put the audience into the scene. We wanted the photography to be as immersive as possible, and hand-held felt like the best way to achieve that.

What about the lighting? You went old school with tungsten rather than the newer LEDs. Why was that?
In the first season, nearly everything was LED. For example, the iconic Chronomonitor Wing set (the main area where they watch the timeline monitor) was lit with SkyPanels. In Season 2, we switched to ARRI Arrilite 2000s. Again, this was to achieve a more convincing vintage aesthetic.

There are a lot of reasons — and I could talk about how much I love tungsten all day — but the main reason was the production design. If Kasra’s sets so lovingly evoke a late ‘60s/early ‘70s aesthetic, then why shouldn’t the lighting?

All these older films we love the look of (like 2001) used tungsten, so we did too. You can see the difference in how much warmer the TVA feels this season; it’s very rich and golden.

Can you walk us through the challenging World’s Fair scene?
The World’s Fair was a massive build that took many weeks and a huge construction crew. Due to the size of the set, the scope of the lighting became quite large as well. On shooting days, we had over 100 set lighting technicians with us.

The idea was to key the scene with light motivated by the Ferris wheel. Where the Ferris wheel would be added later on in post, we had an array of Wendy lights — an old-school, very powerful, multi-headed tungsten unit. You can think of a Wendy as the big, industrial-sized brother of a Maxi Brute. We had a large array of those — large enough to accurately emulate a source as large as a 130-foot-tall Ferris wheel. It was suspended from a large construction crane, with the bottom side roughly 60 feet in the air. That provided the key and a feel of directionality that carried down the entire depth of the approximately 180-foot-long set.

For fill light, we had three 40-foot by 40-foot overhead softboxes equipped with Vortex8 LED units suspended from construction cranes out over and along the center of the set. There were 20- to 30-foot gaps between the softboxes, but on camera they feel very much like a continuous source.

Around the outside of the set, we had 12 by 20 “letterbox” softboxes on Manitou telehandlers pointed down into the set at about 45 degrees. We used these for edges or to dig fill in at a lower angle where necessary.

There were also more than a thousand practical bare tungsten bulbs built into the set, which provided some fill. And there was the need for an insane amount of distro.

Loki features a number of visual effects. How did that affect your shooting, if at all?
Generally, VFX sequences are fairly easy from a DP perspective. You frame the shot you want, you light it the way you want, you add interactive lighting effects as instructed by the VFX supervisor, and you never let the characters drift off the bluescreen. That’s about it.

The most involved part is planning the lighting. If you’re shooting a bluescreen sequence and only have a very rough understanding of what the VFX world will look like eventually, it is essentially on the DP to determine what the lighting should be. Often in VFX sequences, the DP will be indecisive or want to allow VFX the most flexibility to determine lighting, so that’s why you get a lot of these CGI set pieces looking so flat, gray and wishy-washy.

The trick is to really plant your flag and make strong, decisive choices about the direction, intensity and color of the light in your VFX sequence. Fortunately, I had the full support and, more than that, the encouragement of VFX supervisor Chris Townsend in that approach.

What was the most challenging part of the series for you?
The biggest challenge on a production this size is maintaining consistency in the look. We were shooting dozens of sets on a half dozen stages, working with a large ensemble cast and a crew of hundreds over the course of 18 weeks. Trying to unify the look of the show so footage shot on day 1 cuts seamlessly with something shot on day 90, while keeping a strong, unique sense of style… that’s the challenge.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 25 years. 

Color and Collaboration on Disney’s The Little Mermaid

By Alyssa Heater 

The Little Mermaid, Disney’s latest live-action adaptation of the animated classic, uses color to convey a “sense of reality” in both underwater and above-water worlds. Michael Hatzer, supervising digital colorist/VP of creative color finishing at Picture Shop, was enlisted to grade the feature, and he told postPerspective how the process relied on two simultaneous workflows to achieve the end result.

Hatzer’s relationship with cinematographer Dion Beebe, ACS, ASC, spans over 20 years, with the two first working together on Equilibrium in 2002. Since then, the two have collaborated on multiple films including Gangster Squad and Mary Poppins Returns. Explains Hatzer, “Dion is fantastic to collaborate with. He creates such beautiful images, which provide a great canvas for us both to build upon.”

Beebe also has a longstanding relationship with director Rob Marshall, dating back to Memoirs of a Geisha, for which he won the Academy Award for Best Achievement in Cinematography in 2006. The two brought Hatzer in to grade Mary Poppins Returns, then tapped him again for The Little Mermaid, this time with the team working out of Picture Shop’s New York facility.

How did this collaboration work? Hatzer got involved early in the preproduction stage to help establish looks. After Beebe selected his camera equipment and lens packages, in this case the ARRI Alexa 65 with Hawk65 anamorphic lenses, he shot a multitude of tests: lens, camera and on-screen, followed by hair and makeup. Beebe and Hatzer then set looks, with Hatzer creating LUTs for the various settings — day exterior, night exterior, on-set, off-set — so the dailies would look as close as possible to the final results of the DI. From there, Beebe presented the tests to Marshall and his team for final approval.

Two Simultaneous Workflows
Preproduction on The Little Mermaid began close to four and a half years ago, but due to the pandemic, filming was postponed for a year. When regulations eased, principal photography was captured at Pinewood Studios in London.

In addition to Marshall, Beebe and Hatzer, editor Wyatt Smith, ACE, served as another key collaborator, overseeing the VFX and underwater sequences. Together, they needed to come up with a solution to integrate the above-water scenes into the VFX scenes that took place underwater, which were shot against a bluescreen. They devised two simultaneous workflows to handle the complexity.

Hatzer provided Beebe with two LUTs: one for each workflow. Those LUTs would then go to the VFX house for the VFX team to use to create visual effects aligned to the color. Picture Shop color scientists Josh Pines and Chris Kutcka designed the ACES workflow that streamlined the thousands of visual effects shots in this feature.

“It was a very complicated film, but everyone was so well organized,” says Hatzer. “From the first day of shooting, my team worked in conjunction with the camera and editorial departments. We came into the project very prepared on how to deal with the two different color spaces and the thousands of VFX shots that were coming in hourly during the DI process.”

How Color Sets the Tone
In contrast to the 1989 animated version, the filmmakers wanted the look of this retelling to feel much more realistic. While Hatzer did reference the original to familiarize himself with the story and take note of the saturation, color palette, hair and beyond, he explains that the filmmakers looked elsewhere for inspiration. “We actually referenced David Attenborough documentaries, like Planet Earth.”

Marshall and Beebe were extremely prepared and knew exactly how they wanted each unique world to look. The level of darkness or brightness was motivated by where the characters were in relation to the depth of the ocean. In Ursula’s lair, which was located deep beneath the surface, the saturation and levels of blackness would change. When Ariel goes to the Shipwreck Graveyard and encounters the shark, it again becomes a deeper world. When the mermaids were closer to the surface, it became brighter and more colorful.

“The motivation is that Rob wanted a sense of reality, so as the characters went deeper into the ocean, the light would get darker,” explains Hatzer. “That was one major difference between this and the animated version, which was consistently bright. He wanted to keep everything grounded in a sense of reality, so it didn’t have an overly saturated, cartoonish look.”

Color and VFX
Hatzer credits FilmLight’s Baselight system with helping streamline the color process on this massive VFX undertaking. Because the VFX were constantly being updated, and mattes were not used for the various characters, Baselight enabled Hatzer to draw multiple windows around characters, roto and track them, and bring up the color very subtly.

Through use of Baselight, Hatzer could augment the VFX grade without breaking the VFX. When the VFX shots go out, there are always changes. For example, Sebastian the crab might need to be redder. Rather than sending shots back to VFX, which would delay the process by two days, Hatzer could draw a window and bring up Sebastian to look redder.
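
Conceptually, a tracked window is a soft matte that confines a correction to one region; a toy version of the “make Sebastian redder” fix (a NumPy stand-in, not Baselight’s actual machinery):

    import numpy as np

    def grade_through_matte(img, matte, gain):
        # Apply a per-channel gain only where the (soft) matte is on
        corrected = np.clip(img * gain, 0, 1)
        return img * (1 - matte) + corrected * matte

    frame = np.random.rand(6, 6, 3)
    matte = np.zeros((6, 6, 1))
    matte[2:5, 2:5] = 1.0                     # tracked window; softened in practice
    redder = grade_through_matte(frame, matte, gain=np.array([1.08, 1.0, 1.0]))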

“Working on the Baselight was incredibly indispensable in tackling this huge Disney tentpole movie. It could handle the two different workflows, it allowed me to manage the multiple VFX coming in and the different versions of each VFX, and I could set up different timelines for the editor.”

The New York-Based Color Suite
Marshall and his team prefer to work out of New York on their projects, and The Little Mermaid was no exception. Picture Shop New York managing director Thomas Centrone and director of engineering Ahmed Barbary were critical in optimizing the color suite, which was outfitted with new 2D screens, a removable 3D screen and a Barco 4K projector. Editor Everette Webber and assistant colorist Kevin Schneider joined Hatzer in New York for two and a half months to work on the film.

Picture Shop colorist Alex Durie helped with the HDR/SDR home video passes, and the team brought in colorist Jonah Braun for help on the RealD 3D elements. Together, they tackled multiple color passes, including the normal 2D xenon pass and a normal projection 3D pass. They regrouped at the Dolby Theater headquarters in Midtown Manhattan for the 2D Dolby laser pass then returned to LA for the 3D Dolby laser pass on the Disney lot.

When asked what made this a memorable experience, Hatzer says, “For me, it’s the personal interaction and the creative process of working with these amazing filmmakers. Rob Marshall and Dion Beebe are such a pleasure to work with, and they’re true professionals. It’s always great to see the process from the very first day grading your first shot all the way to the end. It’s like watching your child grow. Every day, it’s getting stronger and more robust and becoming more of a finished product. We’re all really satisfied with the overall look of the film and had a fantastic time working together on it.”


Alyssa Heater is a writer working in the entertainment industry. When not writing, you can find her front row at heavy metal shows or remodeling her cabin in the San Gabriel Mountains.

Avatar: The Way of Water Colorist Tashi Trieu on Making the Grade

By Randi Altman

Working in post finishing for 10 years, colorist Tashi Trieu also has an extensive background in compositing as well as digital and film photography. He uses all of these talents while working on feature films (Bombshell), spots (Coke Zero) and episodics (Titans). One of his most recent jobs was as colorist on the long-awaited Avatar follow-up, Avatar: The Way of Water, which has been nominated for a Best Picture Oscar.

We reached out to Trieu, who has a long relationship with director James Cameron’s production company Lightstorm Entertainment, to learn more about how he got involved in the production and his workflow.

We know James Cameron has been working on this for years, but how early did you get involved on the film, and how did that help?
I was loosely involved in preproduction after we finished Alita [produced by Cameron and Jon Landau] in early 2019. I was the DI editor on that film. I looked at early stereo tests with the DP Russell Carpenter [ASC], and I was blown away by the level of precision and specificity of those tests.

Polarized reflections are a real challenge in stereo as they result in different brightnesses and textures between the eyes that degrade the stereo effect. I remember them testing multiple swatches of black paint to find the one that retained the least polarization. I had never been a part of such detailed camera tests before.

What were some initial directions that you got from DP Russell Carpenter and director Cameron? What did they say about how they wanted the look to feel?
Jim impressed on me the importance of everything feeling “real.” The first film was photographic and evoked reality, but this had to truly embody it photorealistically.

Was there a look book? How do you prefer a director or DP to share their looks for films?
They didn’t share a look book with me on this one. By the time I came onboard (October 2021), WetaFX was deep into their work. For any given scene, there was usually a key shot that really shone and perfectly embodied the look and intention Jim was going for, and that often served as my reference. I needed to give everything else that extra little push to elevate it to that level.

Did they want to replicate the original or make it slightly different? The first one takes place mostly in the rain forest, but this one is mostly in water. Any particular challenges that went along with this?
Now that the technology has progressed to a point where physically based lighting, subsurface scattering and realistic hair and water simulations are possible on a scale as big as this movie, the attention to photorealism is even more precise. We worked a lot on selling the underwater scenes in color grading. It’s important that the water feel like a realistic volume.

People on Earth haven’t been to Pandora, but a lot of people have put their head underwater here at home. Even in the clearest Caribbean water, there is diffusion, scattering and spectral filtering that occur. We specifically graded deeper water bluer and milked out murkier surface conditions when it felt right to sell that this is a real, living place.

This was done just using basic grading tools, like lift and gamma, to give the water a bit of a murky wash.
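
In code terms, lift raises the blacks and gamma bends the midtones; one common formulation, with invented blue-biased values standing in for that murky wash:

    import numpy as np

    def lift_gamma(img, lift=0.0, gamma=1.0):
        # Lift raises blacks toward the lift color; gamma > 1 brightens mids
        out = np.clip(img + lift * (1.0 - img), 0, 1)
        return out ** (1.0 / gamma)

    frame = np.random.rand(4, 4, 3)
    murky = lift_gamma(frame,
                       lift=np.array([0.01, 0.03, 0.05]),    # milky, blue-green blacks
                       gamma=np.array([1.0, 1.05, 1.10]))    # lifted blue/green mids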

The film was also almost entirely a visual effect. How did you work with these shots?
We had a really organized and predictable pipeline for receiving, finalizing and grading every shot in the DI. For as complex and daunting as a film like this can be, it was very homogeneous in process. It had to be, otherwise it could quickly devolve into chaos.

Every VFX shot came with embedded mattes, which was an incredible luxury that allowed me to produce lightning-fast results. I’d often combine character mattes with simple geometric windows and keys to rapidly get to a place that in pure live-action photography would have required much more detailed rotoscoping and tracking, which is only made more difficult in stereo 3D.

Did you create “on-set” LUTs? If so, how similar were those to the final look?
I took WetaFX’s lead on this one. They were much closer to the film early on than I was and spent years developing the pipeline for it. Their LUT was pretty simple: a matrix from S-Gamut3.Cine to something a little wider than P3 to avoid oversaturation, and a simple S-curve.
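
A “matrix plus S-curve” LUT really is only a few lines of math; this sketch uses an invented widening matrix and curve purely to show the shape of the transform:

    import numpy as np

    # Illustrative gamut-widening matrix (rows sum to 1 so white stays white)
    M = np.array([[ 1.05, -0.03, -0.02],
                  [-0.02,  1.04, -0.02],
                  [-0.01, -0.03,  1.04]])

    def s_curve(x, pivot=0.18, contrast=1.4):
        # Filmic-ish S-curve that maps the mid-gray pivot to 0.5
        return 1.0 / (1.0 + (pivot / np.maximum(x, 1e-6)) ** contrast)

    def show_lut(rgb):
        return s_curve(np.clip(rgb @ M.T, 0, None))

    print(show_lut(np.array([0.18, 0.18, 0.18])))   # mid-gray lands at ~0.5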

Usually that’s all you need, and any scene-specific characteristics can be dialed in through production design, CGI lighting and shaders or grading. I prefer a simpler approach like this for most films — particularly on VFX films, rather than an involved film-emulation process that can work 90% of the time but might feel too restrictive at times.

WetaFX built the base LUT and from there I made several trims and modifications for various 3D light-levels and Dolby Cinema grades.

Where were you based while working on the film, and what system did you use? Any tools in that system come in particularly handy on this one?
I’m normally in Los Angeles, but for this project I moved to Wellington, New Zealand, for six months. Park Road Post was our home base, and they were amazing hosts.

I used Blackmagic DaVinci Resolve 18 for the film. No third-party plugins, just out-of-the-box stuff. Resolve’s built-in ResolveFX tools keep getting more and more powerful, and I used them a lot on this film. Resolve’s Python API was also a big part of our workflow; it streamlined shot ingest and added a lot of little quality-of-life improvements specific to our pipeline.
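
Resolve’s scripting documentation describes the entry point used below; the paths and the idea of a day-roll ingest script are placeholders, but the calls follow the documented API:

    # Run inside Resolve's scripting environment (DaVinciResolveScript ships with Resolve)
    import DaVinciResolveScript as dvr_script

    resolve = dvr_script.scriptapp("Resolve")
    project = resolve.GetProjectManager().GetCurrentProject()
    media_pool = project.GetMediaPool()

    # Hypothetical shot-ingest step: pull a folder of renders into the current bin
    clips = media_pool.ImportMedia(["/mnt/shots/day_042/"])
    print(f"Ingested {len(clips)} clips into {project.GetName()}")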

How did your workflow differ, if at all, from a traditionally shot film?
Most 3D movies are conversions from 2D sources. In that workflow, you’re spending the majority of your time on the 2D version and then maybe a week at the end doing a trim grade for 3D.

On a natively 3D movie that is 3D in both live-action and visual effects production, the 3D is given the proper level of attention that really makes it shine. When people come out of the theater saying they loved the 3D, or that they “don’t” have a headache from the 3D and they’re surprised by that, it’s because it’s been meticulously designed for years to be that good.

In grading the film, we do it the opposite way the conversion films do. We start in 3D and are in 3D most of the time. Our primary version was Dolby Cinema 3D 14fL in 1.85:1 aspect ratio. That way we’re seeing the biggest image on the brightest screen. Our grading decisions are influenced by the 3D and done completely in that context. Then later, we’d derive 2D versions and make any trims we felt necessary.

This film can be viewed in a few different ways. How did your process work in terms of the variety of deliverables?
We started with a primary grading version, Dolby Cinema 3D 14fL. Once that was dialed in and the bulk of the creative grading work was done, I’d produce a 3.5fL version for general exhibition. That version is challenging, but incredibly important. A lot of theaters out there aren’t that bright, and we still owe those audiences an incredible experience.

As a colorist, it’s always a wonderful luxury to have brilliant dynamic range at your fingertips, but the creative constraint of 3.5fL can be pretty rewarding. It’s tough, but when you make it work it’s a bit of an accomplishment. Once I have those anchors on either end of the spectrum, I can quickly derive intermediate light levels for other formats.
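
Those two anchors are exactly two stops apart, which makes intermediate light levels straightforward to place; a sketch of the arithmetic only, since the trims themselves are creative decisions:

    import math

    hi, lo = 14.0, 3.5                 # fL: Dolby Cinema vs. general exhibition
    print(math.log2(hi / lo))          # 2.0 stops between the anchors
    print(math.log2(6.0 / lo))         # a 6fL version sits ~0.78 stops above 3.5fL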

The film was released in both 1.85:1 and 2.39:1, depending on each individual theater’s screen size and shape to give the most impact. On natively cinema-scope screens, we’d give them the 2.39:1 version so they would have the biggest and best image that can be projected in that theater. This meant that from acquisition through VFX and into the DI, multiple aspect ratios had to be kept in mind.

Jim composed for both simultaneously while filming virtual cameras as well as live action.

But there’s no one-size-fits-all way to do that, so Jim did a lot of reframing in the DI to optimize each of the two formats for both story and aesthetic composition. Once I had those two key light-levels and framing for the two aspect ratios, I built out the various permutations of the two, ultimately resulting in 11 simultaneous theatrical picture masters that we delivered to distribution to become DCPs.

Finally, what was the most memorable part of working on Avatar: The Way of Water from a work perspective?
Grading the teaser trailer back in April and seeing that go live was really incredible. It was like a sleeping giant awoke and announced, “I’m back” and everybody around the world and on the internet went nuts for it.

It was incredibly rewarding to return to LA and take friends and family to see the movie in packed theaters with excited audiences. It was an amazing way for me to celebrate after a long stint of challenging work and a return to movie theaters post-pandemic.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 25 years. 

Disney and Avid Team on Marvel Cinematic Universe Edit Library

Disney Studios collaborated with Avid to transform the offline Marvel Cinematic Universe (MCU) editorial archive into a real-time editorial library accessible by current productions and marketing teams. The system is a prime example of implementing key elements of the MovieLabs 2030 Vision to deliver an innovative archive experience with a secure, cloud-based search capability that streamlines production and marketing workflows.

The MCU-EL (Marvel Cinematic Universe – Editorial Library) is built on a private cloud available to authorized Marvel Studios users around the globe. The solution customizes and extends existing applications to bring years of editorial footage, representing the vast Marvel Studios cinematic canon, into a searchable library for production users. The new editorial library turns what was a manual, multi-participant, time-consuming offline search to find reusable assets into a self-service, real-time experience for users with zero manual work by editorial support personnel. It uses custom product extensions to Media Composer and other Avid products developed in a way that allows Avid to productize the solution for the entire industry.

As of this writing, the Marvel Cinematic Universe contains more than 30 different titles, all of which are part of the same universe of characters, themes, locations and materials. Historically, all the material, whether in the final version of the film or not, was sent to archive for potential future use. Editorial and marketing teams working on new titles, tasked with maintaining overall MCU continuity across a growing number of titles, regularly needed to reference or reuse editorial material from previous works.

Getting access to that archived material took up to two weeks, with involvement by production or marketing teams, editorial support teams, archivists and Marvel Studios’ internal Avid engineering support. Even with the process streamlined over time, it still required five working days and involved the production team, one or more editors, a post supervisor and one or two Avid engineers. After each request, all the works were discarded until the next request arrived.

Now a full week of work has been reduced to a self-service capability that is immediately available. Today the system is available to authorized internal teams through a web interface with no downtime and no people required to support individual requests.

The MCU-EL:

– Makes all MCU historical materials available for use in new films and shows on a secure, real-time basis.

– Provides robust search and discovery of editorial library assets directly to authorized users looking to reuse archived materials.

– Automates the manual tasks required to source, prepare and deliver the archival assets.

– Maintains continuity and quality of Marvel Studios films by making better use of historical materials.

– Enables a non-proprietary capability that could be used by the whole industry to support franchise, episodic and other kinds of productions that would benefit from access to reusable libraries.

– Deploys the solution with best-in-class security that meets demanding Marvel Studios standards.

By empowering editors to directly access the MCU-EL, Marvel Studios has sped up new production edits. The system has replaced reliance on group knowledge with a stable searchable solution that provides immediate search results.

The system makes massive volumes of historical material efficiently available to production and marketing teams at Marvel Studios, increasing their ability to maintain quality and continuity across the growing number of MCU titles. Ingesting and tagging a new title takes about one week, and 14 of the 30-plus Marvel Studios titles have been made available in the system so far in under a year of operation.

To build the MCU-EL, the team repurposed or extended existing Avid applications and services, taking advantage of Avid capabilities built for broadcast management of large news libraries. The team worked with Avid to adapt those tools to support the vast volume of clips, subclips and assets available in the MCU archive, as well as the complex web of relationships between them. The MCU-EL system allows people to access every sequence, source clip and subclip in the Marvel Studios archive through a web browser, ingest the material into a central repository, search it remotely, and deliver it back to storage. After finding and selecting media for reuse, an editor simply right-clicks “send to show,” and the media is placed in Media Composer project bins for the identified production.
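
The components involved are proprietary Avid and Marvel systems, but the user-facing flow the article describes reduces to search, select and send. A self-contained sketch with entirely invented names, just to make the shape of the workflow concrete (none of these names are real Avid APIs):

    from dataclasses import dataclass, field

    # Hypothetical sketch of the MCU-EL user flow
    @dataclass
    class Clip:
        name: str
        kind: str                       # "sequence", "source clip" or "subclip"
        tags: list = field(default_factory=list)

    class EditorialLibrary:
        def __init__(self, clips):
            self.clips = clips

        def search(self, tag):
            # Self-service metadata search replacing the old manual archive request
            return [c for c in self.clips if tag in c.tags]

        def send_to_show(self, clips, show):
            # Stand-in for the right-click "send to show" that fills Media Composer bins
            print(f"Placing {len(clips)} items in bins for {show}")

    lib = EditorialLibrary([Clip("tower_ext_014", "subclip", ["Avengers Tower", "exterior"])])
    lib.send_to_show(lib.search("Avengers Tower"), show="Untitled MCU Production")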

The generic MCU-EL capability can be deployed to support other franchises and is available today inside Disney for deployment on properties such as Star Wars.

MCU archive materials are stored centrally in an archival Avid Nexis shared storage system. Users browse and find materials using the standard Avid MediaCentral UI (exposed via a cloud UX interface). Retrieval requests trigger the system to build a remote library with the requested materials. A new capability, called Avid MediaCurrents, manages the transfer of assets and associated metadata from archival storage into the remote library, which then can be managed through MediaCentral Asset Management or transferred to Media Composer working bins through MediaCurrents. The solution maintains metadata preserving the relationships between source materials and subclips so that only the smaller subclips must be transferred from storage for editor search and discovery.

Avid engineers created the new code required for the solution so that Avid could make the generic capability available to all of its customers in the film and television ecosystem.

2030 Principles

The MCU-EL aligns with the MovieLabs 2030 Vision principles in multiple ways:

Principle 1

The MCU-EL demonstrates the benefits of moving assets from slow physical storage systems (e.g., tape) to rapid-access media storage where files can be securely made available to downstream users and processes. This perpetual access to the entire library enables the ongoing curation of the files and rapid access via search, and it also unlocks new technology, such as artificial intelligence-based indexing and optimizations of the library, which were not easily achieved with offline libraries.

Principle 4

The MCU-EL is a perfect example of the intent of Principle 4 – moving from legacy archiving infrastructure to the power of ‘libraries’ of organized, searchable materials from previous Marvel Studios productions, going back more than ten years. The entire system has transformed Marvel Studios’ highly curated and secure offline repository into a cloud utility with access granted on a per-title basis to users in certain roles.

Principle 8

The system incorporates a metadata structure built on a comprehensive ontology that cross-references and interrelates assets to make them searchable for users in new and powerful ways. Assets are organized, described, and interrelated using the Disney Content Genome ontology, augmented by automated tagging using machine learning. The system demonstrates the advantages of a creative and robust search capability with a standardized ontology and metadata structure. In addition, the system couples qualitative content data with technical data about asset formats, dates and times of capture, and relationships and linkages between assets, allowing efficient recovery and repurposing of the assets without extensive offline research.

Principle 10

The MCU-EL implements an entirely new self-service, real-time workflow that completely replaces a time-consuming and resource-intensive process that previously required multiple manual iterations to respond to user feedback. This demonstrates the intent behind Principle 10 – that new technologies can modify processes that took weeks and transform them into real-time interactions enabling creatives to make rapid decisions and iterate in real time.

DP and Colorist on Disney+ Docuseries Growing Up

New York’s Nice Shoes provided post finishing for Growing Up, the new hybrid docudrama series from creators Brie Larson and Culture House, now streaming on Disney+. Senior colorist Sal Malfitano collaborated with cinematographer Christine Ng in finalizing the look of the show, which tells the true stories of young people taking on the challenges, triumphs and complexities of adolescence. Nice Shoes ultimately delivered final color in both Dolby Vision and SDR for each episode.

Each of the 10 half-hour episodes tells the coming-of-age story of a single individual “hero.” The stories are told through interviews, archival media and creative dramatizations of pivotal moments in their past.

The show’s emotional, unique storytelling style is what drew Ng to the project. “The opportunity to help visualize the struggles of these young people was exciting,” she says. “And the show’s format presented interesting creative opportunities, especially through its use of magical realism. Making those moments feel emotional, evocative and visually immersive was demanding and fun.”

Ng captured most of the show with ARRI Alexa Mini and Amira cameras but shot certain flashback scenes with a Super 8mm film camera to give them a different textural quality. She worked with production designer Emmeline Wilks-Dupoise and lighting designer Alexa Mignon Harris to develop textured color schemes and moody lighting to enhance moments of magical realism and dreamy interviews.

“We used color to explore different themes for each character,” she explains. “We employed golden tones to convey hope and darker, red tones when subjects were going through hard times. Interview segments were shot against a circular backdrop that gave them a dreamy effect, as if we were entering the person’s subconscious. The color palette was peachy and soft.”

That sophisticated color was refined during post grading sessions at Nice Shoes. Ng recalls that Malfitano, who worked on Blackmagic DaVinci Resolve, immediately understood her aesthetic. “I shared the look book I created when I pitched the show with Sal so that he got the vibe,” she recalls. “We then talked through the show, episode by episode, and broke it down into its components. The cinema verité parts have a modern, clean, digital look while the recreations are more colorful and saturated.”

Malfitano says that the varying color treatments help guide the viewer through the show’s changes in perspective and shifts in time. “We used color in subtle ways to distinguish the narrative scenes from the memory flashbacks,” he explains. “We left the interview segments flatter and less contrasty.”

He adds that the grading sessions were intense and detailed but also very rewarding. “It was essential that I was live grading with Christine in the room,” he says. “There were so many formats and styles of visual storytelling that we needed to work through each scene together. We weren’t starting from scratch, but we looked for every opportunity to make it better.”

Ng says that she is thrilled with the show’s finished look, and she credits that to the emotional investment she and Malfitano brought to it. “Sal has kids, and he looked at these stories from a parent’s point of view,” she notes. “He could relate to what these young people were going through. He has an empathic lens. That’s the same way I feel when I pick up a camera. I want the viewer to feel they are immersed in the story.”

Andor Editor/Producer John Gilroy Talks Post Workflow

By Iain Blair

Andor, the new live-action Disney+ Star Wars series, is a gritty thriller rife with political intrigue and high stakes – along with plenty of hoverbikes, laser guns and other impressive visual effects. Set five years before Rogue One: A Star Wars Story, the prequel explores a new perspective from the Star Wars galaxy and focuses on Cassian Andor’s journey to become a rebel hero.

John Gilroy at the series premiere

Editor John Gilroy, ACE, whose credits include Michael Clayton, Pacific Rim, The Bourne Legacy and Rogue One: A Star Wars Story, edited four episodes and was a co-producer on the series.

I spoke with Gilroy about the editing challenges and the post workflow on the show.

How closely did you work with your brothers — creator/showrunner Tony Gilroy and Dan Gilroy, who wrote several episodes? It’s like a family business!
You’re right. I’ve cut every project Tony and Dan have ever done, and I also cut for my dad, the late Frank Gilroy. The only person I haven’t cut for is my mother, who is 94, so it’s not too late. Maybe one day she’ll get a directing gig, and I’ll cut it.

(Laughs) For me, it’s so great working with both my brothers. There’s a similarity and sensibility we share, and a shorthand. There’s a lot that doesn’t have to be said.

What were the main challenges of this show, especially in terms of post and VFX?
Just like with a feature film, we began post and all the VFX right at the start, and it was a very big challenge in terms of being a much bigger canvas and arena. I’ve been on big movies editing for a year, but this was like doing four movies in two years. It was huge, with a lot more people on the post team and a lot more editors – seven in addition to me. You work the same way as a movie, but you delegate more, and there are a lot of moving parts.

Tell us about the workflow.
We wanted to go in sequence for Season 1 because we were trying to find the show, and I was working with Tim Porter [ACE], one of the other editors, but COVID kind of messed us up. I was working in New York with Tim for the first six months of shooting, but at the same time, we had the entire department working at Pinewood in England, where the production was based. They were building these huge sets and were shooting a lot of location stuff.

So you march along, they start the next block, and then you come in with your notes after the director’s cut and make the changes. And you’re always overlapping with one or two or more blocks at a time, so there’s a lot of place-setting.

They put a lot of money into this, but not as much as in a big feature, so the way you make up for that is that the TV schedules are longer. All the VFX shots took a very long time, but we had more time to get it all right. It’s the same thing with the sound. The sound crew is smaller, but all the prep is far more spread out.

What was the show shot on?
The total amount of footage shot in Britain was 275TB of media, and it was mainly shot on the Sony Venice 1 at 4K and with a 6×5 aspect ratio (4096×3432). We used Panavision C-Series anamorphic lenses. For our secondary camera, we used the ARRI Alexa Mini LF. And we used an ACES workflow.

What about the edit?
We cut on Avid Media Composer Version 2018.12.8, and the project format was 1080p/24. The offline codec was DNxHD 115. Editorial storage ran off a 40TB Avid Nexis shared storage unit, and we added another 20TB chassis on completion.

Where was the post done?
Mostly in London, and eventually I moved over there. At the start, my crew were all based at Pinewood, next to the shoot. Then when I moved over, we moved the whole show to Hireworks in Soho. That was our base of operations, and it’s where we did all the post for Rogue One. I actually ended up cutting in the same room I cut Rogue One in, which was great.

Then we did all the sound at Skywalker Sound in Northern California, and we had two sound supervisors: Dave Acord, who was our sound designer and mixer, and Margit Pfeiffer, who did all our ADR and dialogue. She was actually with me in London, which was very helpful. Most of the sound crew was at Skywalker, and that worked out really well. They’d do their work and send us files, which we’d integrate. We mixed all the shows in London, and they’d come over for the sessions at De Lane Lea.

There are a lot of VFX. How closely did you work with ILM VFX supervisor Mohen Leo and his team?
We met them on Rogue One. They did a great job, and it was a very close collaboration. It’s such a joy to work with them as their expectations of their own work are even greater than ours, and that makes our job far easier because we don’t have to continually monitor the process.

What they know about Star Wars’ history and lore is so deep. Basically, they made the show look like a movie, and it all comes down to clarity of vision and knowing exactly what you want to achieve. You don’t get a lot of second chances in TV because of the tight schedules and budget.

The scripts have to be really good and the production has to be really plugged into making a really interesting shooting strategy because you don’t get to do it a million different ways on the day. So in editorial you’re also hugging that initial plan and have to cut fairly quickly. Big movies often get found in post with a lot of changes going on, but there’s none of that here. Everyone’s marching forward in sync.

You must have used a lot of temp VFX?
Oh yeah – a ton. We had a great VFX editor, Liyana Mansor, who was with us from the very start. I don’t know how to do that work myself, but I pull in experts like Liyana, and we used a lot of temp VFX to help us figure stuff out and sometimes to shape the mechanics of a scene. So it was a big part of the whole process.

Tell us more about the DI.
Our colorist was Jean-Clément Soret, and we conformed in FilmLight Baselight with UHD SDR and HDR timelines. Tony, who was in New York, would come over and was very heavily involved in the DI.

How would you sum up the whole experience?
I have a lot of experience and a lot of tricks up my sleeve, and I used all of that on this show. It was a very high standard and a very big production, and I think we really pushed the envelope. We’ll be starting the next season soon, so it’s very exciting.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Rising Sun Provides VFX For Hawkeye‘s Pizza Dog and More

Rising Sun Pictures produced nearly 200 visual effects shots for the new Marvel Studios series Hawkeye, which recently debuted on Disney+. The studio’s work included the reproduction of an actual New York City neighborhood visible outside an apartment where much of the series takes place. Rendered in extraordinary detail, the urban exterior is seen in daytime and nighttime contexts and under varying weather conditions, including a snowstorm.

Based on the Marvel comic, Hawkeye stars Jeremy Renner as Clint Barton/Hawkeye, who, with the help of a young protégée, Kate Bishop (Hailee Steinfeld), confronts enemies from his past who turn up in New York City just before Christmas. The series, which also stars Vera Farmiga, Fra Fee, Tony Dalton, Zahn McClarnon, Brian d’Arcy James and Alaqua Cox, is helmed by Rhys Thomas and the directing duo Bert and Bertie.

Visual effects for the series were led by VFX supervisor Greg Steele and VFX producer David Masure-Bosco. RSP’s team was headed by VFX supervisor Dennis Jones, compositing supervisor Neill Barrack, CG supervisor Kieran Ogden-Brunell, VFX producer Amy Tinker and EP Meredith Meyer-Nichols.

Hawkeye gave RSP the chance to apply its expertise in seamless visual effects. RSP’s principal task was to replicate a section of New York City visible from Kate Bishop’s Aunty Moira’s apartment and place it into the windows of a practical set used in the live-action production. The digital environment comprises 2D and 3D elements and includes dozens of buildings and other landmarks familiar to New Yorkers. “We provided more than 250 degrees of coverage, working from photographic data captured at the practical location,” explains Barrack. “We formed it into a cyclorama where we could place a camera and shoot from any angle and match any movement. We added cars, lights, people and other details appropriate to each scene where it appears.”

The urban backdrop was used in three of the show’s six episodes. Compositors adjusted lighting and added atmospheric effects to match the time of day and weather conditions of specific scenes. “We added snow flurries to all the nighttime shots by adding hundreds of thousands of snowflakes,” says Barrack. “We also added water droplets and condensation to the nighttime windows.”

Intended as an invisible visual effect to viewers, the background environment had to be applied with finesse and attention to detail. “The challenge was to make the cityscape look photographic and consistent throughout,” observes Barrack. “The environment is used many times across multiple episodes, and it has to look the same each time. If two characters are having a conversation and the camera is switching back and forth, the view behind them has to be coherent and consistent.”

RSP was also tasked with an effect related to one of the series’ most beloved characters, the pizza-loving golden retriever Lucky. Artists replaced one of the dog’s eyes to make it appear sewn shut. The challenge here, says Barrack, was in making the digital element feel like a natural part of the real dog. “The dog’s behavior is unpredictable and involves a lot of micromovements,” he explains. “The eye area is of course covered in fur; all that detail has to be reproduced as the dog shakes its head and moves around.”

RSP’s CG department contributed to a scene in which Clint teaches Kate a novel trick for turning on a television by flipping a gold coin. The scene suddenly shifts to extreme slow motion, and the coin flips in bullet time. “We ran the animation at the supplied 96fps and matched it to a background plate that was shot at a high frame rate,” explains Ogden-Brunell. “We worked in slow motion, which gave us the ability to control the animation, then precisely matched the editorial curve. We could bring it back to real time at any point.”
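
Matching an editorial retime curve amounts to sampling the 96fps source at times given by an integrated speed function; a bare-bones nearest-frame sketch with an invented speed ramp:

    def retime(n_out, src_fps=96.0, out_fps=24.0, speed=lambda t: 1.0):
        # Map each output frame to a source frame via an integrated speed curve
        dt, src_t, frames = 1.0 / out_fps, 0.0, []
        for i in range(n_out):
            frames.append(int(round(src_t * src_fps)))
            src_t += speed(i * dt) * dt          # speed(t) = 1.0 means real time
        return frames

    # Bullet time at 10% speed for two seconds, then ramping back to real time
    ramp = lambda t: 0.1 if t < 2.0 else min(1.0, 0.1 + 0.9 * (t - 2.0))
    print(retime(72, speed=ramp))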

Presented with a long shot list and firm deadlines, RSP developed novel ways to work quickly without sacrificing quality or artistic integrity. Barrack’s compositing team, for example, employed a modular workflow that allowed artists to work with the same shots without interfering with each other’s work, and they could quickly generate versions without requiring complete rendering. “That helped us stay ahead of the curve,” recalls Tinker. “We delivered much of the work before deadline. It was all very streamlined.”

The studio used Nuke, Maya, Houdini, Arnold and Katana to create the VFX.

Color and Post Pipeline: The Many Looks of Marvel’s WandaVision

By Randi Altman

If you’ve watched Marvel’s WandaVision, you can imagine that the production and post workflows were challenging, to say the least. The series, streaming on Disney+, features multiple time periods, black-and-white footage, color footage, lots of complex visual effects and the list goes on.

Colorist Matt Watson and the in-house Marvel post team had their hands full and creative hats on. To help with any challenges that might come up, Watson got involved from the very beginning — during camera tests — and that, along with being on set, proved to be invaluable.

“I’ve set up several Marvel shows with a DI toolset and mindset,” reports Watson. “Having that resource on set at the start of a production really helps to find answers to color pipelines and workflow direction, and WandaVision had many challenges.”

WandaVision was a complex show to set up from both a color and pipeline perspective, so we reached out to Watson to find out how he and the Marvel team prepared and tackled the show.

Can you talk about the many post challenges on WandaVision?
Not only was this a show with looks from different eras and with multiple aspect ratios, but it was also Marvel’s first “HDR first” show — meaning HDR monitoring on set, in dailies and finishing — so it was really important to nail down a pipeline and workflow for all departments.

Of course, the success of this kind of undertaking comes down to the communication. Camera, DIT, dailies, editorial, Marvel’s plates lab, VFX and finishing… everyone had particular wants and needs, so being there and having these conversations was incredibly important. Communication is one of those ingredients that aid the success of Marvel productions.

Post supervisor Jen Bergman was great at getting all the teams together and talking through all potential problems. Evan Jacobs, Marvel’s creative finishing supervisor, was there during the setup to offer his vast experience with Marvel productions, as was Mike Maloney, Marvel’s imaging guru. The knowledge pool available for the show setup was inspiring.

How did you work with WandaVision DP Jess Hall (ASC, BSC)?
I spent time with Jess working on dialing in the LUTs for use. We had seven different looks in total, which had to be converted to both HDR and SDR. For me, being there in person really helps streamline and evolve this initial creative starting point, as I can see exactly what Jess was lighting on set and what his intentions were.

The Brady Bunch look

There was one instance testing a Brady Bunch look — Jess was trying to light with a blue ambience to create a color contrast with a warm key, but the first iteration of this LUT did not fully realize the subtleties of the light. So thanks to being there on set and being able to look at the nuances of Jess’ lighting with my own eyes, I was able to retreat to my DI room and dissect the LUT. I added more filmic crosstalk in the tone curves and color gamut, with some deeper, saturated primary colors to further separate the cooler shadows and warmer highlights, something synonymous with 35mm film color reproduction. Jess was happy with the revised LUT, and that’s something that would have been so difficult to dial in had I not been there in person.

From my experience, dialing in the look early is so beneficial, particularly with VFX-heavy shows. VFX will live with these decisions for a long time and fine-tune their work. Any big swings later can risk breaking the compositing work.

It helps a lot to be on the ground in the early stages of production. There are lots of other things I typically get involved with. I tend to help shape the dailies pipeline, in collaboration with Marvel plates lab, and I’ll also try to establish the color grading environment so that the DP and anyone visiting the dailies room will get the best experience with our dailies colorists.

The Family Ties look

As you mentioned earlier, this show has a few different looks. Can you talk about the feel of the modern-day scenes versus the period ones?
Creatively, there’s a warmer palette inside Wanda’s world, where she tries to hold on to her happiness. This is in contrast to the cooler, more tragic reality outside in the “real world,” which is really based on the look and feel of the larger Marvel Cinematic Universe (MCU). Much of that is achieved by design in lighting, production design and costumes.

To further separate them visually, the modern-day scenes maintain a clean, natural feel, leaning into the photography. In contrast, the period looks lean into a stylization and texture, which included varying image degradation, such as grain, defocus, bloom, chroma misalignment, gate weave, etc. We reserved all this image degradation work to the DI so we could dial it in to taste. But again, this was all to complement the already amazing work done in camera. The lighting, lens choice and production design were all so well-researched and recreated on set that we had to make sure we were enhancing what was there in the “negative.”
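
Two of those period artifacts are simple to picture in code: chroma misalignment is a small offset between color channels, and gate weave is a per-frame drift of the whole image. A toy sketch (the real treatments were dialed in to taste in the DI, and a convincing weave would be temporally smoothed rather than random per frame):

    import numpy as np

    def chroma_misalign(img, shift=1):
        # Offset the red channel horizontally, like NTSC-era chroma misregistration
        out = img.copy()
        out[..., 0] = np.roll(img[..., 0], shift, axis=1)
        return out

    def gate_weave(img, frame_idx, amp=2, seed=7):
        # Small pseudo-random frame offset per frame, like film weaving in the gate
        rng = np.random.default_rng(seed + frame_idx)
        dy, dx = rng.integers(-amp, amp + 1, size=2)
        return np.roll(img, (dy, dx), axis=(0, 1))

    frame = np.random.rand(16, 16, 3)
    treated = gate_weave(chroma_misalign(frame), frame_idx=0)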

The classic MCU look

HDR also played a part in the design of these different worlds. HDR is this huge sandbox of brightness and color gamut, so we spent time trying to figure out how much of this HDR canvas we wanted to use for these period looks. To maintain the photographic quality of a 1950s print or an NTSC telecine video transfer from the 1980s, we had to limit the available dynamic range HDR can allow. But for the modern-day MCU material, we could lift the lid and expand on what HDR can offer.

In the end, our period looks were limited to between 150 and 300 nits of peak brightness, and for the MCU look, we settled on 600 nits. Jess felt 600 nits represented a highlight level that still felt filmic but really showcased the photography. Really, all this exploration was vital so that Jess could light successfully in HDR and have complete control and representation of what the final HDR image would look like.
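
One way to hold a look under a chosen peak is a soft roll-off toward that ceiling; a sketch in nits, with the knee point invented for illustration:

    import numpy as np

    def soft_clip_nits(nits, peak, knee_frac=0.75):
        # Pass values below the knee, then roll off asymptotically toward `peak`
        knee = knee_frac * peak
        span = peak - knee
        over = np.maximum(nits - knee, 0.0)
        return np.where(nits <= knee, nits, knee + span * (1 - np.exp(-over / span)))

    scene = np.array([50.0, 200.0, 600.0, 1500.0])
    print(soft_clip_nits(scene, peak=300.0))    # period look: nothing exceeds 300 nits
    print(soft_clip_nits(scene, peak=600.0))    # MCU look: more highlight headroom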

Can you talk about working with the period looks, which resemble classic TV sitcoms?
They are all based on classic TV shows that had been a part of Wanda’s life, shows like The Dick Van Dyke Show, Bewitched, The Brady Bunch, Family Ties, Malcolm in the Middle and Modern Family. But visually for us, they were just guides; we wanted to lift elements of each to help tell Wanda’s story. For some episodes, we leaned more into the degraded quality of the original, but less for others.

The transition to color is complete

For instance, Episode 1 has quite a heavy degradation treatment to really set the audience up in this alternate TV fantasy world. But we also didn’t want the audience to become fatigued from heavy, soft, grainy images, so Episode 3 really pushes the color separation of shows like The Brady Bunch, but with minimal textural degradation. A lot of these decisions were made in DI, when we were able to see multiple episodes back to back and get a sense of how the flow of the series worked.

Most of these decisions were made collaboratively by Jess Hall, Tara DeMarco and Evan Jacobs and presented to our director, Matt Shakman, before finally being presented to the studio. This way, everyone had a chance to offer their opinion and have a balanced, thoroughly considered image.

You were essentially dealing with two different shows in one.
It was more like 10 shows in one! We not only juggled the HDR and SDR elements of each look, but some episodes had multiple looks, with transitions from one to another. To make this work, we used the ACES framework to manage our technical transforms (to HDR and SDR, for instance) and converted all our creative looks to LMTs. This avoided being stuck under a single LUT for an episode and gave us the most flexibility.
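To make the ordering concrete, here is a tiny sketch of the pipeline he describes: creative looks live as LMTs applied in ACES space, and the technical output transform to HDR or SDR comes last. The function names are hypothetical stand-ins, not a real ACES library.

```python
def grade_shot(camera_rgb, idt, lmts, output_transform):
    """Sketch of the ACES ordering described above. `idt` converts
    camera-native footage into ACES; each LMT is a creative look;
    the output transform renders the same look to a given display."""
    aces = idt(camera_rgb)
    for lmt in lmts:          # looks can be stacked or swapped per shot
        aces = lmt(aces)
    return output_transform(aces)

# The same graded scene can then target two deliverables:
#   hdr = grade_shot(raw, alexa_idt, [brady_look], odt_hdr)
#   sdr = grade_shot(raw, alexa_idt, [brady_look], odt_rec709)
# (alexa_idt, brady_look and odt_* are hypothetical stand-ins.)
```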

The Dick Van Dyke Show look

But even with this framework, we still had some very complex shots. Two that spring to mind are the closing shots of Episodes 1 and 3. In Episode 1, we have Wanda and Vision sitting down in their 1950s look as the camera pulls back to reveal our modern day. In Episode 3, we have that classic black-and-white-reveal-into-color shot. Both were fantastic shots, but for them to work, we had pretty complex node structures in our coloring software, Blackmagic DaVinci Resolve. We used mattes to hold out one LMT from one part of the shot while activating another LMT with a second matte, all while carrying degradation in part of the shot. The resulting node graph in our software looked horrifying. But the implementation worked so well and gave us full flexibility.
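Stripped of the Resolve node graph, the matte logic he describes reduces to a weighted mix. Here is a minimal numpy sketch; the LMT and matte arguments are hypothetical stand-ins for the elements described above.

```python
import numpy as np

def blend_looks(aces, lmt_a, lmt_b, matte, degrade=None, degrade_matte=None):
    """Mix two looks across one frame using a matte (1.0 = look A,
    0.0 = look B), optionally layering a degradation pass into the
    region selected by a second matte."""
    m = matte[..., None]
    out = m * lmt_a(aces) + (1.0 - m) * lmt_b(aces)
    if degrade is not None and degrade_matte is not None:
        d = degrade_matte[..., None]
        out = d * degrade(out) + (1.0 - d) * out
    return out
```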

What was WandaVision shot on? Can you talk more about how much of the look was established on set/in dailies?
WandaVision was shot almost entirely on an ARRI Alexa LF. Being on set during the beginning of the shoot gave me the opportunity to work with the DIT, Kyle Spicer, and our dailies colorist, Cory Pennington. We all worked together to settle on which tools and controls to use at the front end so that I could take that work and continue it in the DI.

The Dick Van Dyke Show look

We tried to limit the CDL controls to offset only, with slope when needed, as offset is pretty much the same as printer points in my finishing world. I find that if a CDL is aggressively handled, it’s hard to integrate into a structured color pipeline in the finishing suite, and I end up spending time remaking it. Those guys did a great job. We communicated regularly, and in the end, they laid fantastic groundwork, which meant we could focus on more of the details in the DI.
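The CDL itself is a published ASC standard, which is part of why it travels so cleanly from set to finishing. Slightly simplified (the full spec also defines clamping ranges), the per-channel math looks like this:

```python
import numpy as np

def apply_cdl(rgb, slope=(1, 1, 1), offset=(0, 0, 0), power=(1, 1, 1), sat=1.0):
    """ASC CDL: out = (in * slope + offset) ** power per channel,
    followed by a saturation adjustment around Rec.709 luma.
    Clamping here is simplified to non-negative values."""
    out = rgb * np.array(slope) + np.array(offset)
    out = np.clip(out, 0.0, None) ** np.array(power)
    luma = (out @ np.array([0.2126, 0.7152, 0.0722]))[..., None]
    return luma + sat * (out - luma)
```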

What was your reference for the B&W segments? Assuming it was all shot in color and made B&W in your suite.
Yes, it was all shot in color — it had to be, because a lot of the visual effects needed to take advantage of a color negative for color keys, etc. With a black-and-white negative, that would have turned into a massive rotoscoping exercise.

In Episode 1, the reference for the black-and-white look was The Dick Van Dyke Show. For Episode 2, it was Bewitched. Having the color negative was vital to the look. The production design used the same period colors that were used on The Dick Van Dyke Show. From there, we were able to build two desaturation matrices that mixed the color channels of the negative into the finished monochrome image.
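A desaturation matrix in this sense is a weighted mix of the red, green and blue channels into a single monochrome signal; tuning the weights controls how each period set color renders as a gray tone. The sketch below uses standard Rec.709 luma weights purely as an example; the show’s two matrices would have used their own custom mixes.

```python
import numpy as np

def to_mono(rgb, weights=(0.2126, 0.7152, 0.0722)):
    """Channel-mixer desaturation: a weighted sum of R, G and B.
    Raising the red weight, say, brightens reds in the mono image,
    which is how period colors can be steered to the right gray."""
    mono = rgb @ np.array(weights)
    return np.repeat(mono[..., None], 3, axis=-1)
```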

The Bewitched look

From there, we emulated a print process that included a film tone curve, a warmer D55 white point, film print defocus and film grain. We then added some further telecine/analog video transfer degradation. We went pretty heavy with Episode 1 and much lighter on Episode 2 to separate the two black-and-white eras. Episode 2 had a cooler D60 white point and grain and defocus more akin to an interpositive film print.

How does being in-house at Marvel help in the overall process?
It offers so many advantages to the finishing process. I think the greatest advantage is that we’re working on the show exclusively, so we can dedicate so much time to really developing the look and the color grading, whether it’s at the start in the photography stage, during the edit to explore how the grade can help the storytelling, in conjunction with VFX to find the best solutions to the numerous challenges or, obviously, finishing the show at the end. We can really help the whole process alongside the production, rather than jamming everything in at the end.

Marvel has also developed an incredible internal tool, appropriately called JARVIS, that connects the VFX, editorial and finishing databases. It can perform some incredibly advanced data wrangling and heavy lifting that would ordinarily take a lot of time and manual work to complete.

Bridging two worlds

While that alone sounds cool (at least to me it does), the real benefit is the speed with which we can create and update cuts and sequences. We can have VFX send a list of shots they want to see and seconds later have a timeline built, complete with CDLs, previous VFX versions, underlying main plates, etc. We can go from shot requests to DI in minutes; it’s amazing.

You’ve touched on the VFX a bit, but can you discuss how you worked with that team?
The collaboration between departments on this show was so amazing. Our lead VFX supervisor, Tara DeMarco, had an insane amount of work, and we really wanted to help out where we could. We ended up holding regular grading sessions with the VFX team. They could come to us at a moment’s notice to look at shots in DI and to establish what they needed to work on and what could be left to DI.

Vision VFX

A great example was the balancing of Vision’s skin color throughout the series. Traditionally, this would have fallen to VFX to meticulously go through and balance the color of Vision’s skin while creating the CGI head — a real challenge when multiple VFX vendors were working on Vision. So instead, we had VFX generate mattes for Vision’s head, crown and infinity stone, and we took on that responsibility in the DI. Relieving VFX of that task so they could focus on other, cooler stuff only benefited the show.

It’s sometimes difficult for VFX supervisors and their vendors to interpret and turn around specific color notes. So we would often create “sketches” for the VFX supervisors, a DI grade where they could play with color live to sketch out the direction they wanted their vendor to go. This was particularly true for the Hex wall. We would often review VFX in the DI so we could more fully explore what that story point might look like. We found it very useful for VFX to have their own color previz ability, and it saved valuable time when communicating their intent to the multiple VFX vendors they work with.

Vision VFX

In addition to the traditional challenges on a show of this scope, you worked during COVID as well, correct?
The finish took place during the pandemic, so the entire DI was completed remotely. Marvel sent out calibrated LG C9s to Matt Shakman, Jess Hall and Tara DeMarco, and I have one here accompanying my Sony BVM-HX310. This gave me and the creative teams the flexibility to start or join a DI at any point and the confidence that we were seeing comparable images.

For the bulk of the grade, I would work here at Disney Studios in Burbank, and Jess would join me remotely when he was available. We could have a 5-minute DI or a 5-hour DI, whatever was required at the time. The flexibility was fantastic.

We also ran sessions like this for editorial and VFX. Out of the COVID restrictions has come this great ability to be flexible, which has turned into a fantastic, valuable tool.

What If …?

What have you and the Marvel team been working on recently?
I’ve since finished Marvel’s Loki series. Travis Flynn, our other colorist, finished The Falcon and the Winter Soldier, and we’ve also collectively finished the What If …? animated series.

There’s a lot more coming down the pipe. We’re still a small department, but because of all the efficiencies that have been made internally, we’re able to achieve so much here. Since WandaVision, we can now run live HDR sessions between the dailies hub in Atlanta and our suite here on the Disney lot in Burbank. With a few button clicks, I can be looking in on and advising our dailies colorists in Atlanta, or I can be running sessions from here for filmmakers in Atlanta. The technical tools that are being built and deployed here create more time and paths to be creative. It’s a really exciting future.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

The Falcon and the Winter Soldier Director Kari Skogland Talks Post, VFX

By Iain Blair

Kari Skogland is a prolific and successful director of TV dramas. Her long list of credits includes The Handmaid’s Tale, Boardwalk Empire, The Walking Dead, The Americans and House of Cards.

No wonder Marvel tapped her to direct and EP the globe-trotting action-adventure The Falcon and The Winter Soldier, a six-part Disney+ series that takes place following the events of Avengers: Endgame. With odd couple Sam Wilson (Anthony Mackie) and Bucky Barnes (Sebastian Stan) taking on timely issues, including racism, the result is “an epic, character-driven story,” says Skogland. “We get to go inside these characters and their world in a much more intimate way. If the movies were a snack, this six-hour series is the meal. And it has all of the wonderful things that come with the MCU — action, comedy, a high-octane pace, familiar faces and new characters. It’s incredibly relatable.”

I talked to Skogland about making the series as well as the post and visual effects. Her creative team included DP P.J. Dillon, ASC, ISC; VFX supervisor Eric Leven; and editors Jeffrey Ford, Kelley Dixon, Todd Desrosiers and Rosanne Tan.

This is full of huge action scenes and cinema-level VFX that look like they came straight out of the Captain America films. Did you study up on all that?
Absolutely, and I also had the whole Marvel team. Everyone had worked in that space in one capacity or another and had all this expertise, which really helped with my steep learning curve.

Can you talk about integrating the post and VFX from day one?
I wanted to front-end-load all the VFX shots, so that Eric and his team could get working on them right away. We did a lot of previz and stunt viz with The Third Floor, so everyone was on the same page, and you’re also honing the shots and sequences early on. I love previz because it’s like moving storyboards. I storyboard with a team, and then they’re turned into previz, so we can see what we have and either expand on it or cut back on stuff. Then we integrate the stunt viz and the stunt team’s choreography. At the end of all that, you have a great roadmap, with all the minutiae and detail — and that gives you more freedom to try stuff and discover stuff on the day on set.

We were in lockstep with all the VFX teams, and they were part of every decision. It was all carefully planned out because I believe in as much prep as possible. I also love to find the inspiration, so even in these huge action sequences, it’s important that, in the moment, you allow for things you didn’t expect, that you find the magic you can’t plan for.

How tough was the shoot?
It was tough since we were doing a six-hour movie, but our schedule was like a Marvel two-hour movie — that was quite a challenge. What also added to the challenges was the huge scope and scale and wanting it to feel international — shooting overseas — and bringing a cinema feel to it … but I had all the toys and equipment we needed. We started shooting in October of 2019, and we had shot about 75% before production shut down due to COVID. Then we began post and came back later.

Tell us about post. Was it remote because of COVID?
Yes, it was all remote, and everyone was at home and connected via Zoom. In some ways it was very efficient, but in other ways it was difficult, and we all missed being in the editing room together. We did the sound mix at Skywalker, and that was also remote. I was at Technicolor in Toronto, and everyone had high-end headsets, and that worked well.

You had four editors cut this. How did you all work together?
We had three editors during production — Kelley Dixon, Todd Desrosiers and Rosanne Tan — and they each had two episodes, as they were cutting concurrently while we were shooting. After shooting each day, I’d work with one of the editors at night to get the scenes mapped out and, of course, they’d be working through the day. That way I’d get time with each editor, and I could see if we needed any pickups or anything else.

Then Jeff Ford came on as the supervising editor once we got into post. He’s worked on many Marvel movies and shows, so he has a real sense of them. He was like the guru of post. He’s amazing and really brought the scenes alive and worked very closely with the other three, so it was a team effort.

What were the main editing challenges?
Tone and pacing were critical on such a complex project, but we also had a plethora of choices in each scene since we’d done some ad lib and improv, so we had to really calibrate all that as well — particularly with John Walker and trying to nail his journey and who he is. When I shot all that, I wanted to have options in post, as I very much believe that post is your one big last opportunity to essentially redirect the picture. We really calibrated all that and tried all sorts of options in post as we went. And then, as with any show or movie, you look at all of it and go, “Oh, maybe that scene would work better somewhere else.” So there is some reimagining of the story as you go along and discover new things all the time, and we had the time to do that and experiment. We also had the benefit of Jeff Ford’s fresh eyes when he came on for the second stage of post. For instance, it was totally his idea to start it all off with the opening shot of a guy ironing. That was genius, as it draws you in, and in such a lovely way. Here’s this guy doing what he has to do, and then suddenly we’re in this big action sequence.

There are a ton of visual effects. Can you talk about working with VFX supervisor Eric Leven?
Eric came on board right at the start, and he and I started planning out all the action sequences, set extensions and so on. It was literally just me, him and production designer Raymond Chang in the office. That was critical, as Eric and I worked in tandem, and Eric has wonderful ideas and can also pivot very quickly on those ideas. We’d go down some roads and then abandon them and then rethink them. He also helped us a lot with all the previz, and he was able to take some half-baked ideas and run with them while I was busy elsewhere. He elevated every one of the action sequences.

We began with storyboards, and Eric turned those into previz, which gives you a first look at the potential of the sequence. He’d be there through the whole pipeline through to the final, so he’d oversee everything and all the vendors [including Trixter, Crafty Apes, Technicolor VFX, Cantina Creative, Weta, ILM, SPI and Rodeo].

What about the DI?
It was done at Marvel’s own new DI suite and all remotely, except for some execs here in LA who could be in the room. I had a feed at Technicolor in Toronto, and the DP had one as well, and we could look at the actual DI and color-correct accordingly. They did a first pass, and then we did some tweaks and another pass, and we’d bounce in and out of some of the shows because we were still adding VFX shots as we went. Given that it was all remote, it worked well, but there’s no substitute for being in the room — especially in terms of editing — but it means that post can evolve and that you can be remote and get great results.

Fair to say this is the most ambitious project you’ve ever directed?
Oh yes, for sure! It’s beautifully complex and embraced all the themes I love to work with in my projects and topics that I think are important. Plus, it has huge action sequences and great character development, so it hit all the bases. It wasn’t just extremely challenging, it was also very satisfying.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

WandaVision Director Matt Shakman Talks Post and VFX

By Iain Blair

After conquering the global box office, the Marvel Cinematic Universe (MCU) has turned its attention to streaming and the small screen — both figuratively and literally — with WandaVision, an inspired and inventive blend of classic television and the MCU. The show marks Marvel Studios’ first Disney+ series, and it’s also its first journey into the world of sitcoms — with a Marvel twist, of course.

(L-R): Director Matt Shakman directing Elizabeth Olsen and Paul Bettany

The story centers on Avengers Wanda Maximoff (Elizabeth Olsen) and her android husband, Vision (Paul Bettany), who find themselves living idealized suburban lives — but in a surreal, ever-evolving alternate universe that is cleverly modeled on various sitcoms through the eras. It begins in the style of black-and-white 1950s shows and moves through the decades, updating everything from costumes and sets to visual language and VFX as it goes.

“It’s a mash-up of classic sitcoms and large-scale Marvel action,” says director/EP Matt Shakman, who directed all nine episodes. “I think it’s really lovely that the first streaming show from Marvel Studios — producer of huge blockbuster films — is really a love letter to the history of television.”

I talked to Shakman — whose credits include Game of Thrones, The Great (for which he was Emmy-nominated) and It’s Always Sunny in Philadelphia — about post, VFX and making the show with a creative team that included DP Jess Hall, ASC, BSC; production designer Mark Worthington; and VFX supervisor Tara DeMarco.

What were the big technical challenges of directing the show?
To render all the shows perfectly, we did a lot of research into lighting, cameras, lenses as well as costumes and the acting styles. So we watched a lot of old sitcom episodes and then got very specific about the palette for each era. Then as we moved through those eras, the production design, props, cars and so on would all change, and we all worked very closely with the VFX and special effects teams.

Because it was this huge production, we started on all the post and VFX immediately, as they were such a key part of each episode. For instance, when things would fly through the air in the ‘50s or ‘60s eras, we’d use jump cuts and wires, like they used to. And we’d freeze Elizabeth, have her stand-in copy the exact position while she ran off to change clothes, then she’d come back and stand in the same position. That’s how she’d magically transform from dress to dress, which is exactly how they’d do it on Bewitched. Then we’d use VFX to take out the wires and smooth out the jump cuts.

So for the first few episodes, it was a great mix of old-fashioned effects and the latest technology. Then as the story progressed, the VFX got more and more complex and important. So by Episode 6, when Vision disintegrates, we had Rodeo doing these amazing effects on his skin, as well as on the Hex, which pulls bits from him — everything from particle simulation to smearing effects [created with tools including Houdini, Katana, Maya, Arnold and Mari].

Can you talk about working with DP Jess Hall to get all the different looks?
We began at Pinewood Studios in Atlanta and shot two-thirds of it, but then we had to break because of COVID. We ended up finishing the final third in LA. We shot it all on the ARRI Alexa because it creates such a beautiful image, and we knew we’d be shooting different eras, often in the same day, in difficult weather, so it made no sense to keep changing the camera setup.

Then we added a lot of vintage lenses and filters, and Jess ended up using close to 50 different lenses, including some custom-modified to his needs. Aspect ratio was also important, so we used 4:3 for the older shows and then gradually moved into 16:9. Then there’s the real world outside, which had to feel very real, with real weather. So we specifically used the same Ultra Panatar lens package from Avengers: Infinity War and Endgame to ensure we carried on the continuity. That also underscored the contrast when we jumped inside the perfect world of the sitcoms.

For lighting, we used vintage tungsten for the early shows, and then moved on to LEDs. Post was also crucial in terms of the looks, and we had an amazing color team at Marvel who dialed in the look for each era and how much grain we wanted. The main goal was to always be as authentic as possible, so we shot the first two episodes in black-and-white and the first in front of a live studio audience. The early ‘50s black-and-white has a sepia tinge to it, while the ‘60s black-and-white has more grays.

Tell us about post. Where did you do it?
We did the editing and all the finishing and color at Marvel and all the sound at Skywalker.

You had three editors on the series – Zene Baker, Nona Khodai and Tim Roche. How did that work?
I’d worked with Tim a lot on It’s Always Sunny in Philadelphia, and Zene and Nona have done it all — from big superhero films to comedy and drama. They all brought so much experience as well as different strengths, which this show needed, as it jumps around so much from comedy to drama, period to the present. That was true of all the team.

They each took point on three episodes, but when COVID hit and our post schedule got very tight, they’d all jump in to help out with any problems. I’d say the main editing challenges were dealing with all the rapidly shifting tones, styles and rhythms and incorporating all the VFX.

There are a lot of VFX. Can you talk about working with VFX supervisor Tara DeMarco, who was at The Mill for a long time?
The VFX play a huge role, and we had over 3,000 shots and over 20 vendors, including ILM, Digital Domain, Rodeo, MARZ, Luma, Weta, Screen Scene and others. Tara was involved in every conversation, as the VFX had a wide-ranging influence on everything from the cinematography to production design and editing — and even costumes and makeup. For instance, we had to figure out how Vision would look in black-and-white, and it turned out that after many tests, red was not the best color. So he’s actually blue in those early episodes, and you’d never know. Tara was one of the busiest people on the whole show.

And you don’t have the luxury of a longer schedule like on a movie.
Exactly, so you have to be very well-prepared. I had three storyboard artists, and we worked very closely with The Third Floor to do previz animation from my storyboards, and then we fine-tuned that, and that served as a guide at all the production meetings as we began peeling the onion and figuring out how to do each shot. Then we did a lot of postviz with The Third Floor, as it’s impossible to know what you’ve got without having some rough idea of the VFX that’ll ultimately be there, and it can be hugely helpful in shortening the time needed by the VFX vendors that will bring the sequence to life. It means fewer questions and less R&D, and it’s a great tool for making post run very smoothly.

What about the DI? 
Evan Jacobs runs the brand-new color department at Marvel, and colorist Matt Watson, who’d worked on a lot of Marvel projects, recently came over to join the team full-time. They’re amazing and have such a great visual eye. They worked very closely with Tara, Jess and me in R&D’ing so many different looks to make sure each era was super-specific, accurate to the era and visually interesting.

All that work began before we even started shooting and continued all the way through production and post. We were able to get a lot of work done during the lockdown, but then there was a mad dash to finish it all once we’d gone back and completed the shoot. And there were several firsts on this: It’s the first Marvel show finished all in-house at the new DI facility, and the first finished using HDR files. Looking back, this was such a challenging project, but I’m very proud of what we were able to accomplish.

Finally, it’s a fine line between paying homage to classic sitcoms and just parodying them, a line you must have been very aware of as a child star in famous sitcoms like Just the Ten of Us and Webster.
Yes, and our goal was never to spoof or parody them, but to authentically recreate them — homage is exactly the right word. And it’s also Wanda’s creation. She has the power of chaos magic, so the world of sitcoms she loved as a kid had to be perfect.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Weta Digital adds Joe Marks as CTO

Wellington, New Zealand’s Weta Digital has added Joe Marks as chief technology officer. Marks has more than 35 years’ experience as a technology executive with extensive background in graphics, computer science and research and development. He will oversee Weta’s technology initiatives that span visual effects as well as the newly announced Weta Animated division. Marks will report to Weta Digital CEO Prem Akkaraju.

Marks joins Weta Digital from Carnegie Mellon University, where he served as executive director of the Center for Machine Learning & Health, working at the forefront of innovation in digital healthcare. Marks is best-known for his roles at Mitsubishi Electric Research Labs (MERL) and Disney Research, where he orchestrated and led the R&D organization in the media and entertainment industry with labs in Pittsburgh, Zurich, Boston, San Francisco and Los Angeles. His research interests span CG, video processing, media distribution, robotics, HCI, mobile computing, computer vision, AI, sensors, embedded systems and behavioral economics.

“The addition of Joe Marks allows us to go deeper and push further into the technical innovations that will define the next generation of storytelling,” says senior visual effects supervisor Joe Letteri. “His ability to unite artificial intelligence initiatives with real-time technology and increasingly procedural workflows will allow us to fully realize the promise of the virtual production workflows we pioneered for the first Avatar.”

“While working with The Walt Disney Company and the ACM SIGGRAPH community, I admired Weta for many years. Weta’s proprietary tools have long set the standard in simulating appearance and behavior in virtual worlds,” says Marks. “I’m excited about how we will build on this base: the incorporation of techniques from AI and machine learning will broaden the scope of what can be simulated; new display and camera technologies will enable us to connect the virtual world to the real world in exciting new ways; and cloud computing will allow us to do it all more efficiently. I am looking forward to combining my own experience with that of industry veterans like Joe Letteri and Prem Akkaraju to broaden the cinematic palette for the great directors that work with Weta.”

Marks has been active in the SIGGRAPH community, serving as its Papers Chair in 2004 and Conference Chair in 2007, and he has a broad portfolio of peer-reviewed publications in applied computing. He has also managed over $100M in corporate and academic R&D spend, in addition to co-founding two e-commerce start-ups.

Marks holds an A.B. in applied mathematics and a PhD in computer science, both from Harvard University.

“A pivotal part of the future of VFX and animation rests in artificial intelligence, machine learning and cloud computing. Joe’s deep experience in these areas makes for a perfect fit at Weta as we continue to push the boundaries of our industry,” says Prem Akkaraju, CEO of Weta Digital. “Joe’s experience leading research teams both inside and outside our industry will bring new ideas and methodologies to our existing state-of-the-art pipeline. It’s a privilege to bring someone of Joe’s stature to double down on one of the core strengths of the company.”

VFX supervisors talk Amazing Stories and Stargirl

By Iain Blair

Even if you don’t know who Crafty Apes are, you’ve definitely seen their visual effects work in movies such as Deadpool 2, La La Land, Captain America: Civil War and Little Women, and in episodics like Star Trek: Picard and Westworld. The full-service VFX company was founded by Chris LeDoux, Jason Sanford and Tim LeDoux and has locations in Atlanta, Los Angeles, Baton Rouge, Vancouver, Albuquerque and New York, and its roster of creative and production supervisors offers a full suite of services, including set supervision, VFX consultation, 2D compositing and CG animation, digital cosmetics, previsualization and look development.

Aldo Ruggiero

Recently, Crafty Apes worked on two high-profile projects — the reboot of Steven Spielberg’s classic TV series Amazing Stories for Apple TV+ and Disney+’s Stargirl.

Let’s take a closer look at their work on both. First up is Amazing Stories and Crafty Apes VFX supervisor Aldo Ruggiero.

How many VFX did you have to create for the show?
The first season has five episodes, and we created VFX for two of them — “The Heat” and “Dynoman and the Volt!” I was on the set for the whole of those shoots, and we worked out all the challenges and problems we had to solve day by day. It wasn’t like we got the plates and then figured out there was a problem. We were very well-prepared, and we were all based in Atlanta, where all the shooting took place, which was a big help. We worked very closely with Mark Stetson, who was the VFX supervisor for the whole show. Because they were shooting three shows at once, he couldn’t always be on set, so he wanted us there every day. Mark really inspired me to take charge and solve any problems and challenges.

What were the main challenges?
Of the two episodes, “Dynoman and the Volt!” was definitely the most challenging, as we had this entire rooftop sequence. It was quite complicated: half was done with bluescreen and half using a real roof. We had about 40 shots cutting back and forth between them, and we had to create a 360-degree environment that matched the real roof seamlessly. Doing scenes like that, with all the continuity involved, and making it totally photo-real is very challenging. A one-off shot is really easy compared with that, as it may take 20 man-days to do. But this took about 300 man-days — matching every detail exactly, all the color and so on. The work we did for the other episode, “The Heat,” was less challenging technically and more subtle. We did a lot of crowd replacement and a lot of cleanup, as Atlanta was doubling for other locations.

It’s been 35 years since the original Amazing Stories first aired. How involved was Spielberg, who also acts as EP on this?
He was more involved with the writing than the actual production, and I think the finale of “Dynoman and the Volt!” was completely his idea. He wasn’t on the set, but he gave us some notes, which were very specific, very concise and pointed. And of course, visual effects and all the technology have advanced so much since then.

Gabriel Sanchez

What tools did you use?
We used Foundry Nuke for compositing and Autodesk Maya for 3D animation, plus a ton more. We finished all the work months ago, so I was happy to finally see the finished result on TV. It turned out really well, I think.

Stargirl
I spoke with VFX supervisor Gabriel Sanchez, a frequent collaborator with Wes Anderson. He talked about creating the VFX and the pipeline for Stargirl, the musical romantic drama about teenage angst and first love, based on the best-selling YA novel of the same name, and directed by Julia Hart (Fast Color).

How many VFX did you have to create for the film, and how closely did you work with Julia Hart?
While you usually meet the director in preproduction, I didn’t meet Julia until we got on set since I’d been so busy with other jobs. We did well over 200 shots at our offices in El Segundo, and we worked very closely together, especially in post. Originally, I was brought on board to be on the set to oversee all the crowd duplication for the football game, but once we got into post, it evolved into something much bigger and more complex.

Typically during bidding and even doing the script breakdown, we always know there’ll be invisible VFX, but you don’t know exactly what they’ll be until you get into post. So during preproduction on this, the big things we knew we’d have to do up front were the football and crowd scenes, maybe with some stunt work, and the CG pet rat.

What were the main challenges?
The football game was complex, because they wanted not just the crowd duplication, but also to create one long, seamless take because it’s the half-time performance. So we blocked it and did it in sections, trying to create the 360 so we could go around the band and so on.

The big challenge was then blending all those cuts together into a seamless take, but there were issues, like where the crowd would maybe creep in during the 360, or we’d have a shadow, or we’d see the crane or a light. So that kind of set the tone, and we’d know what we had to clean up in post.

Another issue was a shot wherein it was raining and we had raindrops bouncing off a barn door onto the camera, which created this really weird long streak on the lens, and we had to remove that. We also had to change the façade of the school a bit, and we had to do a lot of continuity fixes. So once we began doing all that stuff, which is fairly normal in a movie, it all evolved in post into a lot more complex and creative work.

What did it entail?
Sometimes, in terms of performance, you might like how an actress delivers her lines in one take but prefer how an actor replies in another, so we had a lot of split screens to make the performance come together. We also sometimes had to re-adjust the timing of the actors’ lip movements to sync up with the audio, which they wanted to offset. And there were VFX shots we created in post where we had no coverage.

For instance, Julia needed a bike in front of a garage for a shot that was never filmed, so I had to scan through everything, find footage, then basically create a matte painting of the garage and find a bike from another take, but it still didn’t quite work. In the end, I had to take the bike frame from one take, the wheels from another and then assemble it all. When Julia saw it, she said, ‘Perfect!’ That’s when she realized what was feasible with VFX, depending on the time and budget we had.

How many people were on your team?
I had about 10 artists and two teams. One worked on the big long seamless 360 shot, and then another team worked on all the other shots. I did most of the finishing of the long halftime show sequence on Autodesk Flame, with assistance from three other artists on Nuke, and I parceled out various bits to them — “take out this shadow,” “remove this lens flare” and so on — and did the complete assembly to make it feel seamless on Flame. I also did all the timing of the crowd plates on Flame. Ultimately, the whole job took us about two months to complete, and it was demanding but a lot of fun to work on.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Color grading Togo with an Autochrome-type look

Before principal photography began on the Disney+ period drama Togo, the film’s director and cinematographer, Ericson Core, asked Company 3 senior colorist Siggy Ferstl to help design a visual approach for the color grade that would give the 1920s-era drama a unique look. Based on a true story, Togo is named for the lead sled dog on Leonhard Seppala’s (Willem Dafoe) team and tells the story of their life-and-death relay through Alaska’s tundra to deliver diphtheria antitoxin to the desperate citizens of Nome.

Siggy Ferstl

Core wanted a look that was reminiscent of the early color photography process called Autochrome, as well as an approach that evoked an aged, distressed feel. Ferstl, who recently colored Lost in Space (Netflix) and The Boys (Amazon), spent months — while not working on other projects — developing new ways of building this look using Blackmagic’s Resolve 16.

Many of Ferstl’s ideas were realized using the new Fusion VFX tab in Resolve 16. It allowed him to manipulate images in ways that took his work beyond the normal realm of color grading and into the arena of visual effects.

By the time he got to work grading Togo, Ferstl had already created looks that had some of the visual qualities of Autochrome melded with a sense of age, almost as if the images were shot in that antiquated format. Togo “reflects the kind of style that I like,” explains Ferstl. “Ericson, as both director and cinematographer, was able to provide very clear input about what he wanted the movie to look like.”

In order for this process to succeed, it needed to go beyond the appearance of a color effect seemingly just placed “on top” of the images. It had to feel organic and interact with the photography, to seem embedded in the picture.

A Layered Approach
Ferstl started this large task by dividing the process into a series of layers that would work together to affect the color, of course, but also to create lens distortion, aging artifacts and all the other effects. A number of these operations would traditionally be sent to Company 3’s VFX department or to an outside vendor to be created by their artists and returned as finished elements. But that kind of workflow would have added an enormous amount of time to the post process. And, just as importantly, all these effects and color corrections needed to work interactively during grading sessions at Company 3 so Ferstl and Core could continuously see and refine the overall look. Even a slight tweak to a single layer could affect how other layers performed, so Ferstl needed complete, realtime control of every layer for every fine adjustment.

Likewise, the work of Company 3 conform artist Paul Carlin could not be done in the way conform has typically been done. It couldn’t be sent out of Resolve into a different conform/compositing tool, republished to the company network and then returned to Ferstl’s Resolve timeline. This would have taken too long and wouldn’t have allowed for the interactivity required in grading sessions.

Carlin needed to be able to handle the small effects that are part of the conform process — split screens, wire removals, etc. — quickly, and that meant working from the same media Ferstl was accessing. Carlin worked entirely in Resolve using Fusion for any cleanup and compositing effects — a practice becoming more and more common among conform artists at Company 3. “He could do his work and return it to our shared timeline,” Ferstl says. “We both had access to all the original material.”


Most of the layers actually consisted of multiple sublayers. Here is some detail:
Texture: This group of sublayers was based on overlaid textures that Ferstl created to have a kind of “paper” feel to the images. There were sublayers based on photographs of fabrics and surfaces that all play together to form a texture over the imagery.
Border: This was an additional texture that darkened portions of the edges of the frame. It inserts a sense of a subtle vignette or age artifact that frames the image. It isn’t consistent throughout; it continually changes. Sublayers bring to the images a bit of edge distortion that resembles the look of diffraction that can happen to lenses, particularly lenses from the early 20th century, under various circumstances.
Lens effects: DP Core shot with modern lenses built with very evolved coatings, but Ferstl was interested in achieving the look of uncoated and less-refined optics of the day. This involved the creation of sublayers of subtle distortion and defocus effects.
Stain: Ferstl applied a somewhat sepia-colored stain to parts of the image to help with the aging effect. He added a hint of additional texture and brought some sepia to some of the very bluish exterior shots, introducing hints of warmth into the images.
Grain-like effect: “We didn’t go for something that exactly mimicked the effect of film grain,” Ferstl notes. “That just didn’t suit this film. But we wanted something that has that feel, so using Resolve’s Grain OFX, I generated a grain pattern, rendered it out and then brought it back into Resolve and experimented with running the pattern at various speeds. We decided it looked best slowed to 6fps, but then it had a steppiness to it that we didn’t like. So I went back and used the tool’s Optical Flow in the process of slowing it down. That blends the frames together, and the result provided just a hint of old-world filmmaking. It’s very subtle and more part of the overall texture.”
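The frame-hold half of that grain recipe is simple to sketch. A plain hold like the one below is what produces the steppiness Ferstl describes, which the Optical Flow pass then smoothed by blending between held frames; this is an illustration of the idea, not Resolve’s internals.

```python
def retime_grain(grain_frames, hold=4):
    """Hold each rendered grain frame `hold` times so the pattern
    updates at, e.g., 24 / 4 = 6fps on a 24fps timeline. Without a
    frame-blending pass, the result steps visibly every `hold` frames."""
    return [grain_frames[i // hold] for i in range(len(grain_frames) * hold)]
```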

Combining Elements
“It wasn’t just a matter of stacking one layer on top of the other and applying a regular blend. I felt it needed to be more integrated and react subtly with the footage in an organic-looking way,” Ferstl recalls.

One toolset he used for this was a series of customized lens flares using Resolve’s OFX, not for their actual purpose but as the basis of a matte. “The effect is generated based on highlight detail in the shot,” explains Ferstl. “So I created a matte shape from the lens flare effect and used that shape as the basis to integrate some of the texture layers into the shots. It’s the textures that become more or less pronounced based on the highlight details in the photography, and that lets the textures breathe more.”
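In plain numpy terms, the technique amounts to a soft matte keyed from highlight luminance that modulates how strongly the texture layer reads. The sketch below substitutes a simple luminance key for the flare-derived shape, with invented threshold values:

```python
import numpy as np

def highlight_matte(rgb, threshold=0.7, softness=0.2):
    """Soft matte from highlight detail: 0 below `threshold`,
    ramping to 1 over `softness` (values are illustrative)."""
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])
    return np.clip((luma - threshold) / softness, 0.0, 1.0)

def breathe_texture(rgb, texture, strength=0.3):
    """Let an overlaid texture become more pronounced where the
    photography has highlight detail, as described above."""
    m = (strength * highlight_matte(rgb))[..., None]
    return rgb * (1.0 - m) + texture * m
```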

Ferstl also made use of the Tilt-Shift effect in Fusion that alters the image in the way movements within a tilt/shift lens would. He could have used a standard Power Window to qualify the portion of the image to apply blur to, but that method applied the effect more evenly and gave a diffused look, which Ferstl felt wasn’t like a natural lens effect. Again, the idea was to avoid having any of these effects look like some blanket change merely sitting on top of the image.

“You can adjust a window’s softness,” he notes, “but it just didn’t look like something that was optical… it looked too digital. I was desperate to have a more optical feel, so I started playing around with the Tilt-Shift OFX and applying that just to the defocus effect.

“But that only affected the top and bottom of the frame, and I wanted more control than that,” he continues. “I wanted to draw shapes to determine where and how much the tilt/shift effect would be applied. So I added the Tilt-Shift in Fusion and fed a poly mask into it as an external matte. I had the ability to use the mask like a depth map to add dimensionality to the effect.”
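One way to approximate that mask-as-depth-map behavior outside Fusion is to pre-blur the frame at several strengths and choose between them per pixel according to the mask. A rough scipy sketch of the concept (an approximation, not Resolve’s actual Tilt-Shift implementation):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def masked_defocus(img, mask, max_sigma=6.0, steps=4):
    """Mask-driven defocus: blur the (H, W, 3) float frame at `steps`
    strengths, then use the (H, W) mask (0 = sharp, 1 = most blur)
    as a crude depth map to choose a blur level per pixel."""
    levels = [img] + [
        gaussian_filter(img, sigma=(max_sigma * i / steps,) * 2 + (0,))
        for i in range(1, steps + 1)
    ]
    idx = np.clip((mask * steps).astype(int), 0, steps)
    out = np.empty_like(img)
    for i, level in enumerate(levels):
        out[idx == i] = level[idx == i]
    return out
```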

As Ferstl moved forward with the look development, the issue that continually came up was that while he and Core were happy with the way these processes affected any static image in the show, “as soon as the camera moves,” Ferstl explains, “you’d feel like the work went from being part of the image to just a veil stuck on top.”

He once again made use of Fusion’s compositing capabilities: The delivery spec was UHD, and he graded the actual photography in that resolution. But he built all the effects layers at the much larger 7K. “With the larger layers,” he says, “if the camera moved, I was able to use Fusion to track and blend the texture with it. It didn’t have to just seem tacked on. That really made an enormous difference.”

Firepower
Fortunately for Ferstl, Company 3’s infrastructure provided the enormous throughput, storage and graphics/rendering capabilities to work with all these elements (some of which were extremely GPU-intensive) playing back in concert in a color grading bay. “I had all these textured elements and external mattes all playing live off the [studio’s custom-built] SAN and being blended in Resolve. We had OpenFX plugins for border and texture and flares generated in real time with the swing/tilt effect running on every shot. That’s a lot of GPU power!”

Ferstl found this entire experience artistically rewarding, and looks forward to similar challenges. “It’s always great when a project involves exploring the tools I have to work with and being able to create new looks that push the boundaries of what my job of colorist entails.”

MovieLabs, Film Studios Release ‘Future of Media Creation’ White Paper

MovieLabs (Motion Pictures Laboratories), a nonprofit technology research lab that works jointly with member studios Sony, Warner Bros., Disney, Universal and Paramount, has published a new white paper presenting an industry vision for the future of media creation technology by 2030.

The paper, co-authored by MovieLabs and technologists from Hollywood studios, paints a bold picture of future technology and discusses the need for the industry to work together now on innovative new software, hardware and production workflows to support and enable new ways to create content over the next 10 years. The white paper is available today for free download on the MovieLabs website.

The 2030 Vision paper lays out key principles that will form the foundation of this technological future, with examples and a discussion of the broader implications of each. The key principles envision a future in which:

1. All assets are created or ingested straight to the cloud and do not need to move.
2. Applications come to the media.
3. Propagation and distribution of assets is a “publish” function.
4. Archives are deep libraries with access policies matching speed, availability and security to the economics of the cloud.
5. Preservation of digital assets includes the future means to access and edit them.
6. Every individual on a project is identified and verified and their access permissions are efficiently and consistently managed.
7. All media creation happens in a highly secure environment that adapts rapidly to changing threats.
8. Individual media elements are referenced, tracked, interrelated and accessed using a universal linking system.
9. Media workflows are non-destructive and dynamically created using common interfaces, underlying data formats and metadata.
10. Workflows are designed around realtime iteration and feedback.

Rich Berger

“The next 10 years will bring significant opportunities, but there are still major challenges and inherent inefficiencies in our production and distribution workflows that threaten to limit our future ability to innovate,” says Richard Berger, CEO of MovieLabs. “We have been working closely with studio technology leaders and strategizing how to integrate new technologies that empower filmmakers to create ever more compelling content with more speed and efficiency. By laying out these principles publicly, we hope to catalyze an industry dialog and fuel innovation, encouraging companies and organizations to help us deliver on these ideas.”

The publication of the paper will be supported with a panel discussion at the IBC Conference in Amsterdam. The panel, “Hollywood’s Vision for the Future of Production in 2030,” will include senior technology leaders from the five major Hollywood motion picture studios. It will take place on Sunday, September 15 at 2:15pm in the Forum room of the RAI. postPerspective’s Randi Altman will moderate the panel made up of Sony’s Bill Baggelaar, Disney’s Shadi Almassizadeh, Universal’s Michael Wise and Paramount’s Anthony Guarino. More details can be found here.

“Sony Pictures Entertainment has a deep appreciation for the role that current and future technologies play in content creation,” says CTO of Sony Pictures Don Eklund. “As a subsidiary of a technology-focused company, we benefit from the power of Sony R&D and Sony’s product groups. The MovieLabs 2030 document represents the contribution of multiple studios to forecast and embrace the impact that cloud, machine learning and a range of hardware and software will have on our industry. We consider this a living document that will evolve over time and provide appreciated insight.”

According to Wise, SVP/CTO at Universal Pictures, “With film production experiencing unprecedented growth, and new innovative forms of storytelling capturing our audiences’ attention, we’re proud to be collaborating across the industry to envision new technological paradigms for our filmmakers so we can efficiently deliver worldwide audiences compelling entertainment.”

For those not familiar with MovieLabs, their stated goal is “to enable member studios to work together to evaluate new technologies and improve quality and security, helping the industry deliver next-generation experiences for consumers, reduce costs and improve efficiency through industry automation, and derive and share the appropriate data necessary to protect and market the creative assets that are the core capital of our industry.”

Netflix hires Leon Silverman to enhance global post operation

By Adrian Pennington

Veteran postproduction executive Leon Silverman was pondering the future when Netflix came calling. The former president of Laser Pacific has spent the last decade building up Disney’s in-house digital post production wing as general manager, but will be taking on what is arguably one of the biggest jobs in the industry — director, post operations and creative services at Netflix.

“To tell you the truth, I wasn’t looking for a new job. I was looking to explore the next chapter of my life,” said Silverman, announcing the news at the HPA Tech Retreat last month.

“The fact is, if there is any organization or group of people anywhere that can bring content creators together with creative technology innovation in service of global storytelling, it is Netflix. This is a real opportunity to work closely with the creative community and with partners to create a future industry worthy of its past.”

That final point is telling. Indeed, Silverman’s move from one of the titans of Hollywood to the powerhouse of digital is symbolic of an industry passing the baton of innovation.

“In some ways, moving to Netflix is a culmination of everything I have been trying to achieve throughout my career,” says Silverman. “It’s about the intersection of technology and creativity, that nexus where art and science meet in order to innovate new forms of storytelling. Netflix has the resources, the vision and the talent to align these goals.”

L-R: Leon Silverman and Sean Cooney

Silverman will report to Sean Cooney, Netflix’s director of worldwide post production. During his keynote at the HPA Tech Retreat, Cooney introduced Silverman and his new role. He noted that the former president of the HPA (2008-2016) had built and run some of the most cutting-edge facilities on the planet.

“We know that there is work to be done on our part to better serve our talent,” says Cooney. “We were looking for someone with a deep understanding of the industry’s long and storied history of entertainment creation. Someone who knows the importance of working closely with creatives and has a vision for where things are going in the future.”

Netflix’s global post operation is centered in LA, where it employs the majority of its 250 staff, and it will oversee delivery of 1,000 original pieces of programming this year. But with regional content increasingly important to the growth of the organization, Cooney and Silverman’s tricky task is to streamline core functions like localization, QC, asset management and archive while increasing output from Asia, Latin America and Europe.

“One of the challenges is making sure that the talent we work with feel they are creatively supported even while we operate on such a large scale,” explains Cooney. “We want to continue to provide a boutique experience even as we expand.”

There’s recognition of the importance to Netflix of its relationship with dozens of third-party post houses, freelance artists and tech vendors.

“Netflix has spent a lot of time cultivating deep relationships in the post community, but as we get more and more involved in upstream production, we want to focus on reducing the friction between the creative side of production and the delivery side,” says Silverman. “We need to redesign our internal workflows to really try to take as much friction out of the process as possible.”

Netflix: Black Mirror – Bandersnatch

While this makes sense from a business point of view, there’s a creative intent too. Bandersnatch, the breakthrough interactive drama from the Black Mirror team, could not have been realized without close collaboration from editorial all the way to user interface design.

“We developed special technology to enable audience interaction but that had to work in concert with our engineering and product teams and with editorial and post teams,” says Cooney.

Silverman refers to this collapse of the traditional role of post into the act of production itself as “Post Post.” It’s an industry-wide trend that will enable companies like Netflix to innovate new formats spanning film, TV and immersive media.

“We are at a time and a place where the very notion of a serial progression from content inception to production to editorial then finish to distribution is anachronistic,” says Silverman. “It’s not that post is dead, it’s just that ‘post’ is not ‘after’ anything as much as it has become the underlying fabric of content creation, production and distribution. There are some real opportunities to create a more expansive, elegant and global ability to enable storytellers of all kinds to make stories of all kinds — wherever they are.”


UK-based Adrian Pennington is a professional journalist and editor specializing in the production, the technology and the business of moving image media.

Making audio pop for Disney’s Mary Poppins Returns

By Jennifer Walden

As the song says, “It’s a jolly holiday with Mary.” And just in time for the holidays, there’s a new Mary Poppins musical to make the season bright. In theaters now, Disney’s Mary Poppins Returns is directed by Rob Marshall, who with Chicago, Nine and Into the Woods on his resume, has become the master of modern musicals.

Renée Tondelli

In this sequel, Mary Poppins (Emily Blunt) comes back to help the now-grown up Michael (Ben Whishaw) and Jane Banks (Emily Mortimer) by attending to Michael’s three children: Annabel (Pixie Davies), John (Nathanael Saleh) and Georgie (Joel Dawson). It’s a much-needed reunion for the family as Michael is struggling with the loss of his wife.

Mary Poppins Returns is another family reunion of sorts. According to Renée Tondelli, who along with Eugene Gearty, supervised and co-designed the sound, director Marshall likes to use the same crews on all his films. “Rob creates families in each phase of the film, so we all have a shorthand with each other. It’s really the most wonderful experience you can have in a filmmaking process,” says Tondelli, who has worked with Marshall on five films, three of which were his musicals. “In the many years of working in this business, I have never worked with a more collaborative, wonderful, creative team than I have on Mary Poppins Returns. That goes for everyone involved, from the picture editor down to all of our assistants.”

Sound editorial took place in New York at Sixteen 19, the facility where the picture was being edited. Sound mixing was also done in New York, at Warner Bros. Sound.

In his musicals, Marshall weaves songs into scenes in a way that feels organic. The songs are coaxed from the emotional quotient of the story. That’s not only true for how the dialogue transitions into the singing, but also for how the music is derived from what’s happening in the scene. “Everything with Rob is incredibly rhythmic,” she says. “He has an impeccable sense of timing. Every breath, every footstep, every movement has a rhythmic cadence to it that relates to and works within the song. He does this with every art form in the production — with choreography, production design and sound design.”

From a sound perspective, Tondelli and her team worked to integrate the songs by blending the pre-recorded vocals with the production dialogue and the ADR. “We combined all of those in a micro editing process, often syllable by syllable, to create a very seamless approach so that you can’t really tell where they stop talking and start singing,” she says.

The Conversation
For example, near the beginning of the film, Michael is looking through the attic of their home on Cherry Tree Lane as he speaks to the spirit of his deceased wife, telling her how much he misses her in a song called “The Conversation.” Tondelli explains, “It’s a very delicate scene, and it’s a song that Michael was speaking/singing. We constantly cut between his pre-records and his production dialogue. It was an amazing collaboration between me, the supervising music editor Jennifer Dunnington and re-recording mixer Mike Prestwood Smith. We all worked together to create this delicate balance so you really feel that he is singing his song in that scene in that moment.”

Since Michael is moving around the attic as he’s performing the song, the environment affects the quality of the production sound. As he gets closer to the window, the sound bounces off the glass. “Mike [Prestwood Smith] really had his work cut out for him on that song. We were taking impulse responses from the end of the slates and feeding them into Audio Ease’s Altiverb to get the right room reverb on the pre-records. We did a lot of impulse responses, reverbs and EQs to make that scene all flow, but it was worth it. It was so beautiful.”

The Bowl
They also captured impulse responses for another sequence, which takes place inside a ceramic bowl. The sequence begins with the three Banks children arguing over their mother’s bowl. They accidentally drop it and it breaks. Mary and Jack (Lin-Manuel Miranda) notice the bowl’s painted scenery has changed. The horse-drawn carriage now has a broken wheel that must be fixed. Mary spins the bowl and a gust of wind pulls them into the ceramic bowl’s world, which is presented in 2D animation. According to Tondelli, the sequence was hand-drawn, frame by frame, as an homage to the original Mary Poppins. “They actually brought some animators out of retirement to work on this film,” she says.

Tondelli and co-supervising sound editor/co-sound designer Eugene Gearty placed mics inside porcelain bowls, in a porcelain sink, and near marble tiles, which they thumped with rubber mallets, broken pieces of ceramic and other materials. The resulting ring-out was used to create reverbs that were applied to every element in the ceramic bowl sequence, from the dialogue to the Foley. “Everything they said, every step they took had to have this ceramic feel to it, so as they are speaking and walking it sounds like it’s all happening inside a bowl,” Tondelli says.

She first started working on this hand-drawn animation sequence when it showed little more than the actors against a greenscreen with a few pencil drawings. “The fastest and easiest way to make a scene like that come alive is through sound. The horse, which was possibly the first thing that was drawn, is pulling the carriage. It dances in this syncopated rhythm with the music, so it provides a rhythmic base. That was the first thing that we tackled.”

After the carriage is fixed, Mary and her troupe walk to the Royal Doulton Music Hall where, ultimately, Jack and Mary are going to perform. Traditionally, a music hall in London is very rowdy and boisterous. The audience is involved in the show and there’s an air of playfulness. “Rob said to me, ‘I want this to be an English music hall, Renée. You really have to make that happen.’ So I researched what music halls were like and how they sounded.”

Since the animation wasn’t complete, Tondelli consulted with the animators to find out who — or rather what — was going to be in the audience. “There were going to be giraffes dressed up in suits with hats and Indian elephants in beautiful saris, penguins on the stage dancing with Jack and Mary, flamingos, giant moose and rabbits, baby hippos and other animals. The only way I thought I could do this was to go to London and hire actors of all ages who could do animal voices.”

But there were some specific parameters that had to be met. Tondelli defines the world of Mary Poppins Returns as being “magical realism,” so the animals couldn’t sound too cartoony. They had to sound believably like animal versions of British citizens. Also, the actors had to be able to sing in their animal voices.

According to Tondelli, they recorded 15 actors at a time for a period of five days. “I would call out, ‘Who can do an orangutan?’ And then the actors would all do voices and we’d choose one. Then they would do the whole song and sing out and call out. We had all different accents — Cockney, Welsh and Scottish,” she says. “All the British Isles came together on this and, of course, they all loved Mary and knew all the songs so they sang along with her.”

On the Dolby Atmos mix, the music hall scene really comes alive. The audience’s voices are coming from the rafters and all around the walls and the music is reverberating into the space — which, by the way, no longer sounds like it’s in a ceramic bowl even though the music hall is in the ceramic bowl world. In addition to the animal voices, there are hooves and paws for the animals’ clapping. “We had to create the clapping in Foley because it wasn’t normal clapping,” explains Tondelli. “The music hall was possibly the most challenging, but also the funnest scene to do. We just loved it. All of us had a great time on it.”

The Foley
The Foley elements in Mary Poppins Returns often had to be performed in perfect sync with the music. On the big dance numbers, like “Trip the Light Fantastic,” the Foley was an essential musical element since the dances were reconstructed sonically in post. “Everything for this scene was wiped away, even the vocals. We ended up using a lot of the pre-records for this one and a lot less production sound,” says Tondelli.

In “Trip the Light Fantastic,” Jack is bringing the kids back home through the park, and they emerge from a tunnel to see nearly 50 lamplighters on lampposts. Marshall and John DeLuca (choreographer/producer/screen story writer) arranged the dance to happen in multiple layers, with each layer doing something different. “The background dancers were doing hand slaps and leg swipes, and another layer was stepping on and off of these slate surfaces. Every time the dancers would jump up on the lampposts, they’d hit them, and each would ring out in a different pitch,” explains Tondelli.

All those complex rhythms were performed in Foley in time to the music. It’s a pretty tall order to ask of any Foley artist, but Tondelli has the perfect solution for that dilemma. “I hire the co-choreographers (for this film, Joey Pizzi and Tara Hughes) or dancers who actually worked on the film to do the Foley. It’s something that I always do for Rob’s films. There’s such a difference in the performance,” she says.

Tondelli worked with the Foley team of Marko Costanzo and George Lara at c5 Sound in New York, who helped to build custom surfaces — like a slate-on-sand surface for the lamplighter dance — and arrange multi-surface layouts to optimally suit the Foley performer’s needs.

For instance, in the music hall sequence, the dance on stage incorporates books, so they needed three different surfaces: wood, leather and a papery-sounding surface set up in a logical, easily accessible way. “I wanted the dancer performing the Foley to go through the entire number while jumping off and on these different surfaces so you felt like it was a complete dance and not pieced together,” she says.

For the lamplighter dance, they had a big, thick pig iron pipe next to the slate floor so that the dancer performing the Foley could hit it every time the dancers on-screen jumped up on the lampposts. “So the performer would dance on the slate floor, then hit the pipe and then jump over to the wood floor. It was an amazingly syncopated rhythmic soundtrack,” says Tondelli.

“It was an orchestration, a beautiful sound orchestra, a Foley orchestra that we created and it had to be impeccably in sync. If there was a step out of place you’d hear it,” she continues. “It was really a process to keep it in sync through all the edit conforms and the changes in the movie. We had to be very careful doing the conforms and making the adjustments because even one small mistake and you would hear it.”

The Wind
Wind plays a prominent role in the story. Mary Poppins descends into London on a gust of wind. Later, they’re transported into the ceramic bowl world via a whirlwind. “It’s everywhere, from a tiny leaf blowing across the sidewalk to the huge gale in the park,” attests Tondelli. “Each one of those winds has a personality that Eugene [Gearty] spent a lot of time working on. He did amazing work.”

As for the on-set fans and wind machines wreaking havoc on the production dialogue, Tondelli says there were two huge saving graces. First was production sound mixer Simon Hayes, who did a great job of capturing the dialogue despite the practical effects obstacles. Second was dialogue editor Alexa Zimmerman, who was a master at iZotope RX. All told, about 85% of the production dialogue made it into the film.

“My goal — and my unspoken order from Rob — was to not replace anything that we didn’t have to. He’s so performance-oriented. He arduously goes over every single take to make sure it’s perfect,” says Tondelli, who also points out that Marshall isn’t afraid of using ADR. “He will pick words from a take and he doesn’t care if it’s coming from a pre-record and then back to ADR and then back to production. Whichever has the best performance is what wins. Our job then is to make all of that happen for him.”


Jennifer Walden is a New Jersey-based audio engineer and writer. You can follow her on Twitter @audiojeny