

Creating Titles for Netflix’s Avatar: The Last Airbender

Method Studios collaborated with Netflix on the recently released live-action adaptation of the series, Avatar: The Last Airbender. The series, developed by Albert Kim, follows the adventures of a young Airbender named Aang, and his friends, as they fight to end the Fire Nation’s war and bring balance to the world. Director and executive producer Jabbar Raisani approached Method Studios to create visually striking title cards for each episode — titles that not only nodded to the original animated series, but also lived up to the visuals of the new adaptation.

The team at Method Studios, led by creative director Wes Ebelhar, concepted and pitched several different directions for the title before deciding to move forward with one called Martial Arts.

“We loved the idea of abstracting the movements and ‘bending’ forms of the characters through three-dimensional brushstrokes,” says Ebelhar. “We also wanted to create separate animations to really highlight the differences between the elements of air, earth, fire and water. For example, with ‘Air,’ we created this swirling vortex, while ‘Earth’ was very angular and rigid. The 3D brushstrokes were also a perfect way to incorporate the different elemental glyphs from the opening of the original series.”

Giving life to the different elemental brushstrokes was no easy task. “We created a custom procedural setup in Houdini to generate the brushstrokes, which was vital for giving us the detail and level of control we needed. Once we had that system built, we were able to pipe in our original previz, and they matched the timing and layouts perfectly. The animations were then rendered with Redshift and brought into After Effects for compositing. The compositing ended up being a huge task as well,” explains Ebelhar. “It wasn’t enough to just have different brush animations for each element; we wanted the whole environment to feel unique for each — the Fire title should feel like it’s hanging above a raging bonfire, while Water should feel submerged with caustics playing across its surface.”
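
Ebelhar doesn’t describe the node network itself, but the broad idea — brushstroke geometry generated procedurally from guide curves, so previz timing can be piped straight in — can be sketched in Houdini’s Python API. Everything below (node types, names, parameters) is an assumption for illustration, not Method’s actual setup:

```python
# Rough sketch of a procedural stroke setup in Houdini's Python API.
# Method Studios' actual network is proprietary; every node and parameter
# choice here is an assumption, shown only to illustrate generating
# brushstroke geometry procedurally from a guide curve.
import hou

obj = hou.node("/obj")
geo = obj.createNode("geo", "brushstrokes")

# Guide curve acting as the stroke's spine. In production this would be
# driven by imported previz animation rather than drawn by hand.
spine = geo.createNode("curve", "stroke_spine")

# Even point spacing along the spine so later deformers behave predictably.
resample = geo.createNode("resample", "even_spacing")
resample.setInput(0, spine)

# Skin the spine into a tube; the radius would normally taper along the stroke.
wire = geo.createNode("polywire", "stroke_body")
wire.setInput(0, resample)
wire.parm("radius").set(0.05)

# Per-point noise so the stroke reads as hand-painted rather than CG-clean.
noise = geo.createNode("attribnoise", "stroke_noise")
noise.setInput(0, wire)
noise.setDisplayFlag(True)
noise.setRenderFlag(True)

geo.layoutChildren()
```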

Ebelhar says many people were involved in bringing these titles to life and gives “a special shout out to Johnny Likens, David Derwin, Max Strizich, Alejandro Robledo Mejia, Michael Decaprio and our producer Claire Dorwart.”

HPA Tech Retreat 2024: Networking and Tech in the Desert

By Randi Altman

Late last month, many of the smartest brains in production and post descended on the Westin Rancho Mirage Golf Resort & Spa in Palm Springs for the annual HPA Tech Retreat. This conference is built for learning and networking; that’s what it does best, and it starts early. The days begin with over 30 breakfast roundtables, where hosts dig into topics — such as “Using AI/ML for Media Content Creation” and “Apprenticeship and the Future of Post” — while the people at their table dig into eggs and coffee.

Corridor Digital’s Niko Pueringer

The day then kicks further into gear with sessions; coffee breaks inserted for more mingling; more sessions; networking lunches; a small exhibit floor; drinks while checking out the tools; dinners, including Fiesta Night and food trucks; and, of course, a bowling party… all designed to get you to talk to people you might not know and build relationships.

It’s hard to explain just how valuable this event is for those who attend, speak and exhibit. Along with Corridor Digital’s Niko Pueringer talking AI as well as the panel of creatives who worked on Postcard from Earth for the Las Vegas Sphere, one of my personal favorites was the yearly Women in Post lunch. Introduced by Fox’s Payton List, the panel was moderated by Rosanna Marino of IDC LA and featured Daphne Dentz from Warner Bros. Discovery Content Creative Services, Katie Hinsen from Marvel and Kylee Peña from Adobe. The group talked about the changing “landscape of workplace dynamics influenced by #metoo, the arrival of Gen Z into the workforce and the ongoing impact of the COVID pandemic.” It was great. The panelists were open, honest and funny. A definite highlight of the conference.

We reached out to just a few folks to get their thoughts on the event:

Light Iron’s Liam Ford
My favorite session by far was the second half of the Tuesday Supersession. Getting an in-depth walk-through of how AI is currently being used to create content was truly eye-opening. Not only did we get exposed to a variety of tools that I’ve never even heard of before, but we were given insights on what the generative AI components were actually doing to create these images, and that shed a lot of light on where the potential growth and innovation in this process is likely to be concentrated.

I also want to give a shoutout to the great talk by Charles Poynton on what quantum dots actually are. I feel like we’ve been throwing this term around a lot over the last year or two, and few people, if any, knew how the technology was constructed at a base layer.

Charles Poynton

Finally, my general takeaway was that we’re heading into a bit of a Wild West over the next three years. Not only is AI going to change a lot of workflows, and in ways we haven’t come close to predicting yet, but the basic business model of the film industry itself is on the ropes. Everyone’s going to have to start thinking outside the box very seriously to survive the coming disruption.

Imax’s Greg Ciaccio
Each year, the HPA Tech Retreat program features cutting-edge technology and related implementation. This year, the bench of immensely talented AI experts stole the show. Year after year, I’m impressed with the practical use cases shown using these new technologies. AI benefits are far-reaching, but generative AI piqued my interest most, especially in the area of image enhancement. Instead of traditional pixel up-rezing, AI image enhancements can use learned images to embellish artists’ work, which can iteratively be sent back and forth to achieve the desired intent.

It’s all about networking at the Tech Retreat.

3 Ball Media Group’s Neil Coleman
While the concern about artificial intelligence was palpable in the room, it was the potential in the tools that was most exciting. We are already putting Topaz Labs Video AI into use in our post workflow, but the conversations are what spark the most discovery. Discussing needs and challenges with other attendees at lunch led to options that we hadn’t considered when trying to get footage from the field back to post. It’s the people that make this conference so compelling.

IDC’s Rosanna Marino
It’s always a good idea to hear the invited professionals’ perspectives, knowledge and experience. However, I must say that the 2024 HPA Tech Retreat was outstanding. Every panel, every event was important and relevant. In addition to all the knowledge and information taken away, the networking and bonding was also exceptional.

Picture Shop colorist Tim Stipan talks about working on the Vegas Sphere.

I am grateful to have attended the entire event this year. I would have really missed out otherwise. The variation of topics and how they all came together was extraordinary. The number of attendees gave it a real community feel.

IDC’s Mike Tosti
The HPA Tech Retreat allows you to catch up on what your peers are doing in the industry and where the pitfalls may lie.

AI has come a long way in the last year, and it is time we start learning it and embracing it, as it is only going to get better and more prevalent. There were some really compelling demonstrations during the afternoon of the Supersession.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 25 years. 


Foundry Intros Modo 17.0, Bundles With Otoy OctaneRender

Foundry has released Modo 17.0, an update to its 3D software that overhauls internal systems to provide performance increases. These enhancements help artists by providing the interactivity necessary for modern asset creation workflows, with an additional focus on quality-of-life features in multiple areas. Foundry has also bundled Otoy’s Prime version of OctaneRender, which gives artists a speed increase of up to 50x over traditional CPU renderers straight out of the box.

“With 3D asset creation becoming widely adopted, performance is paramount for the future of DCC apps,” says Greg Brown, product manager at Foundry. “Modo 17.0 sets a foundation for increased performance now plus further enhancements well into Modo’s future. Additionally, bundling the Prime version of the OctaneRender from Otoy with Modo 17.0 will speed up the entire experience, from modeling to final render, reducing performance barriers for artists.”

Artists working on Apple Silicon machines will see an additional speed increase of 50% on average, thanks to Modo’s new native macOS ARM build.

With overhauled core systems and granular performance updates to individual tools, Modo, says Foundry, is poised to re-envision 3D workflows. The Modo community can expect a return to more frequent releases for Modo in 2024, which will build on the foundation of 17.0 to further accelerate more aspects of Modo. This 3D application is tailored to enhance the capabilities of experts while also making those capabilities easier for novices to use.

Foundry has enhanced several capabilities of Modo’s powerful modeling tools, including:

  • Decal workflow — It’s now faster and easier to use decals and wrap flat images onto complex surfaces with minimal distortion and no UV creation.
  • Primitive Slice — Users can now clone multiple slices of the same shape at once, making it easier to produce complex patterns. A new Corner Radius feature rounds corners on rectangles and squares so artists can make quick adjustments without switching between presets.
  • Mesh Cleanup — With this tool, users can automatically fix broken geometry and gaps so they can stay productive and avoid interrupting the creative flow.
  • Radial Align — Radial Align turns a selection into a flat circle, but artists frequently need a partial radius and not a complete circle for things like arches. Modo 17.0 ships with the ability to create a partial radial alignment.
  • PolyHaul — PolyHaul combines many of the most used modeling tools into one streamlined tool. This means artists can spend less time jumping between separate tools, helping them to stay in the flow.

“We are thrilled to bundle OctaneRender with Modo 17.0, bringing instant access to the industry’s first and fastest unbiased GPU render engine. Our mission is to democratize high-end 3D content creation, enabling anyone with a modern GPU to create stunning motion graphics and visual effects at a fraction of the cost and time of CPU architectures. We are excited to see how Modo artists integrate OctaneRender’s GPU-accelerated rendering platform into their creative process, including the ability to scale large rendering jobs across near-unlimited decentralized GPU nodes on the Render Network,” says Otoy founder/CEO, Jules Urbach.


Masters of the Air: Directors and DP Talk Shoot, VFX and Grade

By Iain Blair

World War II drama Masters of the Air is a nine-episode Apple TV+ limited series that follows the men of the 100th Bomb Group as they conduct perilous bombing raids over Nazi Germany and grapple with the frigid conditions, the lack of oxygen and the sheer terror of combat at 25,000 feet in the air. Starring Austin Butler and Barry Keoghan, it’s the latest project from Steven Spielberg, Tom Hanks and Gary Goetzman, the producing team behind Band of Brothers and The Pacific.

Anna Boden and Ryan Fleck

Ranging in locations from the fields and villages of southeast England to the harsh deprivations of a German POW camp, Masters of the Air is enormous in both scale and scope. It took many years and an army of creatives to bring it to life, including directors Anna Boden and Ryan Fleck and DP Jac Fitzgerald.

Here, Boden and Fleck (Captain Marvel) talk about the challenges of shooting, editing and posting the ambitious show. In a sidebar, Fitzgerald (True Detective) talks about integrating the extensive VFX and the DI.

After doing Captain Marvel, I guess you guys could handle anything, but this was still a massive project. What were the main challenges?
Anna Boden: We did episodes 5 and 6. I’d say for us, Episode 5 was a big challenge in terms of wrapping our heads around it all. Some of the prep challenges were very big because it’s really a long air battle sequence that takes up almost the entire episode, and we had limited prep and not a ton of time to do previz and work everything out ahead of time. Also, simultaneously, we were prepping Episode 6, which was going to take us on location and to a whole bunch of new spaces that the show had never been to before. Finding those new locations and doing both of those things at once required so much planning, so it was challenging.

How did you handle the big air battle sequence and working with the volume stage?
Boden: You don’t want to show up on the day and wing it. As filmmakers, sometimes it’s really fun to get on-set and block the sequence based on what the actors want to do. But you can’t do that when you’re shooting on a volume stage, where you’re projecting a lot of imagery on the wall around you. You have to plan out so much of what’s going to be there. That was new for us. Even though we’d worked on Captain Marvel and used greenscreen, we’d never used those big-volume LED stages before. It was a really cool learning experience. We learned a lot on the fly and ultimately had fun crafting a pretty exciting sequence.

I assume director Cary Joji Fukunaga and his DP, Adam Arkapaw, set the template in the first four episodes for the look of the whole show, and then you had to carry that across your episodes.
Boden: Yeah. They’d obviously started shooting before us, and so we were studying their dailies and getting a sense of their camera movements and the color palettes and the vibe for the show. It was really helpful. And our DP, Jac Fitzgerald, knows Adam pretty well, so I think that they had a close working relationship. Also, we were able to visit the set while Cary was shooting to get a sense of the vibe. Once we incorporated that, then we were on our own to do our thing. It’s not like we suddenly changed the entire look of the show, but we had the freedom to put our personalities into it.

And one of the great things about the point where we took over is that Episode 5 is its own little capsule episode. We tried to shoot some of the stuff on the base in a similar tone to how they were shooting it. But then, once we got to that monster mission, it became its own thing, and we shot it in our own way. Then, with Episode 6, we were in completely different spaces. It’s a real break from the previous episodes because it’s the midpoint of the season, we’re away from the base, and there’s a big shift in terms of where the story is going. That gave us a little bit of freedom to very consciously shift how we were going to approach the visual language with Jac. It was an organic way to make that change without it feeling like a weird break in the season.

Give us some sense of how integrating all the post and visual effects worked.
Ryan Fleck: We were using the volume stage, so we did have images, and for the aerial battles, we had stuff for the actors to respond to, but they were not dialed in completely. A lot of that happened after the shooting. In fact, most of it did. (Jac can probably help elaborate on that because she’s still involved with the post process for the whole show.) It wasn’t like Mandalorian levels of dialed-in visual effects, where they were almost finished and the actors could see them. In this show, it was more like the actors were responding to previz, but I think that was hugely helpful.

On Captain Marvel, so often actors are just responding to tennis balls and an AD running around the set for eyelines. In this case, it was nice for the actors to see an actual airplane on fire outside their window for their performances to feel fresh.

Did you do a lot of previz?
Fleck: Yeah, we did a lot for those battle sequences in the air, and we worked closely with visual effects supervisor Stephen Rosenbaum, who was integral in pulling all that stuff together.

What did Jac bring to the mix? You hadn’t worked together before, right?
Fleck: No, and we like her energy. She has experience on big movies and small movies, as do we, and we appreciate that. We like those sensibilities. But I think she just has a nice, calm energy. She likes to have fun when she’s working, and so do we, but she’s also very focused on executing the plan. She’s an organized and creative brain that we really appreciated.

Boden: I think that we had a lot of the same reference points when we first started talking, like The Cold Blue, an amazing documentary with a lot of footage that was taken up in the planes during World War II. Filmmakers actually were shooting up there with the young men who were on missions in these bomber planes. That was a really important reference point for us in terms of determining where the cameras can be mounted inside one of these planes. We tried as much as possible to keep those very real camera positions on the missions so that it felt as reality-based and as visceral as possible and not like a Marvel movie. We used some of the color palette from that documentary as well.

It was also Jac’s working style to go to the set and think about how to block things in the shot list… not that we need to stick to that. Once we get in there and work it through with the actors, we all become very flexible, and she’s very flexible as well. Our work styles are very similar, and we got on really well. We like our sets to be very calm and happy instead of chaotic, and she has a very calm personality on-set. We immediately hired her to shoot our next feature after this show, so we’re big fans.

Was it a really tough shoot?
Boden: Yeah. We started shooting in July and finished in October. That’s pretty long for two episodes, but COVID slowed it all down.

Fleck: I’d never shot in London or the UK before, but I loved it. I loved the crews; I loved the locations. We got to spend time in Oxford, and I fell in love with the place. I really loved exploring the locations. But yes, there were challenges. I think the most tedious stuff was the aerial sequences because we had mounted cameras, and it was just slow. We like to get momentum and move as quickly as we can when shooting.

Even though this is TV, you guys were involved in post to some degree, yes? 
Fleck: Yes, we did our director’s cuts, and then Gary kept us involved as the cuts progressed. We were able to get back into the edit room even after we delivered our cuts, and we continued to give our feedback to guide the cuts. Typically, TV directors give over their cuts, and then it’s “Adios.” But because we worked so long on it and we had a good relationship with Gary and the actors, we wanted to see this through to the end. So we stayed involved for much longer than I think is typical for episodic directing.

Typically, on our films, we’re involved in all the other post departments, visual effects and sound, every step of the way. But on this series, we were less involved, although we gave notes. Then Jac did all the grading and the rest of the show. She kind of took over and was very involved. She’ll have a lot of insights into the whole DI process. (See Sidebar)

Anna, I assume you love post, and especially editing, as you edited your first four features.
Boden: I love post because it feels like you’ve made all your compromises, and now all you can do is make it better. Now your only job is to make it the best version of itself. It’s like this puzzle, and you have all the time in the world to do the writing again. I absolutely love editing and the process of putting your writing/editing brain back on. You’re forgetting what happened as a director on-set and rethinking how to shape things.

Give us some idea of how the editing worked. Did you also cut your episodes?
Boden: No, we hired an editor named Spencer Averick, who worked on our director’s cut with us. Every director was able to work on their director’s cut with a specific editor, and then there was Mark Czyzewski, the producer’s editor, who worked on the whole series after that. We worked with him after our director’s cut period. We went back into the room, and he was really awesome. We edited in New York for a couple of weeks on the director’s cut, and then we were editing in LA after that in the Playtone offices in Santa Monica.

What were the big editing challenges for both episodes? Just walk us through it a bit.
Boden: I’d say that one of the biggest challenges, at least in terms of the director’s cut, was finding the rhythm of that Episode 5 mission. When you have a long action sequence like that, the challenge is finding the rhythm so that it has the right pace without feeling like it’s barraging you the whole time. It needs places to breathe and places for emotional and character moments, but it still has to keep moving.

Another challenge is making sure viewers know where they are in every plane and every battle throughout the series. That ends up being a big challenge in the edit. You don’t realize it as much when you’re reading a script, but you realize it a lot when you’re in the edit room.

Then, for Episode 6, it was about connecting the stories because in that episode, we have three main characters — Crosby, Rosenthal and Egan — and they’re in three different places on three very separate journeys, in a way. Egan is in a very dark place, and Rosenthal is in a dark place as well, but he finds himself in this kind of palatial place, trying to have a rest. And then Crosby’s having a much lighter kind of experience with a potential love interest. The intercutting between those stories was challenging, just making sure that the tones were connecting and not colliding with each other, or if they were colliding, colliding in a way that was interesting and intentional.

How hands on were Spielberg and Hanks, or did they let you do your own thing?
Fleck: We mostly interacted with Gary Goetzman, who is Tom Hanks’ partner at Playtone. I think those guys [Spielberg and Hanks] were involved with early days of prep and probably late days of post. But in terms of the day-to-day operations, Gary was really the one that we interacted with the most.

Boden: One of the most wonderful things about working with Gary as a producer — and he really is the producer who oversaw this series — is that he’s worked with so many directors in his career and really loves giving them the freedom and support to do what they do best. He gave us so much trust and support to really make the episodes what we wanted them to be.

Looking back now, how would you sum up the whole experience?
Fleck: All of it was challenging, but I think the biggest challenge for us was shooting during COVID. We kept losing crew members day by day, and it got down to the point where everybody had to test every day and wait for their results. We would have crew members waiting three to four hours before they could join us on-set, so that really cut the amount of shooting time we had every day from 11 hours down to six.

Boden: Some days we’d show up and suddenly find out an hour into the day that we weren’t going to get an actor that we were planning to shoot with, so we’d have to rearrange the day and try to shoot without that actor. That was a big challenge.

Fleck: The great thing for me was how much I learned. Back in history class, you get all the big plot points of World War II, but they don’t tell you about how big these B-17s were, how violent it was up in the air for these guys. You think of the D-Day invasion when you think of the great milestones of World War II, but these aerial battles were unbelievably intense, and they were up there in these tin cans; they were so tight and so cold. I just couldn’t believe that these kids were sent into these situations. It was mind-boggling.

Boden: I also learned a lot through the process of reading the material and the research about the history of these specific people in the stories. But I’d say that one of the things that really sticks with me from the experience was working with this group of actors. That felt very special.

DP Jac Fitzgerald on Shooting Masters of the Air

Jac, integrating all the VFX with visual effects supervisor Stephen Rosenbaum must have been crucial.
Yes. When I started the show, I imagined that the majority of the VFX work would be done on the volume stage. But then I realized that he had a whole World War II airfield to create on location. Obviously, we had the tower structure for the airfield, and we had two planes, one of which was being towed. And it was all so cobbled together from the outside.

Jac Fitzgerald

The planes looked like they were complete, but they weren’t moving by themselves. They didn’t have engines in them or anything. What was interesting to me was the extent of the visual effects that Stephen had to do on the exteriors. We only had two plane bodies, but at any one time when you see the airstrip, there are 12 planes there or more. So there was a huge amount of work for him to do in that exterior world, which was actually as important as the VFX in the volume.

What about the DI? Where did you do all the grading?
It was predominantly in LA at Picture Shop with colorist Steven Bodner, who did the whole show. And because of the enormous amount of VFX, it was obvious early on that things were going to need to be done out of order in the DI.

At first, they thought that my two episodes [5 and 6] would be the first ones to have the DI, as Adam Arkapaw was unavailable to do his episodes [1 through 4] because he was working on another film. At the time they thought they would go in and do my episodes and start prepping and setting the look for episodes 1 through 4 as well. Then it became clear that the DI schedule would have to adjust because of the enormity of the VFX.

Stephen Rosenbaum spent a lot of time making the footage we’d shot and all the VFX worlds collide. I think he had an extraordinary number of people from vendors around the world involved in the project, so there was certainly a lot of cleaning up to do. We all did a lot of work on the look in the DI, trying to make it as seamless as possible. And then again, because episodes 1 through 4 needed so much VFX work, we did my episodes and then we did 7, 8 and 9, and then we went back to 1 through 4. It was certainly a lot of jumping around. I wish that we could have mapped it all from beginning to end, but it wasn’t to be.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.


Pure4D

DI4D’s Updated Facial Performance Capture System, Pure4D 2.0

DI4D, a facial capture and animation provider, has introduced Pure4D 2.0, the latest iteration of its proprietary facial performance capture solution. Pure4D has been used to produce hours of facial animation for many AAA game franchises, including Call of Duty: Modern Warfare II and III and F1 21 and 23.

F1

The Pure4D 2.0 pipeline is purpose-built to directly translate the subtleties of an actor’s facial performance onto their digital double. It delivers nuanced facial animation without the need for manual polish or complex facial rigs.

Pure4D 2.0 is built from DI4D’s proprietary facial capture technology, which combines performance data from an HMC (head-mounted camera) with high-fidelity data from a seated 4D capture system to achieve a scale and quality beyond the capabilities of traditional animation pipelines. Pure4D 2.0 is compatible with the DI4D HMC and third-party dual-camera HMCs as well as the DI4D Pro and third-party 4D capture systems.

Behind this process is DI4D’s machine learning technology, which continually learns an actor’s facial expressions, reducing subjective manual clean-up and significantly increasing both the repeatability and efficiency of the pipeline. This makes Pure4D 2.0 ideally suited to AAA video game production.

Pure4D

Call of Duty

Faithfully recreating an actor’s facial performance is key to Pure4D 2.0’s approach, making it possible to emulate the experience of watching an actor in a live-action film or theatrical performance using their digital double.

A digital double refers to an animated character that shares the exact likeness and performance of a single actor, resulting in highly realistic, performance-driven facial animation. It’s a process that preserves the art form of acting while enhancing the believability of the character.

Pure4D’s approach to facial animation has inspired a new short film, Double, starring Neil Newbon, one of the world’s most accomplished video game actors, who won Best Performance at the 2023 Game Awards. Double will use Pure4D 2.0 to capture the nuance of Newbon’s performance, driving the facial animation of his digital double. Scheduled for release during the summer, Double will highlight the increasingly valuable contribution that high-quality acting makes to video game production.



VFX Supervisor Sam O’Hare on Craig Gillespie’s Dumb Money

By Randi Altman

Remember when GameStop, the aging brick-and-mortar video game retailer, caused a stir on Wall Street thanks to a stock price run-up that essentially resulted from a pump-and-dump scheme?

Director Craig Gillespie took on this crazy but true story in Dumb Money, which follows Keith Gill (Paul Dano), a normal guy with a wife and baby who starts it all by sinking his life savings into GameStop stock. His social media posts start blowing up, and he makes millions, angering the tried-and-true Wall Street money guys, who begin to fight back. Needless to say, things get ugly for both sides.

Sam O’Hare

While this type of film, which has an all-star cast, doesn’t scream visual effects movie, there were 500 shots, many of which involved putting things on computer and phone screens and changing seasons. To manage this effort, Gillespie and team called on New York City-based visual effects supervisor Sam O’Hare.

We reached out to O’Hare to talk about his process on the film.

When did you first get involved on Dumb Money?
I had just finished a meeting at the Paramount lot in LA and was sitting on the Forrest Gump bench waiting for an Uber when I got a call about the project. I came back to New York and joined the crew when they started tech scouting.

So, early on in the project?
It wasn’t too early, but just early enough that I could get a grip on what we’d need to achieve for the film, VFX-wise. I had to get up to speed with everything before the shoot started.

Talk about your role as VFX supervisor on the film. What were you asked to do?
The production folks understood that there was enough VFX on the film that it needed a dedicated supervisor. I was on-set for the majority of the movie, advising and gathering data, and then, after the edit came together, I continued through post. Being on-set means you can communicate with all the other departments to devise the best shoot strategy. It also means you can ensure that the footage you are getting will work as well as possible in post and minimize costs in post.

I also acted as VFX producer for the show, so I got the bids from vendors and worked out the budgets with director Craig Gillespie and producer Aaron Ryder. I then distributed and oversaw the shots, aided by my coordinator, Sara Rosenthal. I selected and booked the vendors.

Who were they, and what did they each supply?
Chicken Bone tackled the majority of the bluescreen work, along with some screens and other sequences. Powerhouse covered a lot of the screens, Pete Davidson’s car sequence, the pool in Florida and other elements. Basilic Fly handled the split screens and the majority of the paint and cleanup. HiFi 3D took on the sequences with the trees outside Keith Gill’s house.

I also worked closely with the graphics vendors since much of their work had to be run through a screen look that I designed. Since the budget was tight, I ended up executing around 100 shots myself, mostly the screen looks on the graphics.

There were 500 VFX shots? What was the variety of the VFX work?
The editor, Kirk Baxter, is amazing at timing out scenes to get the most impact from them. To that end we had a lot of split screens to adjust timing on the performances. We shot primarily in New Jersey, with a short stint in LA, but the film was set in Massachusetts and Miami, so there was also a fair amount of paint and environmental work to make that happen. In particular, there was a pool scene that needed some extensive work to make it feel like Florida.

The film took place mostly over the winter, but we shot in the fall, so we had a couple of scenes where we had to replace all of the leafy trees with bare ones. HiFi handled these, placing CG trees using photogrammetry I shot on-set as layout reference.

There was a fair amount of bluescreen, both in car and plane sequences and to work around actors’ schedules when we couldn’t get them in the right locations at the right times. We shot background plates and then captured the actors later with matched lighting to be assembled afterward.

Screens were a big part of the job. Can you walk us through dealing with those?
We had a variety of approaches to the screens, depending on what we needed to do. The Robinhood app features heavily in the film, and we had to ensure that the actors’ interaction with it was accurate. To that end, I built green layouts with buttons and tap/swipe sequences for them to follow, which mimicked the app accurately at the time.

For the texting sequence, we set up users on the phones, let the actors text one another and used as much of it as possible. Their natural movements and responses to texts were great. All we did was replace the bubbles at the top of the screen to make the text consistent.

For Roaring Kitty, art department graphics artists built his portfolio and the various website layouts, which were on the screens during the shoot. We used these when we could and replaced some for continuity. We also inserted footage that was shot with a GoPro on-set. This footage was then treated with a rough depth matte built in Resolve to give it a lo-fi cut-out feel and then laid over the top of the graphics for the YouTube section.

The screen look for the close-ups was built using close-up imagery of LED screens, with different amounts of down-rez and re-up-rez to get the right amount of grid look for different screens and levels of zoom. Artists also added aberration, focus falloff, etc.
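
The down-rez/re-up-rez step O’Hare describes is easy to prototype outside a compositing package. Below is a minimal sketch in Python with Pillow, using a nearest-neighbor re-upscale to stand in for the LED pixel grid; the file names and scale factor are illustrative assumptions, and the aberration and focus-falloff layers are omitted:

```python
# Hedged sketch of the "down-rez / re-up-rez" screen-look trick: downscale a
# graphics frame, then upscale with nearest-neighbor so an LED-style pixel
# grid appears. The real comps layered macro LED photography, aberration and
# focus falloff on top; this shows only the grid step.
from PIL import Image

def screen_grid_look(src_path: str, dst_path: str, factor: int = 6) -> None:
    img = Image.open(src_path).convert("RGB")
    # Downscale smoothly to lose fine detail, as a lower-res screen would.
    small = img.resize((img.width // factor, img.height // factor), Image.BILINEAR)
    # Nearest-neighbor upscale reintroduces hard pixel boundaries,
    # reading as an LED grid at close-up zoom levels.
    gridded = small.resize(img.size, Image.NEAREST)
    gridded.save(dst_path)

# Hypothetical file names for illustration.
screen_grid_look("graphics_frame.png", "graphics_frame_grid.png")
```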

Any other challenging sequences?
We had very limited background plates for the car sequences that were shot. Many had sun when we needed overcast light, so getting those to feel consistent and without repeating took a fair bit of editing and juggling. Seamlessly merging the leafless CG trees into the real ones for the scene outside Keith Gill’s house was probably the most time-consuming section, but it came out looking great.

What tools did you use, and how did they help?
On-set, I rely on my Nikon D750 and Z6 for reference, HDRI and photogrammetry work.

I used Blackmagic Resolve for all my reviews. I wrote some Python pipeline scripts to automatically populate the timeline with trimmed plates, renders and references all in the correct color spaces from ShotGrid playlists. This sped up the review process a great deal and left me time enough to wrangle the shots I needed to work on.
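
O’Hare’s scripts themselves aren’t public, but the building blocks he names are: ShotGrid’s shotgun_api3 module for reading playlists and Resolve’s scripting API for filling a timeline. Here is a minimal sketch of the idea, with the site URL, credentials, playlist name and trimming/color-space handling all assumed or omitted for brevity:

```python
# Minimal sketch of a review-timeline script like the one described above,
# built from the public APIs O'Hare names. His actual pipeline scripts are
# his own; the URL, credentials, playlist and field choices are assumptions.
import DaVinciResolveScript as dvr
import shotgun_api3

sg = shotgun_api3.Shotgun(
    "https://example.shotgunstudio.com",  # assumed site URL
    script_name="resolve_review",          # assumed script credentials
    api_key="XXXX",
)

# Fetch the versions attached to a named playlist.
playlist = sg.find_one("Playlist", [["code", "is", "dailies_2023_01_15"]])
versions = sg.find(
    "Version",
    [["playlists", "is", playlist]],
    ["code", "sg_path_to_movie"],
)

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
media_pool = project.GetMediaPool()

# Import each version's movie and cut it into a fresh review timeline.
paths = [v["sg_path_to_movie"] for v in versions if v["sg_path_to_movie"]]
clips = media_pool.ImportMedia(paths)
media_pool.CreateEmptyTimeline("review_" + playlist["code"])
media_pool.AppendToTimeline(clips)
```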

I did all my compositing in Blackmagic Fusion Studio, but I believe all the vendors worked in Foundry Nuke.


Writer/Director Celine Song Talks Post on Oscar-Nominated Past Lives

By Iain Blair

In her directorial film debut, Past Lives, South Korean-born playwright Celine Song has made a romantic and deceptively simple film that is intensely personal and autobiographical yet universal, with its themes of love, loss and what might have been. Past Lives is broken into three parts spanning countries and decades. First we see Nora as a young girl in South Korea, developing an early bond with her best friend, Hae Sung, before moving with her family to Toronto. Then we see Nora in her early 20s as she reconnects virtually with Hae Sung. Finally, more than a decade later, Hae Sung visits Nora, now a married playwright living in New York. It stars Greta Lee, Teo Yoo and John Magaro.

Celine Song directing Greta Lee

I spoke with Song about the post workflow and making the A24 film, which is Oscar-nominated for Best Picture and Best Original Screenplay. It also just won Best Director and Best Feature at the Independent Spirit Awards.

How did you prep to direct your first film? Did you talk to other directors?
I talked to some amazing directors, but what they all said is that because only I know the film that I’m making, the way it’s going to be prepped is a process that only I can really know. You need really strong producers and department heads, which I was so lucky to have. I was able to draw on their experience and advice for every step of the way.

You shot in Seoul and New York. Was it the same sort of experience or was it different going back to Seoul?
The filmmaking culture is very different in both places. In New York, there is a very strong union, and in Korea there isn’t one. Also, the way that you secure locations is different. In New York, if you want to shoot somewhere, the mayor’s office knows about it. Korea is still a little bit like guerrilla filmmaking. You show up to a location and try to get it right. You can’t really get permits for things in Korea.

The story takes place over three separate timeframes. Did you shoot chronologically?
No. We shot everything in New York City, and then we had a set built for the Skype section. Then we went to Korea, prepped it for another month and shot there for 10 days.

You and your DP, Shabier Kirchner, shot on 35mm. What led you to that decision?
It was my very first movie, so I didn’t know how hard it was going to be. I don’t have experience shooting on digital or film. I don’t know anything. I think part of it was first-timer bravery. I don’t know enough to be afraid. That’s where the fearlessness came from. But it was also informed by the conversations I was having with my DP. We talked about the story and how the philosophy of shooting on film is connected to the philosophy of the movie, which is that the movie is about time made tangible and time made visible. It just made sense for it to be shot on film.

Celine Song on-set

You come from the theater, where there is obviously no post production. Was that a steep learning curve for you?
Yes, but you do have a preview period in theater, when you see it in front of an audience, and you keep editing in that way. But more importantly, I’m a writer. So part of post is that I don’t think of the movie as just what I see on screen and all the sound design and every piece of it. To me, it is a piece of text. So just as I would edit a piece of my own writing, I feel like I was looking at the editing process very much like editing text.

Then of course in film, it’s not just the writing on the page. It’s also sound, color, visuals, timing… So in that way, I really felt that editing was about composing a piece of music. I think of film as a piece of music, with its own rhythm and its own beat that it has to move through. So in that way, I think that that’s also a part of the work that I would do as a playwright in the theater, create a world that works like a piece of music from beginning to end.

With all that in mind, I honestly felt like I was the most equipped to do post. I had an entire world to learn; I had never done it before. But with post, I was in my domain. The other thing I really love about editing and VFX in film is that you can control a lot. Let’s say there’s a pole in the middle of the theater space. You have to accept that pole. But in film, you can just delete the pole with VFX. It’s amazing.

Did editor Keith Fraase, who is based in New York, come on-set at all in Korea, or did you send him dailies?
We sent dailies. He couldn’t come on-set because of COVID.

What were the biggest editing challenges on this?
I think the film’s not so far from the way I had written it, so the bigger editing choices were already scripted. The harder bits were things that are like shoe leather — the scenes that hold the movie together but are not the center of the emotion or the center of the story.

One example is when Nora is traveling to Montauk, where we know that she’s going to eventually meet Arthur (who becomes her husband). We were dealing with how much time is required and how to convey time so that when we meet Arthur, it seems like it is an organic meeting and not such a jarring one. I had scripted all this shoe-leather stuff that we had shot – every beat of her journey to Montauk. We had a subway beat; we had a bus beat. We had so many pieces of her traveling to Montauk because I was nervous about it, feeling it was not long enough. But then, of course, when we actually got into the edit, we realized we only needed a few pieces. You just realize that again, the rhythm of it dictates that you don’t need all of it.

Where did you do all the sound mix?
We did it all at Goldcrest in New York.

Are you very involved in that?
You have no idea. I think that’s the only place where I needed more time. We went over budget… that’s a nicer way to say it. That’s the only part of the post process where I really was demanding so much. I was so obsessed with it. The sound designer’s nickname for me was Ms. Dog Ears. I know different directors have very different processes around sound, but for me, I was in that room with my sound designer Jacob Ribicoff for 14 hours a day, five days a week, and sometimes overtime, for weeks. I wouldn’t leave.

I would stay there because I just know that sound is one of those things that holds the film together. Also, with this movie, the sound design of the cities and how different they are and how it’s going to play with the compositions — I had such a specific idea of how I wanted those things to move. Because again, I do think of a film as a piece of music. So I was pretty crazy about it. But I don’t want people to notice the sound design. I want people to be able to feel like they’re actually just standing in Madison Square Park. I want them to be fully immersed.

Obviously, it’s not a big effects movie, but you have some. How did that go?
I think it’s a bit of a subjective thing. Actually, looking at it, I’m like, “Well, does that seem good to you?” I’m showing it to my production designer and my DP and I’m like, “This looks OK to me, but I wonder if it can be better. Would you look at it?” So I relied on many eyes.

I give credit to Keith, but also to my assistant editor, Shannon Fitzpatrick, who was a total genius at catching any problems with VFX and having such a detailed eye. I think she’s one of the only people who really noticed things that I didn’t notice in the VFX. I’m like, I think that looks fine, and then she would point to this one thing in the corner that’s not working. There are people at A24 who’re also amazing at catching sound and visuals because that’s their job. They’ll point out what sounds strange or what looks strange. So you have so many people who are part of the process.

Who was the colorist, and how involved were you with the grading?
It was Tom Poole at Company 3, which is where we edited and did color and everything. I love the process because I showed up after Shabier and Tom had already gone through the whole film and graded it. They did amazing, beautiful work. Then I would come in and give notes about certain scenes and then we’d do them. Of course, while they were grading it, they’d send me stills, and I’d give notes on the stills before going into the suite. Also, Shabier and Tom have worked together a lot, so they already kind of had a rhythm for how they wanted to color the film.

What sort of film did you set out to make?
Since this was the first film I’d directed, I felt like the main goal was to discover the language of my movie. It was beyond just trying to tell the story the best way I could, from the script stage to the post. I think that was the goal throughout. But the truth is that I really wanted the language of the film to be my own language, and I wanted to learn and have a revelation for myself of what my movie is.

I know it is partly autobiographical. How much of you is in Nora?
It really was inspired by a true event of sitting between my childhood sweetheart, who had come to visit me from Korea, and my husband who I live with in New York City. So this is very autobiographical, and the feeling that I had in that very personal moment is the inspiration for the whole film. But then once you turn it into a script, which is an objectification process, and then you turn it into a film with hundreds of people — and especially with the cast members who have to play the characters — by that time it has become very much an object. Then with post, it’s about the chiseling. It’s about putting together an object that is to be shared with the world.

A film is so different from writing a play. Was it a big adjustment for you?
I know theater because I was in it for a decade, probably more, so I knew the very fundamental difference between the way a play is made versus how a film is made. For example, I was taught that in theater, time and space is figurative, while time and space in film is literal. So that means there are different kinds of strengths and weaknesses in both mediums when it comes to telling a story that spans decades and continents. And, in this case, because my joke is always that the villains of the story are 24 years and the Pacific Ocean, it actually needs the time and space to be seen literally… because there needs to be a reason why these two lovers are not together. So the children have to be literally there, and Korea and New York City have to feel tangible and literal.

I assume you can’t wait to direct again?
Oh, I can’t wait. I want to wake up and just go to set tomorrow. That’s how I feel. I’m trying to shoot another movie as soon as I can.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.


Review: HP ZBook Fury 16 G10 Mobile Workstation

By Brady Betzel

HP has been at the forefront of computer workstations targeting M&E for multiple decades. To keep up with high-pressure workloads, HP offers enterprise-level workstations with components that will run 24 hours a day, 7 days a week, 365 days a year. And if they don’t, HP will replace the parts and/or system fast — the 24/7/365 uptime is what makes “workstations” unique when compared to off-the-shelf, consumer-grade computer systems.

To ensure the smoothest experience while using apps, HP tests many of today’s pro applications from ISVs (independent software vendors) — from Autodesk to Avid — with its workstations. The HP ZBook Fury 16 G10 is a mobile workstation that combines power and portability without sacrificing either.

The HP ZBook Fury 16 G10 that I was sent to review includes the following specs:

  • CPU: Intel Core i9-13950HX (up to 5.5 GHz with Intel Turbo Boost technology, 36MB L3 cache, 24 cores, 32 threads)
  • Nvidia pro-grade graphics: RTX 5000 Ada GPU
  • Display: 16-inch DreamColor (3840×2400), IPS, anti-glare, 400 nits, 100% sRGB (a WUXGA 1920×1200 panel is also offered)
  • RAM: 64 GB RAM – two DIMMs at 5600MHz DDR5 (four total DIMM slots)
  • Storage: 1TB SSD

In the latest HP ZBook Fury 16 G10, there are quite a few updates. Besides speed and hardware improvements, the most interesting is the full-size RGB keyboard with a 10-key number pad. I am a sucker for a 10-key. When I was trying to pay for my own car as a teenager, I worked at Best Buy fixing computers and eventually installing car stereos. One of the things I learned from that job was how to get fast on a 10-key number pad. You know how that helped me in editing? Timecode input. So I love that HP includes the 10-key pad even on a mobile workstation.

The next impressive feature is the RGB backlit keyboard. Sure, you can use it just to show off some fancy rainbow effects, but you can also tie the RGB lights to specific applications, like Adobe’s Premiere Pro and After Effects. To adjust the RGB colors, you need to open an inconveniently titled app called Z Light Space. I would have preferred for HP to call the app “HP RGB Keyboard” or something easily searchable, but what can you do? The keyboard is fully customizable and comes preloaded with layouts for apps like Premiere and After Effects. The default Premiere layout labels keys such as J, K and L in a nice teal color.

Physically, the HP ZBook Fury 16 G10 is thick. The keyboard feels like it sits an inch above the desk. Even so, it isn’t uncomfortable. The dimensions are 14.29 inches by 9.87 inches by 1.13 inches, and it weighs just over 5 lbs. The power supply is large and kind of cumbersome, although it delivers a hefty 230W. I really wish workstation laptops would come with streamlined power supplies… maybe one day. HP includes a one-year parts/labor warranty (not on-site unless you pay extra).

Around the outside of the workstation, there are a lot of useful ports:

  • Right side:
    • one RJ-45
    • one headphone/microphone combo
    • one SuperSpeed USB Type-A 5Gbps signaling rate (charging)
    • one SuperSpeed USB Type-A 5Gbps signaling rate

  • Left side:
    • one power connector
    • two Thunderbolt 4 with USB4 Type-C 40Gbps signaling rate (USB Power Delivery, DisplayPort 1.4, HP Sleep and Charge)
    • one HDMI 2.1
    • one Mini DisplayPort 1.4a

Now on to what really matters… Does the HP ZBook Fury 16 G10 really chew through media in Blackmagic Resolve and Premiere Pro? Yes, it does, and when it is running hard, the fans turn on. The Nvidia RTX 5000 Ada laptop GPU is really impressive considering that it’s stuffed inside such a small form factor. Resolve continually embraces GPU acceleration more than Adobe, in my opinion, and the results of my testing bear that out.

Blackmagic Resolve
Up first is Resolve 18.6.4. Keep in mind that when comparing workstations or GPUs, increased speeds are not always tied to new hardware. Advancements in underlying software efficiency, drivers, firmware updates, etc. will also improve speeds. That said, based on a UHD, 3840×2160 timeline, I edited the following clips together and put a basic color grade on them:

  • ARRI RAW: 3840×2160 24fps – 7 seconds, 12 frames
  • ARRI RAW: 4448×1856 24fps – 7 seconds, 12 frames
  • BMD RAW: 6144×3456 24fps – 15 seconds
  • Red RAW: 6144×3072 23.976fps – 7 seconds, 12 frames
  • Red RAW: 6144×3160 23.976fps – 7 seconds, 12 frames
  • Sony a7siii: 3840×2160 23.976fps – 15 seconds

I then duplicated that timeline but added Blackmagic’s noise reduction. Then I duplicated the timeline again and added sharpening and grain. Finally, I replaced the built-in Resolve noise reduction with a third-party noise reduction plugin from Neat Video. From there, I exported multiple versions: DNxHR 444 10-bit OP1a MXF, DNxHR 444 10-bit MOV, H.264 MP4, H.265 MP4, AV1 MP4 (Nvidia GPUs only) and then an IMF package using the default settings.
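
As an aside, a multi-format export pass like this can also be queued through Resolve’s scripting API rather than clicked together by hand, which helps keep repeated timing runs consistent. A minimal sketch, assuming render presets saved ahead of time under the names shown (Betzel doesn’t say whether he scripted his runs):

```python
# Hypothetical sketch: queue one render job per deliverable in Resolve's
# scripting API, then time the whole queue. Preset names are assumptions.
import time
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()

# One render preset per deliverable, saved ahead of time in Resolve.
for preset in ["DNxHR444_MXF", "DNxHR444_MOV", "H264_MP4", "H265_MP4", "AV1_MP4"]:
    project.LoadRenderPreset(preset)
    project.AddRenderJob()

start = time.time()
project.StartRendering()                # kicks off every queued job
while project.IsRenderingInProgress():  # poll until the queue drains
    time.sleep(1)
print(f"Queue finished in {time.time() - start:.0f}s")
```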

Here are my results:

HP ZBook Fury 16 G10 — Resolve 18.6.4 export times (min:sec)

                                     DNxHR 444   DNxHR 444
                                     10-bit MXF  10-bit MOV  H.264 MP4  H.265 MP4  AV1 MP4  IMF
Color Correction Only                00:53       00:48       00:31      00:30      00:33    01:19
CC + Resolve Noise Reduction         02:13       02:13       02:02      02:02      02:02    02:19
CC, Resolve NR, Sharpening, Grain    02:57       02:56       02:48      02:48      02:48    02:58
CC + Neat Video Noise Reduction      03:59       03:59       03:47      03:48      03:51    04:03

Adobe Premiere Pro
I ran similar tests inside Premiere Pro 2024 (24.1), exporting using Adobe Media Encoder. The video assets are the same as the ones I used in Resolve, but I used Adobe’s noise reduction, sharpening and grain filters instead of Resolve’s and Neat Video.

Here are the Premiere Pro results:

HP ZBook Fury 16 G10 — Premiere Pro 2024 (individual exports in Media Encoder)

                                     DNxHR 444   DNxHR 444
                                     10-bit MXF  10-bit MOV  H.264 MP4  H.265 MP4
Color Correction Only                01:27       01:26       00:45      00:48
CC + NR, Sharpening, Grain           25:47       57:17       46:46      59:21

HP ZBook Fury 16 G10 — Premiere Pro 2024 (simultaneous exports in Media Encoder)

                                     DNxHR 444   DNxHR 444
                                     10-bit MXF  10-bit MOV  H.264 MP4  H.265 MP4
Color Correction Only                02:15       03:47       03:22      03:22
CC + NR, Sharpening, Grain           30:52       01:08:16    01:03:30   01:03:30

These results are definitely competitive with desktop-size workstations. What makes laptop-size components difficult to design? Heat dissipation and size. HP labels its heat dissipation technology as Vaporforce Thermals. That’s a fancy way of saying that HP takes pride in how it designs its fans and heat spreaders to keep the system as cool as possible, even when rendering hours of content in multimedia apps like Resolve.

HP does a great job at keeping the HP ZBook Fury 16 G10 cool to the touch, which isn’t always the case for workstations. Also, the tool-less design of the HP ZBook Fury 16 G10 is amazing. With one switch, you can remove the bottom panel and begin diagnosing, replacing or upgrading components with little technical know-how. The ease of disassembly is what keeps me loving HP’s workstations. The quickest way to put a bad taste in my mouth is to make self-repair or upgrades impossible or extremely difficult. It just feels wrong. But luckily, HP makes it easy.

With such an impressively powerful mobile workstation comes a large price tag: the HP ZBook Fury 16 G10 I tested retails for just over $9,000 before taxes and shipping. Yikes. But under the hood of the HP ZBook Fury 16 G10, you are essentially getting desktop power in a small form factor. The battery that comes with the Fury is great. I turned off all power-saving settings to ensure I was running at full speed, and I was able to get about 2.5 hours of run time while running the PugetBench for Creators benchmark utility on a loop. That is essentially constant video editing and rendering.

While that runtime might seem short, it is actually pretty long when running at full speed. But obviously, staying plugged in is your best option when doing multimedia work. If security is important to you, and we know it is, then HP’s Wolf Security is loaded with protections.

Summing Up
In the end, the HP ZBook Fury 16 G10 is a pricey but powerful mobile workstation that won’t leave you wishing for a desktop. Add a little docking setup with a couple monitors, and you’ll be flying through your color correction in Resolve, noise reduction with Neat Video or video editing in Premiere Pro.

Honestly, the backlit RGB keyboard seemed like a novelty at first, but I found that I really enjoyed it. Definitely check out the MIL-STD 810H-tested HP ZBook Fury 16 G10 if you are in the market for the highest of high-end mobile workstations that can play RAW 4K media with little interruption.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and Uninterrupted: The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.


Ketama Collective Merges to Form Experiential Studio Bermuda

Ketama Collective, part of the team that won the Grand Prix for Creative Data at Cannes last year, is merging with its two sister companies, Bitgeyser and Pasto, to form one integrated digital creative, production and technology resource known as Bermuda.

The new entity, which has opened a US office in Miami, spans everything from content production for brands to experiential executions and activations, extended realities, metaverse executions, meta-human creations, AI infusions and prototyping, as well as CG animation and design. It is billed by its founders as a creative technology lab that’s focused on offering proficiencies and specializations that global brands are searching for in today’s social media and experience-based landscape.

According to Nico Ferrero, CEO of Bermuda (and formerly MD at Ketama), this move is a natural evolution: Ketama, Bitgeyser and Pasto have frequently collaborated on complex projects for a roster of global clients, he points out. Collectively, their work has been recognized by the industry’s leading awards shows, including a Grand Prix and Gold Lion at Cannes for Stella Artois and GUT, a Silver Lion for LATAM Airlines and McCann, and a Gold Clio for “The Tweeting Pothole” for Medcom and Ogilvy, to name a few.

As it seeks to expand its footprint in the US market beyond its Buenos Aires base, Bermuda has lined up a national sales operation. On the East Coast, Bermuda will be represented by Minerva, led by Mary Knox and Shauna Seresin. Bermuda has also signed with Marla Mossberg and N Cahoots for West Coast representation and with Isabel Echeverry and Kontakto for the US Hispanic market.

Bermuda is led by a group of bilingual executives from the three merged companies whose backgrounds encompass everything from agency creative, production and software engineering to experience design and fabrication. In addition to Ferrero, the company’s leaders include chief creative director Santiago Maiz, head of production Agustín Mende, regional new business director Matias Berruezo and CFO Juan Riva.

“Bermuda has opened for business backed by a combined 30 years of experience creating digital content,” Ferrero explains. “We now have a unified team of 50 experts all under one roof: digital artists, AI engineers, animators, industrial designers, software and fabrication engineers and creative technologists who specialize in multimedia executions, as well as specialists in augmented, virtual and mixed reality content; metaverse executions; and the use of blockchain.”

The new company was born after a whirlwind 2023: In the US, experiential/digital and fabrication projects staged in New Orleans, Miami, San Diego and Chicago were created for such agencies as Area 23, David and McCann, and for clients such as Google, Mastercard and pharmaceutical company Boehringer Ingelheim. The year also marked the debut of Dino Pops, a 52-episode show of five-minute installments created in hyper-real 3D animation and fully executed in Unreal Engine for NBC’s streaming platform Peacock.

As a multi-brand platform, Bermuda has developed unique experiences with personalized content for literally hundreds of products distributed in Tetra Pak packaging. To date the studio has created more than 1,000 digital experiences representing over 150 household brands marketed across 28 countries.

“Our goal is to go even bigger, with more work from the US market, as we flex our muscles across all of our disciplines,” Ferrero states. “Operating as Bermuda will allow us to produce projects on a larger scale while working in different countries at the same time and while handling more complex and challenging projects. And it allows our clients, both on the agency and brand sides, to consolidate the number of entities they have to deal with while making internal collaboration easier and more efficient.” Besides the newly opened base in Miami, Bermuda currently has its HQ in Buenos Aires and offices in L.A. and Colombia to oversee projects throughout the Americas.

As for how they came up with the name, “It’s the idea of the unknown, this mysterious world,” he says, referring obliquely to the legendary Bermuda Triangle. “When you arrive at an idea, it basically comes from a magical place. How well you execute that idea, and the process by which you do it, sums up what Bermuda means to all of us.”

 

Felix Urquiza

AFX Creative Adds Felix Urquiza as ECD

Creative studio AFX Creative has beefed up its VFX team with the addition of executive creative director Felix Urquiza. He joins with nearly 20 years of experience in the field, having worked at companies such as Method Studios, The Mill and Team One, where he headed up the latter’s VFX/CG division, TiltShift, under the Team One/Publicis Groupe USA umbrella.

In his new role at AFX, Urquiza will lead the creative team and develop new strategies. In addition, he will work closely with the studio’s managing director, Nicole Fina, to introduce new clients to AFX and expand its services beyond what it currently offers.

“My goal is to bring a fresh perspective, something more personal and meaningful that will resonate not only with our internal teams but also our clients,” Urquiza notes. “Our work and capabilities are already there, and I am here to help take it to the next level. However, what’s more important to me is bringing an outside perspective to AFX. This will push our team and clients to a higher level of excitement and commitment, elevating our passion and vision of creativity.”

Throughout his career, having an outside perspective is what has propelled Urquiza from being a go-to VFX artist to a creative director and studio director. “I would describe my visual style as modern, clean-cut and pristine,” he explains. “Throughout my career, I have developed both technical and creative skills, and as a result, have become proficient in several areas, including building decks and treatments, writing and designing my own treatments for pitches, and leading the team.”

Early on, Urquiza was inspired to pursue VFX after seeing two James Cameron films. “When I was around 10 to 12 years old, there were two movies that blew my mind,” he recalls: “The Abyss and Terminator 2: Judgment Day. In The Abyss, there is a moment when a ‘water’ creature appears and forms into a girl’s face. I couldn’t understand how they did that. Ever since then, I have been fascinated by movies and how they bring amazing things to life using computers. In my sophomore year of high school, I took an elective for 3D graphics, and on the very first day of that class, I knew this is what I wanted to do. I started researching and connecting the dots, laid out my plan and moved to California. The rest is history.”

Urquiza has used that inspiration while working on projects for Activision, Nike, Bacardi, Samsung, Apple, Lexus, GM, Toyota and many more. In addition, he’s collaborated with agencies such as Team One, Saatchi & Saatchi, Leo Burnett, BBDO, McCann, Omnicom and Argonaut.

His career highlights include working on his first-ever film, Pirates of the Caribbean: At World’s End; doing a shoot with Zack Snyder during the opening weekend of 300; working on the game XCOM: The Bureau; and being nominated twice for a VES Award.

“During my time working at places like The Mill and Method, I gained a lot of experience in understanding what it takes to achieve high-quality work and striving to be the best in the industry,” he says. “I also learned the importance of committing to providing a personalized experience for our clients. At TiltShift, I gained valuable insights into the business side of things, such as navigating holding companies and how the decision-making process impacts the overall success of a business. Drawing from these experiences, I am confident in my ability to set high standards for creative output, collaborate effectively with clients and bring strategic ideas to the table on the business end of things.”

 

Versatile Opens Seamless LED Volume in Vancouver

Film production technology provider Versatile Media has opened a new virtual production facility in South Burnaby, part of the Vancouver metro area. The 44,000-square-foot building features two large soundstages, one of which houses what Versatile says is North America’s first enclosed volume with a seamless ceiling. Alongside the main stage and its bespoke LED volume, the building offers a secondary, 13,000-square-foot soundstage for use as traditional filming space and 10,000 square feet of production offices.

The volume itself measures 83 feet wide by 29 feet high, with an immersive, 270-degree curvature. The seamless structure is equipped with the latest LED panels and was purpose-built for large-scale film projects. Running on Nvidia RTX 6000 Ada Generation GPUs, the facility supports filming at 8K resolution and can shoot with multiple cameras on-set.

What sets the new Versatile volume apart is the seamless integration of the ceiling and the wall, allowing for uninterrupted shot lensing across the entire volume. This means the ceiling is not just for reflections and lighting but a part of the in-camera framing.

To merge traditional, live-action workflows into the volume setting, ensuring that virtual production adapts to live action as closely as possible, Versatile collaborated with Vancouver-based rigging expert Dave McIntosh. McIntosh crafted the bespoke ceiling structure complete with essential catwalk platforms that ensure easy access to the ceiling portion of the volume.

McIntosh and Versatile worked together to engineer the mechanics of the unique ceiling, allowing efficient removal of LED panels so productions can seamlessly integrate diverse filming equipment. This adaptable solution ensures easy access to sets, makes it possible to suspend sets within the volume and facilitates the integration of lighting equipment. It also creates a convenient way for special effects teams to achieve complex and expansive shots and stunts.

“This adaptability opens up new possibilities for filmmakers using virtual production, making it easier to work on the volume and achieve complex shots,” says McIntosh. “It’s a great example of how collaboration in the film industry drives innovation.”

Versatile collaborated with Sohonet to provide production-grade connectivity and networking infrastructure that links Versatile’s Vancouver previsualization studio with the newly built Burnaby stages.

Oscars: Creating New and Old Sounds for The Creator

By Randi Altman

Director Gareth Edwards’ The Creator takes place in 2055 and tells the story of a war between the human race and artificial intelligence. It follows Joshua Taylor (John David Washington), a former special forces agent recruited to hunt down and kill The Creator, who is building an AI superweapon that takes the form of a child.

As you can imagine, the film’s soundscape is lush and helps tell this futuristic tale, so much so that it earned an Oscar nomination for its sound team: supervising sound editors/sound designers Erik Aadahl and Ethan Van der Ryn, re-recording mixers Tom Ozanich and Dean Zupancic and production sound mixer Ian Voigt.

L-R: Ethan Van der Ryn and Erik Aadahl

We reached out to Aadahl to talk about the audio post process on The Creator, which was shot guerrilla-style for a documentary feel.

How did you and Ethan collaborate on this one?
Ethan and I have been creative sound partners now for over 17 years. “Mind meld” is the perfect term for us creatively. I think the reason we work so well together is that we are constantly trying to surprise each other with our ideas.

In a sense, we are a lot harder on ourselves than any director and are happiest when we venture into uncharted creative territory with sound. We’ve joked for years that our thermometer for good sound is whether we get goosebumps in a scene. I love our collaboration that way.

How did you split up the work on this one?
We pretty much divide up our duties equally, and on The Creator, we were blessed with an incredible crew. Malte Bieler was our lead sound designer and came up with so many brilliant ideas. David Bach was the ADR and dialogue supervisor, who was in charge of easily one of the most complex dialogue jobs ever, breaking our own records for number of cues, number of spoken languages (some real, some invented), large exterior group sessions and the complexity of robot vocal processing. Jonathan Klein supervised Foley, and Ryan Rubin was the lead music editor for Hans Zimmer’s gorgeous score.

What did director Gareth Edwards ask for in terms of the sound?
Gareth Edwards wanted a sonic style of “retro-futurism” mixed with documentary realism. In a way, we were trying to combine the styles of Terrence Malick and James Cameron: pure expressive realism with pure science fiction.

Gareth engaged us long before the script was finished — over six years ago — to discuss our approach to this very different film. Our first step was designing a proof-of-concept piece using location scout footage to get the green light, working with Gareth and ILM.

How would you describe the sound?
The style we adopted was to first embrace the real sounds of nature, which we recorded in Cambodia, Laos, Thailand and Vietnam.

For the sound design, Gareth wanted this retro-futurism for much of it, recalling a nostalgia for classic science fiction using analog sound design techniques like vocoders, which were used in the 1970s for films like THX 1138. That style of science fiction could then contrast with the fully futuristic, high-fidelity robot, vehicle and weapon technology.

Gareth wanted sounds that had never been used before and would often make sounds with his mouth that we would recreate. Gareth’s direction for the NOMAD station, which emits tracking beams from Earth’s orbit onto the Earth’s surface, was “It should sound like you’d get cancer if you put your hand in the beam for too long.” I love that kind of direction; Gareth is the best.

This was an international production. What were the challenges of working on different continents and with so many languages?
The Creator was shot on location in eight countries across Asia, including Thailand, Vietnam, Cambodia, Japan and Nepal. As production began, I was in contact with Ian Voigt, the on-location production mixer. He had to adapt to the guerrilla style of filming, inventing new methods of wireless boom recording and new ways of working with the novel camera technology, in close contact with Oren Soffer and Greig Fraser, the film’s directors of photography.

Languages spoken included Thai, Vietnamese, Hindi and Japanese, and we invented futuristic hybrid languages used by the New Asia AI and the robot characters. The on-location crowds also spoke in multiple languages (some human, some robotic or invented) and required a style of lived-in reality.

Was that the most challenging part of the job? If not, what was?
The biggest challenge was making an epic movie in a documentary/guerrilla style. Every department had to work at the top of its game.

The first giant challenge had to do with dialogue and ADR. Dialogue supervisor David Bach mentioned frequently that this was the most complex film he’d ever tackled. We broke several of our own records, including the number of principal character languages, the number of ADR cues, the amount and variety of group ADR, and the complexity of dialogue processing.

The Creator

Tom Ozanich

Dialogue and music re-recording mixer Tom Ozanich had more radio communication futzes, all tuned to the unique environments, than we’d ever witnessed. Tom also wrangled more robotic dialogue processing channels of all varieties — from Sony Walkman-style robots to the most advanced AI robots — than we’d ever experienced. Gareth wanted audiences to hear the full range of dialogue treatments, from vintage-style sci-fi voices using vocoders to the most advanced tools we now have.

The second big challenge was fulfilling Gareth’s aesthetic goal: Combine ancient and fully futuristic technologies to create sounds that have never been heard before.

What about the tank battle sequence? Walk us through that process.
The first sequence we ever received from Gareth was the tank battle, shot on a floating village in Thailand. For many months, we designed the sound with zero visual effects. A font saying “Tank” or “AI Robot” might clue us in to what was happening. Gareth also chose to use no music in the sequence, allowing us to paint a lush sonic tapestry of nature sounds, juxtaposed with the horrors of war.

He credits editors Joe Walker, Hank Corwin and Scott Morris for having the bravery not to use temp music in this sequence, letting the visceral reality of pure sound design carry it.

Our goal was to create the most immersive and out-of-the-box soundscape that we possibly could. Ethan and I led an extraordinary team of artists who never settled for “good enough.” As is so often the case in any art form, serendipity can appear, and the feeling is magic.

One example is for the aforementioned tanks. We spent months trying to come up with a powerful, futuristic and unique tank sound, but none of the experiments felt special enough. In one moment of pure serendipity, as I was driving back from a weekend of skiing at Mammoth, my car veered into the serrated highway median that’s meant to keep drivers from dozing off and driving off the road. The entire car resonated with a monstrous “RAAAAAAAAHHHHHHMMM!!” and I yelled out, “That’s the sound of the tank!” I recorded it, and that’s the sound in the movie. I have the best job in the world.

The incoming missiles needed a haunting quality, and for the shriek of their descent, we used a recording we did of a baboon. The baboon’s trainer told us that if the baboon witnessed a “theft,” he’d be offended and vocalize. So I put my car keys on the ground and pretended not to notice the trainer snatch the keys away from me and shuffle off. The baboon pointed and let out the perfect shriek of injustice.

What about the bridge sequence?
For this sequence, rudimentary, non-AI bomb robots named G-13 and G-14 (à la DARPA) sprint across the floating village bridge to destroy Alfie, an AI superweapon in the form of a young girl (Madeleine Yuna Voyles). We used the bomb robots’ size and weight to convey an imminent death sentence, their footsteps growing in power and ferocity as the danger approached.

Alfie has a special power over technology, and in one of my favorite moments, G-14 kneels before her instead of detonating. Alfie puts her hand to G-14’s head, and during that touch, we took out all of the sound of the surrounding battle. We made the sound of her special power a deep, humming drone. This moment felt quasi-spiritual, so instead of using synthetic sounds, we used the musical drone of a didgeridoo, an Aboriginal instrument with a spiritual undercurrent.

A favorite sonic technique of ours is to blur the lines between organic and synthetic, and this was one of those moments.

What about the Foley process?
Jonathan Klein supervised the Foley, and Foley artists Dan O’Connell and John Cucci brilliantly brought these robots to life. We have many intimate and subtle moments in the film when Foley was critical in realistically grounding our AI and robot characters to the scene.

The lead character, Joshua, has a prosthetic leg and arm, and there, Foley was vital to contrasting the organic to the inorganic. One example is when Joshua is coming out of the pool at the recovery center — his one leg is barefoot, and his other leg is prosthetic and robotic. These Foley details tell Joshua’s story, demonstrating his physical and, by extension, mental complexity.

What studio did you work out of throughout the process?
We did all of the sound design and editing at our facility on the Warner Bros. studio lot in Burbank.

We broke our own record for the number of mixing stages across two continents. Besides working at WB De Lane Lea in London, we used Stages 5 and 6 at Warner Bros. in Burbank. We were in Stages 2 and 4 at Formosa’s Paramount stages and Stage 1 at Signature Post. This doesn’t even include additional predub and nearfield stages.

The sound team with Gareth Edwards at Warner’s Stage 5.

In the mix, both Tom Ozanich and Dean Zupancic beautifully [shifted] from the most delicate and intimate moments to the most grand and powerful.

Do you enjoy working on VFX-heavy films and sci-fi in particular? Does it give you more freedom in creating sounds that aren’t of this world?
Sound is half of the cinematic experience and is central to the storytelling of The Creator — from sonic natural realism to pure sonic science fiction. We made this combination of the ancient and futuristic for the most unique project I’ve ever had the joy to work on.

Science fiction gives us such latitude, letting us dance between sonic reality and the unreal. And working with amazing visual effects artists allows for a beautiful cross-pollination between sound and picture. It brings out the best in both of our disciplines.

What were some tools you used in your work on The Creator?
The first answer: lots of microphones. Most of the sounds in The Creator are real and organic recordings or manipulated real recordings — from the nature ambiances to the wide range of technologies, from retro to fully futuristic.

Of course, Avid Pro Tools was our sound editing platform, and we used dozens of plugins to make the universe of sound we wanted audiences to hear. We had a special affinity for digital versions of classic analog vocoders, especially for the robot police vocals.

The Oscar-nominated sound team for The Creator pictured with director Gareth Edwards.

Finally, congrats on the nomination. What do you think it was about this film that got the attention of Academy members?
Our credo is “We can never inspire an audience until we inspire ourselves,” and we are so honored and grateful that enough Academy members experienced The Creator and felt inspired to bring us to this moment.

Gareth and our whole team have created a unique cinematic experience. We hope that more of the world not only watches it, but hears it, in the best environment possible.

(Check out this behind-the-scenes video of the team working on The Creator.)

Why Egress Fees Are Holding Back M&E

By James Flores

Hollywood has the reputation of being an industry at the forefront of technology, thanks to the advancements of digital filmmaking, VFX and virtual production. But there’s one area where the media and entertainment industry falls short of other industries: new technology powering how files get shared and distributed.

Instead of simply uploading the digital assets of a shoot to the cloud and working remotely, many production companies are still moving physical hard drives as if the internet had never been invented. This is because of a hidden cost involved with the major cloud providers — egress fees (aka download fees). These fees can quickly spiral out of control when a studio tries to embrace the cloud model for all digital assets. Because studios don’t want to run up expensive bills with cloud providers, they’ve now built an entire ecosystem of workarounds to get video files off of sets and into post.

These ecosystems are draining resources by adding complication and subtracting budget, and they are ultimately just as damaging as paying egress fees. The M&E field is small but produces incredible amounts of data. The status quo cloud business model involving egress fees is holding our industry back from taking full advantage of the cloud and unlocking new innovations.

What Is Cloud Egress?
One reason that major cloud providers generate such massive profits is the number of fees and additional charges that they tack on, oftentimes without transparency. This results in huge surprise bills at the end of the month. Egress fees are the cost to customers whenever they move their data files out of the provider’s platform. The average egress fee is $0.09 per gigabyte transferred from storage, regardless of use case. But specific costs are not always apparent and can be difficult to predict. In fact, there’s an entire subindustry of consultants and service providers that manage cloud costs on an organization’s behalf (collecting their own fee in the process). The various fees and charges that don’t seem like much at first glance — or that are presented as just the cost of doing business — quickly add up within common M&E workflows.

The average file size from shooting a single take of a scene is several gigabytes, meaning that even one day of shooting creates a huge price tag anytime footage gets moved in and out of the cloud for multiple rounds of digital effects and editing. It makes planning expenses in advance extremely difficult, as filmmakers can’t know how much it will cost until they’ve uploaded their work to the cloud and started editing. With this virtual roadblock in place, it’s not surprising that many M&E companies feel that it’s unfeasible to embrace the cloud.
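
To make that math concrete, here is a back-of-the-envelope sketch in Python. Only the $0.09-per-gigabyte rate comes from above; the per-take size, take count and number of round trips are illustrative assumptions, not figures from any real production.

    # Rough egress cost model for one day of dailies (illustrative numbers).
    EGRESS_RATE_PER_GB = 0.09   # average egress fee cited above, USD per GB
    TAKE_SIZE_GB = 4            # "several gigabytes" per take, assumed midpoint
    TAKES_PER_DAY = 150         # hypothetical shoot day
    ROUND_TRIPS = 3             # e.g., editorial, VFX and color each pull media

    day_gb = TAKE_SIZE_GB * TAKES_PER_DAY
    cost = day_gb * EGRESS_RATE_PER_GB * ROUND_TRIPS
    print(f"{day_gb} GB/day of footage -> ${cost:,.2f} in egress alone")
    # Prints: 600 GB/day of footage -> $162.00 in egress alone. That is per
    # shoot day, before storage, API and retrieval fees, and it grows with
    # every revision cycle.

Even with these modest assumptions, a multi-week shoot with repeated revision cycles quickly runs into five figures, which is exactly the budgeting unpredictability described above.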

The Production Company Hard Drive Ecosystem
In the absence of cloud storage, an ecosystem of hand-delivering hard drives has sprung up to move and protect video files, which is not necessarily beneficial to a production company’s finances. Here’s how it works:

A specialized courier industry exists to serve production teams that need to physically send files to the right location. There are a number of issues with this approach. First, it creates a delay between filming and post production that can be anywhere from a few hours to several weeks, depending on the distance between the shooting location and the editing rooms.

Second, this process generates unnecessary costs. What immediately comes to mind are the packaging, courier and other travel fees from carrying those files around. But there are hidden costs as well. Companies will have to purchase multiple hard drives as the devices wear out, and they must keep up to dozens of drives on hand at set locations, depending on the duration of a particular shoot. And if those drives get lost or damaged, then the entire cost of shooting is wasted, and expensive reshoots become necessary.

Finally, those digital assets on hard drives aren’t necessarily safe. Transporting work on physical drives means they can be lost, held up by a foreign country’s customs department or even stolen if the production is high-profile enough. This adds even more cost for security and transportation experts to protect files against each of these threats.

There will always be some need for hard drives on shoots, such as in remote locations without internet connectivity, where temporary local storage is required. However, looking at the costs generated by this physical-transfer ecosystem, it seems fair to ask what things would look like if that weren’t the case.

What Could Happen Instead
What’s next is the advent of cloud workflows. Cloud technology has reshaped how most businesses operate. The same is true for the M&E industry. Many different technologies offer the ability to take data (media) directly from a camera’s encoder and move it to the cloud. These camera-to-cloud technologies often create their own data silos; data can only go into the given vendor’s cloud storage, and moving it to other tools invokes costly egress charges. With cheaper cloud egress fees — or even no cloud egress fees at all — production teams could more readily use this cloud workflow, opening up room in studio budgets and speeding up their production time thanks to the elimination of the hard drive ecosystem. This could level the playing field for smaller production companies, as they’d be able to film content much more efficiently.

Companies could focus security investments into digital security, which can be much cheaper than physical methods. Instead of trained guards, companies could rely on encrypted backups and object lock, wherein a user can designate certain objects to be immutable, meaning they cannot be altered or deleted by outsiders and thus are safe from ransomware. They’d also be free to move a lot more post production tools and editing techniques to the cloud, and they could pick and choose where they want to store data or which tools they want to use without worrying about what cloud provider they’d be stuck with.
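
As a rough sketch of how object lock works in practice, here is what writing an immutable object might look like with Python’s boto3 library against an S3-compatible store. The endpoint, bucket and file names are hypothetical placeholders, and the 90-day retention window is an arbitrary choice.

    # Minimal object-lock sketch using boto3 against an S3-compatible store.
    # Endpoint, bucket and file names are hypothetical placeholders.
    from datetime import datetime, timedelta, timezone

    import boto3

    s3 = boto3.client(
        "s3",
        endpoint_url="https://s3.example-provider.com",  # hypothetical endpoint
    )

    # Object lock must be enabled when the bucket is created.
    s3.create_bucket(Bucket="dailies-archive", ObjectLockEnabledForBucket=True)

    # Upload a camera original that cannot be altered or deleted until the
    # retention date passes, even by a compromised credential.
    with open("A001_C001_0101AB.mxf", "rb") as f:  # hypothetical camera file
        s3.put_object(
            Bucket="dailies-archive",
            Key="day-01/A001_C001_0101AB.mxf",
            Body=f,
            ObjectLockMode="COMPLIANCE",  # retention cannot be shortened or removed
            ObjectLockRetainUntilDate=datetime.now(timezone.utc)
            + timedelta(days=90),
        )

In compliance mode, not even the account owner can delete or overwrite the object before the retention date passes, which is what makes locked backups resilient against ransomware.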

It’s Time for a Change
With the WGA/SAG-AFTRA strikes and negotiations thankfully behind us, there’s going to be pressure on everyone to get new films and shows finished as soon as possible. These condensed timelines mean it’s time to ask why it’s acceptable to waste so much money and time on outdated manual processes. That question is not just for the M&E industry but for the cloud industry as well. By keeping exorbitant egress fees in place, cloud providers hurt their own businesses and limit production companies’ potential. Eliminating these fees, or simply reducing them, would be a net benefit for everyone involved.


James Flores has been a working video editor/assistant editor and DIT for over 25 years. He is currently product marketing manager, M&E, at Backblaze.

Puget Systems Debuts Custom Laptops and SDS Storage

Puget Systems has expanded its product offerings beyond custom desktop workstations into the mobile computing market with the introduction of an entirely new category of custom mobile workstations.

Debuting at this year’s HPA Tech Retreat in Palm Springs, the new Puget Mobile 17-inch features high-performance hardware with Intel’s Core i9 14900HX CPU and Nvidia’s GeForce RTX 4090 mobile GPU, all built into a notebook chassis. The 17.3-inch QHD screen has a 240Hz refresh rate and high color accuracy. This combination of high-performance components makes the Puget Mobile 17-inch a good solution for content creators who demand performance, reliability, quality and ultra-smooth workflows in a mobile form factor.

According to Puget Systems, this move expands its strategy of providing broader, more comprehensive solutions for its users’ workflow and performance requirements as they seek ever more flexible, reliable and powerful systems. Based on customer feedback, Puget is looking to partner with companies its users trust for white-glove service, support and industry-specific expertise.

Throughout the early development process of the new Puget Mobile 17-inch, the Puget Labs and R&D teams worked closely with select users from multiple industries to collect feedback and ensure they were on track.

“This laptop is about as close as you can get to the performance of a PC tower while actually having something that still works as a laptop,” reports Niko Pueringer, the co-founder of Corridor Digital, who has been using Puget computers for years. “And it provided all the qualities I’d expect out of a Puget system. Oh, and I also like that it’s not loaded up with promotional bloatware…

“There are a lot of machines out there with high specs. Anyone (with enough money) can buy a 4090 and sling it in a case,” continues Pueringer. “What makes Puget special is that all the supporting pieces get the attention they deserve. With Puget, I know that I don’t have any hidden compromises or bottlenecks. All my USB ports will work at the same time. The heat management is capable of handling 100% loads for extended time. I know that all the pipes between the shiny GPUs and CPUs are big and beefy and ready to handle anything I throw at it. This laptop was no exception.”

The Puget Mobile 17-inch custom laptops will be available for configuration for a wide range of applications beginning in Q2.

Embracing Storage, MAM and Archiving
At HPA, Puget also debuted a new family of custom software-defined storage (SDS) solutions. The new Puget Storage solution, developed in partnership with OSNexus, uses the QuantaStor platform to provide scalable and agile media asset storage for both on-site and remote users.

Available in 12-bay and 24-bay 2U form factors, Puget Storage solutions support up to 1.5TB of RAM and provide growing and established studios with simple, flexible storage and end-to-end security. These scalable, agile media asset storage solutions are suited to post workflows, media asset management applications and archival services with stringent requirements for capacity, performance, security and scalability.

Partnering with OSNexus to integrate its QuantaStor platform provides Puget Storage users with a number of key benefits, including:

  • Storage grid technology: Grid technology unifies management of QuantaStor systems across racks, sites and clouds.
  • Security: Advanced RBAC and end-to-end encryption support; complies with NIST 800-53, NIST 800-171, HIPAA and CJIS; FIPS 140-2 L1 certified.
  • Hardware integration: QuantaStor is integrated with a broad range of systems and storage expansion units, including Seagate, Supermicro and Puget Systems rackmount storage platforms for media and entertainment.
  • Scalability: Integrated with enterprise-grade open storage technologies (Ceph and ZFS).
  • Unified file, block and object: All major storage protocols are supported, including NFS/SMB, iSCSI/FC/NVMeoF and S3.

The new Puget Storage SDS solutions will be available for configuration for a wide range of applications beginning in Q2.

 

 

 

Donal Nolan

Donal Nolan at Helm of Milk’s New Dublin VFX Studio

Milk‘s new Dublin visual effects studio, announced last year, is now fully operational and already at work. One of its first projects is the NBC series Surviving Earth, co-produced by UK indie Loud Minds and Universal Television Alternative Studios. Milk has been tasked with bringing to life prehistoric creatures and environments with high-end VFX.

This new studio, which is located in Dublin’s historic Grafter House, is part of Milk’s strategy to further build its European presence in order to access talent and grow its capabilities and relationships. As part of this strategy, earlier this year Milk launched a new VFX studio in Barcelona and then opened a new studio in Bordeaux, France. And in 2022, Milk acquired the independent, BAFTA Award-winning VFX studio Lola Post Production.

Leading the new Dublin studio is Donal Nolan, who has been named creative head of studio. With Milk since August 2023, Nolan brings a wealth of experience to the role, having spent his VFX career at top global studios, including Dublin’s Egg, Windmill Lane and Union VFX. In 2023, he won an Irish Film and Television Award (IFTA) for his VFX work on The Woman King, a Milk project. He’s also worked on projects including Thor: The Dark World (Marvel Entertainment), The Siege of Jadotville (Netflix), 28 Weeks Later (20th Century Fox), Everybody’s Talking About Jamie (Amazon), The Order (Netflix) and Child’s Play (Orion Pictures), among many others.

Nolan is well-versed in building creative teams and welcomes the opportunity to grow Milk Dublin’s emerging business while continuing to call on VFX talent within the industry in Ireland. He is joined by VFX supervisor Ciaran Crowley, who has also been named part of the leadership team at the Dublin studio.

 

Holdovers

Oscar-Nominated Kevin Tent Talks Editing Workflow on The Holdovers

By Iain Blair

Editor Kevin Tent, ACE, has had a long and fruitful collaboration with director Alexander Payne. Their first film together was 1995’s Citizen Ruth, and he’s edited all of Payne’s films since, including the Oscar-nominated Sideways. Tent earned his first Academy Award nod for his work on The Descendants.

Kevin Tent, ACE. Photo by Peter Zakhary

Tent was just Oscar-nominated again, this time for his work on Payne’s new film The Holdovers, a bittersweet holiday story about three lonely people marooned at a New England boarding school over winter break in 1970. I spoke with Tent about his workflow and editing the film, which got five Oscar noms, including for Best Picture.

How did the process start with Alexander?
When he first had the idea, he didn’t even have a full script yet with writer David Hemingson. I read the first few drafts, and it was in very good shape, even early on. I would give him my comments on the script and stuff like that, and he’d take them or not. Then when they began shooting, I started cutting right away.

Alexander told me you went to the set on his very early films like Citizen Ruth, but not really since.
Yeah, as usual, I stayed here in LA for this film, while he shot in Boston. I do like to go to the set just for one day to say hello to the actors and everyone, but after you’re there for 20 minutes and have nothing else to do, you’re like, “What the hell am I doing here?” So I don’t spend too much time on-set, and anyway, there’s so much work to be done at the cutting room, as I’m doing an assembly while he shoots.

I assume you’re in constant contact during the shoot.
Yes, we talk every day, at least once a day. I usually send him cut scenes for the weekend if he wants to see them, but on the last couple of movies, he hasn’t wanted to watch cut scenes while he has a weekend off. I think he’s got too many other production issues on his plate to have the time to do that.

He doesn’t watch dailies anymore either, and it works out really well because by the time he comes back to the cutting room, he’s going to look at the dailies with fresh eyes. And I know the footage fairly well by then because I’ve been through it and cut it. So we’re both kind of on an even playing field when we start to cut together.

I know he shoots very precisely, so it’s not like there’s a ton of material you have to wade through and cut?
Right, he was really focused on his coverage on this, which was good. He’s always super-smart about coverage. He doesn’t want to burn out his actors on wide shots and masters and stuff like that. So he gets what he thinks he needs to get us in and out of scenes. Then he spends a lot of time letting the actors find their footing and their characters and give their performances.

Holdovers

He’s a four-to-six takes guy on average, but he allows his actors to take their time and get these great performances. It’s our job when we get to the cutting room to try to condense them and make them efficient… pace ’em up, that kind of thing. We’re getting great raw material, and then our challenge is usually trying to get it all moving and flowing together.

He told me that you’d work on the edit at his home in Omaha for a while and then come back to LA when you had to spend a lot more time here for post?
Yeah, we would go there for a month, come back to LA and then go back again. I was there for Citizen Ruth and About Schmidt. I like it back there, and we had a good time. He’s got an awesome place.

We would cut there using Jump Desktop; we’d log into that and do all the editing. It was remarkable, just the most amazing thing. We were able to cut away on the computer in California from his place in Omaha. We would then use Evercast for work sessions with associate editor Mindy Elliott, assistant editor Alyssa Donovan-Browning, music editor Richard Ford and sound supervisor Frank Gaeta. The whole process was efficient and phenomenal.

The whole thing from shooting to finishing was probably nine months. So not overly long. Then we spent a month or so doing the final mix and the DI and all that stuff.

This is Alexander’s first period film, and I loved all the dissolves you used that you don’t see in movies so much anymore.
Yeah, that’s true I think, but we love them. They’re so beautiful. They create emotion, and I was a fan of them even before Alexander and I started working together. I always thought they were amazing, and we’ve been doing them forever, going back to Citizen Ruth. We also did some really long dissolves in Election and in About Schmidt, which has a bunch of really beautiful dissolve sequences when his wife dies. There’s a huge, two-minute dissolve sequence of Warren Schmidt after she passes.

What was the most difficult scene to edit?
There were a couple. It would seem like they were simple, but we spent a lot of time on them. First, the scene where Paul gets fired at the end and takes the hit for the kid. I wouldn’t say we struggled, but we were constantly finessing it, going back and taking things out and putting things back and trying to get it just right. That one took us quite a while.

Then there were scenes that were a little long that needed condensing just to get them right. The first scene, with Mary and Paul watching The Newlywed Game, was a challenge because it had a fair amount of dialogue that we lost. That was a tricky scene because it had a lot of stuff going on in it. It had emotion about her son and her anger with the kids at the school. There were a lot of different transitions and stuff going on characterwise, and that was a challenge in that little area.

I assume you did a fair amount of music temps?
Yes. Mindy Elliott, who’s been our assistant forever but got an associate editor credit this time around, was the first one who imported music from The Swingle Singers, which is the a cappella Christmas music we hear. It was a great call because I was having trouble “hearing” whatever the music would be in the movie. That ended up being a huge element we embraced — using that type of music throughout.

At points it’s ironic and kind of funny, and at other times it’s very poignant, and it became a really important musical element in the movie. We also worked with our music producer/supervisor Richard Ford, and he’s brilliant. He also started bringing in lots of music, including scores from Mark Orton, whom we’d worked with on Nebraska. That became our score sound of the movie. Then we threw in all our fun ‘60s music — that’s just a free-for-all.  It worked great, but then you find out it costs $100,000, and you can’t get it. That happens all the time.

Obviously, it’s not a big visual effects movie, but there are some. Were you doing temps for those as well?
Yeah, we had things like comps and fluid morph, but the visual effects were really all about evening out the snow. There were certain scenes that needed it, like when all the people are leaving the church and there’s snow coming down. Believe it or not, that scene was shot on the same day as the scene where all the boys are talking out in front of the truck. It was blue skies in the morning, then it was snowing like crazy, then a couple hours later, it was all blue skies again.

Crafty Apes did the visual effects, adding snow, wetting the road, putting clouds in the sky, adding some snowflakes at points and trying to make it match a little closer to what was shot earlier in the day… those kind of things.

Tell us about the post workflow and the editing gear you used.
We cut on Avid Media Composer 2018, supplied by Atlas Digital aka Runway Edit. While in production, associate editor Mindy Elliott, assistant editor Alyssa Donovan-Browning and I worked from our homes, and we worked with separate projects, which we kept updated via Dropbox. We had separate drives as well.

Our dailies were provided by Harbor in New York and London, and each morning — sometime between 2am and 6am — they would send us a downloadable link.  I would check my email around 4am, and if I had a link, I’d start the download and then go back to bed. Around 8am, Mindy would use TeamViewer to log on to my computer and copy the organized dailies bins, etc. onto my local drive. Mindy and Alyssa also had local drives. We communicated constantly using a dedicated Telegram Messenger chat, and whenever I needed anything or there was new media (small amounts), we used TeamViewer and Dropbox to download and import it.

Once Alexander was back in town, we moved into a more traditional cutting room in North Hollywood, and we switched over to a Nexis shared media storage in the same building as our cutting rooms. Once we were finalizing the cut, we moved permanently back to LA to finish, and we mixed in Santa Monica with our longtime mixer, Patrick Cyccone.

Were you involved in the DI at all? Did you go to the sessions?
I didn’t go so much on this one because our DP, Eigil Bryld, was shooting in New York, and Alexander and Eigil did it at Harbor in New York with colorist Joe Gawler. I would see it, and when they were done, I’d go and we’d screen it.

What makes it such an enduring partnership with Alexander?
I think we’re both pretty easygoing guys, and we’re both always looking to enjoy life. We take our work seriously, but we don’t ever let it get ugly. We have a good time when we’re working together, and we work hard, but we keep a positive attitude. I guess we’re just very similar in that respect. And we’re pals after all these years, so it’s not even like going to work when we work together. We basically are doing our job and having fun.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.