Tag Archives: David Fincher

Kirk Baxter on Editing David Fincher’s The Killer

By Iain Blair

David Fincher’s The Killer is a violent thriller starring Michael Fassbender as an unnamed hitman whose carefully constructed life begins to fall apart after a botched hit. Despite his mantra to always remain detached and methodical in his work, he lets it become personal after assassins brutally attack his girlfriend, and soon he finds himself hunting those who now threaten him.

L-R: Kirk Baxter and David Fincher

The Netflix film reunites Fincher with Kirk Baxter, the Australian editor who has worked on all of Fincher’s films since The Curious Case of Benjamin Button and who won Oscars for his work on The Social Network and The Girl with the Dragon Tattoo.

I spoke with Baxter about the challenges and workflow.

How did you collaborate with Fincher on this one?
I try not to weigh David down with too many background questions. I keep myself very reactionary to what is being sent, and David, I think by design, isolates me a bit that way. I’ll read the script and have an idea of what’s coming, and then I simply react to what he’s shot and see if it deviates from the script due to the physicality of capturing things.

The general plan was that the film would be a study of process. When The Killer is in control, everything’s going to be deliberate, steady, exacting and quiet. We live in Ren Klyce’s sound design, and when things deviate from The Killer’s plan, the camera starts to shake. I start to jump-cut, the music from composers Trent Reznor and Atticus Ross comes into the picture, and then all of our senses start to get rocked. It was an almost Zenlike stretching of time in the setup of each story then a race through each kill. That was the overarching approach to editing the film. Then there were a thousand intricate decisions that we made along the way each day.

I assume you never go on-set?
Correct. I just get dailies. Then David and I go back and forth almost daily while he is shooting. [Fincher and DP Erik Messerschmidt, ASC, shot in a widescreen anamorphic 2.39:1 aspect ratio with the Red V-Raptor and recorded footage in 8K.] David remains very involved, and he’ll typically text me. Very rarely does he need to call me to talk during an assembly. Our communications are very abbreviated and shorthand. I put assemblies of individual scenes up on Pix, which allows David to be frame-accurate about feedback.

Sometimes I’ll send David selects of scenes, but often on larger scenes, a select sequence can be 30 to 40 minutes long, and it’s difficult for David to consume that much during a day of shooting. So I’ve developed a pattern of sending things that are sort of part edited and part selected. I’ll work out my own path through action, and then I open up and include some multiple choice on performance or nuance — if there are multiple approaches that are worth considering. I like to include David. If I leap to an edit without showing the mathematics of how I got there, often the professor wants to know that you’ve done the research.

The opening Paris stake-out sequence sets up the whole story and tone. I heard that all of Fassbender’s scenes were shot onstage in New Orleans along with the Paris apartment he’s staking out. How tricky was it to put that together?
Yes, it came in pieces. The Paris square and all the exteriors were shot on location in Paris. Then there’s a set inside WeWork, and that came as a separate thing.

What made it more complex was that all the footage of the target across the street came much later than the footage of The Killer’s location. But I still had to create an edit that worked with only The Killer’s side of the story so that Fincher knew he had it. Then he could strike that set and move on. My first version of that scene just had words on the screen [to fill in the blanks] of what was happening across the street. I built it all out with the song “How Soon Is Now?” by The Smiths and The Killer’s inner monologue, which allowed me to work out Fassbender’s best pieces. Then, when I eventually got the other side of the footage, I had to recalibrate all of it so that it wasn’t so pedantic. I had to work out ways to hide the target by the size of the POV or stay on The Killer’s side to allow the scene to stretch to its perfect shooting opportunity, ladling suspense into it.

What were the main challenges of editing the film?
For me, it was a complex film to edit due to how quiet and isolated the lead character is. In the past, I’ve often edited scenes that have a lot of characters and conversation, and the dialogue can help lead you through scenes. There’s a musicality to voices and talking that sometimes makes it obvious how to deliver or exploit the information. Crafting a silent, exacting person moving through space and time called for a different muscle entirely. I often used Fassbender’s most subtle micromovements to push things along. We are always obsessing over detail with Dave’s films, but the observational study of a methodical character seemed to make the microscope more powerful on this one.

It’s very much a world of seeing what he sees, and his temperature controls the pace of the movie. He slows things down; he speeds things up. And that’s the way David covers things — there are always a lot of angles and sizes. There are a lot of choices in terms of how to present information to an audience. It was a very fiddly film to perfect.

As you note, there’s very little dialogue, but there’s a lot of voiceover. Talk about handling that.
Anytime you deal with voiceover, it’s always in flux. It’s quite easy to keep writing a voiceover — keep moving it, keep streamlining it, removing it, bringing it back. That all impacts the picture. We recorded Fassbender performing his monologue four different times, and he became more internal and quieter with the delivery each time. In editing the sniper scene and playing The Smiths in The Killer’s headphones at full volume over all of his POVs, I had to time his voiceover to land on the coverage. That then became a language that we applied to the entire film. POVs never had voiceover, even on scenes when The Killer wasn’t playing music. It created a unique feeling and pacing that we enjoyed.

What was the most difficult scene to cut and why?
The scene with the secretary, Dolores, begging for her life in the bathroom was very challenging because it’s somewhat torturous watching an almost innocent person about to be killed by our lead character. There was a lot of nuance in her performance, so we had to figure out how to manipulate it to make it only slightly unbearable to watch. And that’s always my role. I’m the viewer. I’m the fan. Because I’m not on-set, I’m often the one who’s least informed and trying to make sense of things, learning as I go.

I think the scene with Tilda Swinton (The Expert) was rather difficult as well, probably because she’s so good. The scene was originally a lot longer than it is now, but I had to work the scene out based on what Fassbender was doing, not what Tilda was doing. There are only so many times you can cut to The Killer and his lack of response without diluting that power. So I reduced the scene by about a third in order to give more weight to the lack of vocalization, pushing things forward with the smallest facial performances. That was the scene we played with the most.

There is some humor in the film, albeit dark humor. How tricky was it trying to maneuver that and get it exactly right?
I think there’s always dark humor in David’s movies. He’s a funny guy. I love the humor in it, especially in the fight scene. There’s such brutality in that physical fight scene, and the humor makes it easier for the audience to watch. It gives you pauses to be able to relax and catch up and brace yourself for what’s coming.

There’s also humor in the voiceover throughout the film. I had to work out the best possible timing for the voiceover and decide what we did and didn’t need. There was a lot of experimentation with that.

Did you use a lot of temp sound?
There was some underwhelming temp stuff that we put in just to get by, but usually sound designer Ren Klyce comes in and does a temp mix before we lock the film. From that point on, we continue to edit with all of his mix splits, which is incredibly helpful.

The same goes for Trent Reznor and Atticus Ross. They scored about 40 minutes of music very early in the process, and that’s how I temped the music in the film — using their palette so we didn’t have to do needle-drops from other films. Working with their music and finding homes for the score is probably the most enjoyable part of film editing for me.

What about temping visual effects?
We do temp effects when they’re based on storytelling and timing, and there are always so many split screens. David often keeps shots locked off so that we can manipulate within a frame using multiple takes. There are a lot of quiet visual effects that are all about enhancing a frame. And we are constantly stabilizing camera work — and in this case destabilizing, adding camera shake during the fight or flight scenes.

There’s a lot of that sort of work with David, so I don’t need to get bogged down with it when I’m getting ready to lock a cut. That all comes afterward, and it’s [all about] enhancing. We mostly communicate storytelling and timing and know that we’re secure in our choices — that’s what I need to deal with while editing.
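
To picture the kind of within-frame fix Baxter is describing, here is a minimal, hypothetical sketch in Python/NumPy (not a tool the team actually used) of a basic split-screen composite: two locked-off takes of the same framing blended across a feathered seam so the best half of each can be kept.

```python
# Hypothetical illustration only: combining two locked-off takes of the same
# framing into a single "split screen" frame. Assumes both takes are loaded
# as float arrays of identical shape (height, width, 3).
import numpy as np

def split_screen(take_a: np.ndarray, take_b: np.ndarray,
                 seam_x: int, feather: int = 40) -> np.ndarray:
    """Use take_a left of seam_x and take_b right of it, with a soft blend."""
    h, w, _ = take_a.shape
    x = np.arange(w, dtype=np.float32)
    # 0.0 means fully take_a, 1.0 means fully take_b, ramped across the feather zone
    blend = np.clip((x - (seam_x - feather / 2)) / feather, 0.0, 1.0)
    matte = np.tile(blend, (h, 1))[..., None]          # shape (h, w, 1)
    return take_a * (1.0 - matte) + take_b * matte

# Example: keep the better performance on the left of frame from take A
# and the cleaner background on the right of frame from take B.
take_a = np.random.rand(1080, 1920, 3).astype(np.float32)
take_b = np.random.rand(1080, 1920, 3).astype(np.float32)
combined = split_screen(take_a, take_b, seam_x=960)
```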

Did you do many test screenings?
There’s always a trusted crowd that David will show it to, but we didn’t do test screenings in the conventional sense of bringing in piles of strangers to see how they respond. David’s more likely to share with filmmakers and friends.

How long did the edit take to complete?
It was close to a year and then David reshot two scenes. When David’s in Los Angeles, I like to work out of his office in Hollywood so he can casually pop in and out of the cutting room. Then we picked up and went to the south of France to Brad Pitt’s property Miraval. They have cutting rooms there. We worked there in the summer for a couple of months, which was incredible and very focused.

I heard you’re not much of a tech head.
For me, editorial is more of a mindfuck. It’s a head game, much like writing. I’m focused on what I’m crafting, not on data management. I can be like that because I’ve got a great team around me that is interested and curious about the tech.

I have no curiosity in the technology at all. It just allows me to do my work efficiently. We cut on Adobe Premiere, and we have done for quite a few movies in a row. It is an excellent tool for us — being able to share and pass back and forth multiple projects quickly and effortlessly.

You’ve cut so many of Fincher’s films, but this was a very different type of project. What was the appeal?
I was very excited about this film. There’s a streamlined simplicity in the approach that I think is quite opposite to a lot of movies being done right now in this type of genre. And it felt somewhat punk rock to strip it back and present a revenge film that applied the rules of gravity to its action.

Finally, what’s next? Have you got another project with him on the horizon?
David’s always sitting on a bunch of eggs waiting for one to hatch. They all have their own incubation speed. I try not to badger him too much about what’s coming until we know it’s in the pipeline.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and The Boston Globe.


Giving Color and Darkness to David Fincher’s The Killer

The Killer is David Fincher’s latest action/thriller movie based on the French graphic novel series of the same name. Starring Michael Fassbender and Tilda Swinton, the film follows an assassin who gets embroiled in an international manhunt after a hit goes wrong.


Colorist Eric Weidt

The movie, which made its premiere at the 80th annual Venice Film Festival and is available on Netflix starting November 10, featured another collaboration between cinematographer Erik Messerschmidt, ASC; director David Fincher; and colorist Eric Weidt. Weidt has been working with Fincher since 2014, and the trio previously joined forces on Mank and the Mindhunter series.

A US and French citizen, freelance colorist Weidt has spent the past 15 years in Paris working with fashion photographers and filmmakers for the likes of Vogue and Pop Magazine, among others.

“I began with Fincher doing beauty work in Nuke in 2014,” Weidt says. “I showed him a reel of color work I’d done while working in fashion in Paris, and I think he figured that squared perfectly with the kind of precision he wanted to get into by grading his own projects in-house.”

LUTs
As with all his collaborations with Fincher, Weidt was involved right from preproduction, working closely with Messerschmidt on test footage and creating LUTs.

“We created the show LUTs together for shooting the different locations, which we drove home in the grade,” Weidt explains. “I started working with test footage early on, especially onstage, to process green/bluescreen material. I prepared LED wall and LED direct projection material that is used simultaneously.”
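
Weidt doesn’t detail the show LUTs themselves, so as a purely illustrative sketch, here is the kind of .cube file such looks are commonly delivered in, filled with an arbitrary gentle contrast curve and warm bias rather than anything from the actual production:

```python
# Hypothetical illustration only: the article doesn't describe the show LUTs,
# so this simply sketches the common .cube format such LUTs are often delivered
# in, filled with an arbitrary mild gamma adjustment and a slight warm cast.
import numpy as np

SIZE = 17  # a 17x17x17 grid is a typical size for a monitoring LUT

def sample_look(r: float, g: float, b: float) -> tuple:
    rgb = np.array([r, g, b])
    rgb = rgb ** 1.1                      # mild gamma/contrast adjustment
    rgb *= np.array([1.03, 1.0, 0.97])    # arbitrary warm bias
    return tuple(np.clip(rgb, 0.0, 1.0))

with open("show_look_v01.cube", "w") as f:
    f.write("TITLE \"hypothetical show look\"\n")
    f.write(f"LUT_3D_SIZE {SIZE}\n")
    # .cube convention: the red index varies fastest, then green, then blue
    for b_i in range(SIZE):
        for g_i in range(SIZE):
            for r_i in range(SIZE):
                r, g, b = sample_look(r_i / (SIZE - 1),
                                      g_i / (SIZE - 1),
                                      b_i / (SIZE - 1))
                f.write(f"{r:.6f} {g:.6f} {b:.6f}\n")
```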

The movie is set across multiple locations, including Paris, the Dominican Republic and Chicago, which all required distinct looks.

“There’s Paris at night, with tungsten street lighting; there’s the Dominican Republic, with a misty-humid-warm look going on; and there’s this winter-white Chicago material,” explains Weidt. “And Erik wanted to try using a halation filter (Scatter) on a lot of the subtropical location material and on some snow-mist scenes. It worked wonderfully.”

The filmmakers chose the Red V-Raptor [8K W and XL] camera along with the Komodo.

The Look
Se7en screenwriter Andrew Kevin Walker also reteamed with Fincher on The Killer script, which was adapted from the graphic novel by Alexis “Matz” Nolent. “I think Fincher is a big comic book fan,” Weidt says. “Especially in the art of framing the drawn cells to impart the beats of a story. Working in film, I think he brings that to movement as well — he’s seeking a kind of effortless visual rhythm that makes you forget that it is highly constructed.”

The look of the movie was inspired by Jean-Pierre Melville’s Le Samourai. “Le Samourai is similar to The Killer,” explains Weidt. “Especially the mood and ‘grey precision,’ which is possibly what Fincher was inspired by. But nowadays we have many more tools, including HDR and cameras that practically have night vision. So, dynamically, we’re able to have something that is super-rich and subtle. David likes to push and pull colors, but he always strikes an unconscious balance. The audience feels like the story is methodical and deliberate — only going out of control at precise moments.”

Weidt recalls how they started with a “yellow-blue split in Paris, a saturated warm-chocolate shadows look in the Dominican Republic and an ice-cold northern US look,” which evolved according to the needs of each scene.

“There was an extended fight scene where black was the modus operandi,” he says. “We wanted to push for detail, all the while knowing that more darkness would make it scarier. So we needed to strike a perfect balance.”

In terms of his toolbox, Weidt has been grading on FilmLight Baselight for 7 years. “On the grading of The Killer, multipaste got used a lot because I took whole scenes and ordered them according to camera angle so that I could balance them with impunity. I also used EXR alpha channels a ton. I asked for multiple passes of some of the CG work in Paris too, so that I could make sure every element was consistent.”

Weidt says the best thing about Baselight is its “organizational prowess” and he is looking forward to using the tools in the forthcoming Baselight 6.0 release. “You can wrangle shots and scenes in no time, group grade, multi-paste, 2-3-4-6-9 up views,” he explains. “Although I did not use it on The Killer, my very favorite thing about Baselight is the new version, 6.0. You’ve got Chromogen, X Grade and myriad new features.”

Dolby Vision
The master grade for The Killer is done in PQ P3 D65 @ 1000 nits, and Weidt derived both the Rec. 709 SDR and the DCI-P3 theatrical trim passes using Dolby Vision per-shot analysis.

“This was the first time I’d done a theatrical with the Dolby Vision 48-nit transform, which came out a year or two ago and is now incorporated into Baselight,” Weidt comments. “It worked great and got me 90% of the way there. The rest is dosing each scene with the Dolby Trim and then fine-tuning per shot, if and when it’s needed.

“I did almost all of the grade on a CLED wall that simulated the 1,000:1 contrast ratio I’d be getting on a projector. Once projected on a Xenon bulb, I made a slight contrast increase, and we were done.”
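
For readers unfamiliar with PQ, a short sketch of the standard SMPTE ST 2084 inverse EOTF (the curve a PQ master is encoded with, not anything specific to Weidt’s Baselight setup or the Dolby trim itself) shows why a 1,000-nit HDR master and a 48-nit theatrical target occupy very different parts of the same signal range:

```python
# Sketch of the SMPTE ST 2084 (PQ) inverse EOTF, the standard curve used to
# encode a PQ master like this one. It just shows where different peak
# luminances land on the 0-1 PQ signal scale.
m1 = 2610 / 16384
m2 = 2523 / 4096 * 128
c1 = 3424 / 4096
c2 = 2413 / 4096 * 32
c3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Map absolute luminance (cd/m^2) to a 0-1 PQ signal value."""
    y = (nits / 10000.0) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

for peak in (48, 100, 1000, 10000):
    print(f"{peak:>5} nits -> PQ signal {pq_encode(peak):.3f}")

# A 1,000-nit HDR peak sits around 0.75 on the PQ scale, while a 48-nit
# theatrical peak sits near 0.44, which is part of why the derived SDR and
# theatrical versions need per-shot trims rather than one global transform.
```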

Weidt spent a year on the grade for this project, and one of the most challenging parts was creating a realistic feel from composited material. “We used defocus a lot to drive home the right amount of depth, which can improve with subtle changes — sometimes using the alphas provided by the VFX vendors, sometimes arbitrarily.”

Weidt says he is proud of the project and his work with Fincher. “He’s told me post production is his favorite part of filmmaking. He may have been joking, but I take it seriously. I think some scenes in The Killer really sing, colorwise, and I have to say, when working with Fincher and Erik Messerschmidt, that’s not hard to achieve.”

 

Ollin VFX Delivers Mank Shots During Lockdown

Mexico City-based Ollin VFX specializes in creating computer-generated imagery for television and film. Run by co-founder/president Alejandro Diego, the company works with clients worldwide to provide concept design, previz and postviz, 2D compositing, 3D/CG effects and animation.

Alejandro Diego

Work is spread out between Ollin’s headquarters in Mexico City and artists in Argentina, Brazil, Colombia, India and Hollywood. Their visual effects work can be seen in films like Jumanji, Godzilla, Deadpool and The Curious Case of Benjamin Button.

When COVID-19 necessitated a lockdown that required their artists to work from home, they had to figure out how to meet fast-approaching deadlines for two major projects: an Amazon-original zombie TV series called Operation 8888 and David Fincher’s Mank, a movie about screenwriter Herman J. Mankiewicz’s development of the 1941 classic Citizen Kane.

Mank was shot in black-and-white at 6K resolution — much higher than the usual 2K or 4K — which made creating the visual effects creatively challenging due to the variations in shades of gray.

“Immediately, all of our artists started working from home and connecting via VPN to their workstations at the office,” explains Diego. When the lockdown happened in March, the studio needed to figure out a reliable and fast way for everyone to easily access and work on files ranging in size from a few MB to several GB from multiple remote locations.

Ollin had just started using DEI’s ExpeDat accelerated file transport software a few months before COVID-19 hit. When the shutdown happened, Diego accepted an offer from DEI to increase the capacity of their existing ExpeDat license from four to eight, at no additional cost, to help accommodate the increased demand for ingest and other file transfer activities. Having this kind of workflow setup helped Ollin as they created and delivered VFX for Mank. We reached out to Diego to find out more about his studio’s workflow.

             

How early did you get involved on Mank?
In February 2020, we started talking, communicating, bidding and discussing what Ollin VFX would be doing for the Mank production.

Can you describe the project and how many shots — and what kind they were?
We did 269 shots that ended up being used in the film. Every take — with durations varying from three to four seconds — was organized into sequences and then into episodic groupings. By the time we got involved to do the visual effects, the final cut of the movie was locked.

Can you describe the VFX shots?
We did a lot of set extensions. Since Mank is a period piece reflecting the 1930s, we had to recreate the MGM lot and create a lot of little things. There is a scene — a dinner in the main character’s home mansion — for which we created the fire in the fireplace and smoke from the chimney.

       

The visual effects we created for Mank were all photorealistic in the sense that they weren’t the kind of effects you’d create for an action or sci-fi film. The whole idea is that you’d never know visual effects were used. The set extensions included sky replacements and set enhancements. In some cases, we removed modern objects like a lamp.

It was challenging because the film is black and white, so we had to adjust how we work. Not having color does represent challenges when trying to isolate objects. This is why we usually use greenscreen, but for this film we did not have that luxury — everything was rotoscoped.
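
To illustrate the point about isolation without color, here is a generic sketch (not Ollin’s actual pipeline): with RGB footage a matte can be pulled procedurally from chroma, but in a monochrome frame the only signal left to key on is luminance, which the subject and the background often share, hence the rotoscoping.

```python
# Generic illustration, not Ollin's pipeline: with color footage you can pull
# a matte from chroma; in a black-and-white frame only luminance is left to
# key on, so objects of similar brightness cannot be separated procedurally
# and have to be rotoscoped instead.
import numpy as np

def chroma_key_matte(rgb: np.ndarray) -> np.ndarray:
    """Crude green-screen key: how much greener a pixel is than its other channels."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return np.clip(g - np.maximum(r, b), 0.0, 1.0)

def luma_matte(gray: np.ndarray, lo: float, hi: float) -> np.ndarray:
    """Luminance key: the only procedural option once color is gone."""
    return np.clip((gray - lo) / max(hi - lo, 1e-6), 0.0, 1.0)

# A pure-green background keys cleanly in color...
green_bg = np.zeros((2, 2, 3), dtype=np.float32)
green_bg[..., 1] = 1.0
print(chroma_key_matte(green_bg))              # 1.0 everywhere: clean separation

# ...but a mid-gray subject on a mid-gray set, as in a monochrome frame,
# produces a useless luma matte.
gray_frame = np.full((2, 2), 0.5, dtype=np.float32)
print(luma_matte(gray_frame, lo=0.4, hi=0.6))  # ~0.5 everywhere: no separation
```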

You finished adding the visual effects in September and the film was available on Netflix mid-November. How was that quick turnaround possible?
Normally, we go through several iterations of the visual effects with guidance from the director. Director David Fincher knows exactly what he wants. Some visual effects are simpler — fixed and done. The great thing about us working with Fincher for 13 years (on projects such as House of Cards, Gone Girl, Girl With the Dragon Tattoo and Mindhunter) is we know his taste and what he expects, which helps a lot.

What tools did you use?
For compositing we used Foundry Nuke. For dynamics we used SideFX Houdini. These were our two main pieces of software. ExpeDat was used for sending and receiving data from all of our artists. This saved us a lot of time because we could all work directly from the ExpeDat server without having to move data to the cloud and back down again. As for hardware, we use all Linux workstations that we built.

Can you walk us through getting your artists up and running in March to work remotely?
We were lucky — we had just contracted to install a 1Gb/s high-speed dedicated internet line for data transfer, so we were able to repurpose that for artists to access our office remotely. Using VPN and fast internet access, our artists could easily log in to their local workstations in the office. All they needed was a monitor and a mouse, just as if they were in the office. The setup is secure because files never leave Ollin’s servers in Mexico City.

What were some of the challenges in working this way?
The main issue was communication between all the artists and the VFX supervisor. We all had to learn to live in a world communicating through Zoom meetings and Google Meet. That was the most challenging, but the artists got used to working remotely quickly.

When COVID was hitting Europe, we were starting to send people home before it was mandated because we knew it was coming. We were fortunate that we never had any issues with people getting sick at the office.

Any shots in particular that were most challenging, and why?
One scene that was particularly challenging was the set extension for the MGM lot. We had to create the period signage on the lot and the background buildings from that time, and recreate the LA skyline from that era so it was accurate across a span of several years in the late 1930s.

Do you expect to keep working this way even after COVID?
Yes. It will be a mixture of how we used to work and how we work now. People prefer to be in the office, but if they have a long commute, working remotely opens new possibilities we did not take seriously before.

It would be great if everyone was in the office one or two days a week to connect directly with the team and share ideas. I personally like to be face-to-face with the team because I think certain communication is lost when it’s just done in Zoom meetings. But remote work opens a world of possibilities to attract and work with experienced talent no matter where they are located.

In general, working on Mank was really amazing and a pleasure. It is such a great movie in terms of all the details — the style of it — the fact it was shot in black and white. We had some experience with black and white after working on the film Roma in 2018.

Kirk Baxter Talks Editing Workflow on David Fincher’s Mank

By Oliver Peters

David Fincher’s Mank follows Herman Mankiewicz during the time he was writing the classic film Citizen Kane. Mank, as he was known, wrote or co-wrote about 40 films, often uncredited, including the first draft of The Wizard of Oz. Together with Orson Welles, he won an Oscar for the screenplay of Citizen Kane, but it’s long been disputed whether or not he, rather than Welles, actually did the bulk of the work on the screenplay.

Kirk Baxter

The script for Mank was penned decades ago by David Fincher’s father Jack, and was brought to the screen thanks to Netflix this past year. Fincher deftly blends two parallel storylines: Mankiewicz’s writing of Kane during his convalescence from an accident and his earlier Hollywood experiences with the studios, as told through flashbacks.

Fincher and director of photography Erik Messerschmidt, ASC (Mindhunter), used many techniques to pay homage to the look of Citizen Kane and other classic films of the era, including shooting in true black-and-white with Red Helium 8K Monochrome cameras and Leica Summilux lenses. Fincher also tapped other frequent collaborators, including Trent Reznor and Atticus Ross for the score and Oscar-winning editor Kirk Baxter, ACE, who won for the Fincher films The Girl With The Dragon Tattoo and The Social Network.

I recently caught up with Baxter, who runs Exile Edit, to discuss Mank — starring Gary Oldman in the main role — the fourth film he’s edited for David Fincher.

Citizen Kane is the 800-pound gorilla. Had you seen that film before this or was it research for the project?
I get so nervous about this topic because with cinephiles, it’s almost like talking about religion. I had seen Citizen Kane when I was younger, but I was too young to appreciate it. I grew up on Star Wars, Indiana Jones and Conan the Barbarian, then I advanced my tastes to the Godfather films and French Connection. Citizen Kane is still just such a departure from all of that. I was kind of like, “What?” That was probably in my late teens.

I went back and watched it again before the shoot and after reading the screenplay. There were certain technical aspects to the film that I thought were incredible. I loved the way Orson Welles chose to leave his scenes by turning off lights like it was in the theater. There was this sort of slow decay and I enjoy how David picked up on that and took it into Mank. Each time one of those shots came up in the bungalow scenes, I thought it was fantastic.

In regard to how close David took the stylings [of that era], well, that was more his tightrope walk. So, I felt no shackling to slow down an edit pace or stay in masters or stay in 50-50s as might have been common in the genre. I used all the tools at my disposal to exploit every scene the best I could.

Since you are cutting while the shooting goes on, do you have the ability to ask for coverage that you might feel is missing?
I think a little bit of that goes on, but it’s not me telling Fincher what’s required. It’s me building assemblies and giving them to David as he’s going, and he will assess where he’s short and where he’s not. When you’re with someone with the ability that Fincher has, then I’m in a support position of helping him make his movie as best he can. Any other way of looking at it is delusional. But I take a lot of pride in where I do get to contribute.

Mank is a different style of film than Fincher’s previous projects. Did that change the workflow or add any extra pressure?
I don’t think it did for me. I think it was harder for David. The film was in his head for so many decades, and there were a couple of attempts to make it happen. Obviously, a lot changes in that time frame. So, I think he had a lot of internal pressure about what he was making. For me, I found the entire process to be really buoyant, bubbly and fun.

As with all films, there were moments when it was hard to keep up during the shoot, and definitely moments coming down to that final crunch. That’s when I really put a lot of pressure on myself to deliver cut scenes to David to help him. I felt the pressure of that, but my main memory of it really was one of joy. Not that the other movies aren’t, but I think sometimes the subject matter can control the mood of the day. For instance, in other movies, like Dragon Tattoo, the feeling was a bit like your head in a vise when I look back at it.

Dragon Tattoo is dark subject matter. On the other hand, Gary Oldman’s portrayal of Mankiewicz really lights up the screen.
Right. I loved all the bungalow scenes. I thought there was so much warmth in those. I had so much compassion for the lead character, Mank. Those scenes really made me adore him. But also when the flashback scenes came, they’re just a hoot and great fun to put together. There was this warmth and playfulness of the two different opposing storylines. No matter which one turned up, I was happy to see it.

Was the inter-cutting of those parallel storylines the way it was scripted? Or was that a construction in post?
Yes, it was scripted that way. There was a little bit of pulling at the thread later. Can we improve on this? There was a bit of reshuffling later on, and then working out that “as written” was the best path. We certainly kicked the tires a few times. After we put the blueprint together, mostly the job became tightening and shortening.

This was a true black-and-white film shot with modified, monochrome Red cameras. So not color and then changed to black-and-white in the grade. Did that impact your thinking in how to tackle the edit?
For the first 10 minutes. At first you sit down and you go, “Oh, we work in black-and-white.” And then you get used to it very quickly. I forwarded the trailer when it was released to my mother in Australia. She texted back, “It’s black-and-white??” [Laughs.] You’ve got to love family!

Backgrounds and some sets used visual effects but also classic techniques, like rear projection. What about the effects in Mank?
As in most of David’s movies, they’re everywhere, and a lot of the time it looks invisible, but things are being replaced. I’d say almost half the movie. We’ve got a team that’s stabilizing shots as we’re going. We’ve got an in-house visual effects team that is building effects just to let us know that certain choices can be made.

The split screen thing is constant, but I’ll do a lot of that myself. I’ll do a fairly haphazard job of it and then pass it on for our assistant editors to follow up on. Even the montage kaleidoscope effect was all done in-house down the hall by Christopher Doulgeris, one of our VFX artists. A lot of it is farmed out, but a fair slice is done under the roof.

You used Adobe Premiere Pro to cut this film. Can you talk about that?
It’s best for me not even to attempt to answer technical questions. I don’t mind exposing myself as a luddite. My first assistant editor, Ben Insler, set it up so that I’m able to move the way I want to move. For me, it’s all muscle memory. I’m hitting the same keystrokes that I was hitting back when we were using Avid. Then I crossed those keys over to Final Cut and then over to Premiere Pro.

In previous versions, Premiere Pro required projects to contain copies of all the media used in that project. As you would hand the scene off to other people to work on in parallel, all the media would travel into that new project, and the same was true when combining projects back together to merge your work. You had monstrously huge projects with every piece of media, and frequently duplicate copies of that media, packed into them. They often took 15 minutes to open. Now Adobe has solved that and streamlined the process. They knew it was a massive overhaul, but I think that’s been completely solved. Because it’s functioning, I can now purely concentrate on the thought process of where I’m going in the edit. I’m spoiled with having very technical people around me so that I can exist as a child. [Laughs.]

How was the color grade handled?
We had Eric Weidt working downstairs at Fincher’s place on Baselight. David is really fortunate that he’s not working in this world of “Here’s three weeks for color. Go into this room each day, and where you come out is where you are at.” There’s an ongoing grade that’s occurring in increments and traveling with the job that we’re doing. It’s updated and brought into the cut. We experience editing with it, and then it’s updated again and brought back into the cut. So it’s this constant progression.

In the past you’ve said your method of organizing a selects reel was to string out shots in the order of wide shots, mediums and closeups. Then you bump up the ones you like and reduce the choices before those were presented to David as possible selects. Did you handle it the same way on Mank?
Over time, I’ve streamlined that further. I’ve found that if I send something that’s too long while he’s in the middle of shooting that he might watch the first two minutes of it, give me a couple of notes of what he likes and what he doesn’t like, and move on. So, I’ve started to really reduce what I send. It’s more cut scenes with some choices. That way I get the most relevant information and can move forward.

With scenes that are extremely dense, like Louis B. Mayer’s birthday party at Hearst’s, it really is an endless multiple choice of how to tackle it. I’ll often present a few paths. Here’s what it is if I really hold out these wides at the front and I hang back for a bit longer. Here’s what it is if I stay more with Gary [Oldman] listening. It’s not that this take is better than the other take, but more options featuring different avenues and ways to tell the story.

I like working that way, even if it wasn’t for the sake of presenting it to David. I can’t watch a scene that’s that dense and go, “Oh, I know what to do.” I wouldn’t have a clue. I like to explore it. I’ve got to turn the soil and snuff the truffles and try it all out. And then the answers present themselves. It all just becomes clear. Unfortunately, the world of the editor, regardless of past experiences, is always destined to be filled with labor. There is no shortcut to doing it properly.

Were you able to finish Mank before the virus-related lockdowns started? Did you have to move to a remote workflow?
The shooting had finished, and we already had the film assembled. I work at a furious rate while David’s shooting so we can interface during the shoot. That way he knows what he’s captured, what he needs, and he can move on and strike sets, release actors, etc. There’s this constant back and forth.

At the point when he stops shooting, we’re pretty far along in terms of replicating the original plan, the blueprint. Then it’s what I call the sweeps, where you go back to the top and just start sweeping through the movie, improving it. I think we’d already done one of those when we went remote. So it was very fortunate timing.

We’re quite used to it. During shooting, we work in a remote way anyway. It’s a language and situation that we’re completely used to. I think from David’s perspective, it didn’t change anything.

If the timing had been different and you would have had to handle all of the edit under remote conditions, would anything change? Or would you approach it the same way?
Exactly the same. It wouldn’t have changed the amount of time that I get directly with David. I don’t want to give the impression that I cut this movie and David was on the sidelines. He’s absolutely involved but pops in and out and looks at things that are made. He’s not a director that sits there the whole time. A lot of it is, “I’ve made this cut; let’s watch it together. I’ve done these selects; let’s watch them together.” It’s really possible to do that remotely.

I prefer to be with David when he’s shooting and especially in this one that he shot in Los Angeles. I really tried to have one day a week when we got to be together on the weekends and his world quieted down. David loves that. I would sort of construct my week’s thinking toward that goal. If on a Wednesday I had six scenes that were backed up, I’d sort of think to myself, “What can I achieve in the time frame before David’s with me on Saturday? Should I just select all these scenes, and then we’ll go through the selects together? Or should I tackle this hardest one and get a good cut of that going?”

A lot of the time I would choose — if he was coming in and had the time to watch things — to do selects. Sometimes we could bounce through them just from having a conversation of what his intent was and the things that he was excited about when he was capturing them. With that, I’m good to go. Then I don’t need David for another week or so. We were down to the shorthand of one sentence, one email, one text. That can inform me with all the fuel I need to drive cross-country.

The film’s back story clearly has political overtones that have an eerie similarity to 2020. I realize the script was written a while back at a different time, but was some of that context added in light of recent events?
That was already there. But it really felt like we are reliving this now. In the beginning of the shutdown, you didn’t quite know where it was going to go. The parallels to the Great Depression were extreme. There were a lot of lessons for me.

The character of Louis B. Mayer slashes all of his studio employees’ salaries to 50%. He promises to give every penny back and then doesn’t do it. I was crafting that villain’s performance, but at the same time, I run a company [Exile Edit] that has a lot of employees in Los Angeles and New York. We had no clue if we would be able to get through the pandemic at the time when it hit. We also asked staff to take a pay cut so that we could keep everyone employed and keep everybody on health insurance. But the moment we realized we could get through it six months later, there was no way I could ever be that villain. We returned every cent.

I think most companies are set up to be able to exist for four months. If everything stops dead — no one’s anticipating that 12-month brake pull — it was really, really frightening. I would hope that I would think this way anyway, but having crafted that villain’s performance, there was no way I was going to replicate it.


Oliver Peters is an award-winning film and commercial editor/colorist. His tech reviews, analysis, and interviews have appeared in numerous industry magazines and websites.

Deadpool’s Premiere Pro editing workflow

By Nicholas Restuccio

Director Tim Miller’s Deadpool is action-packed, vulgar (in a good way) and a ton of fun. It’s also one of the few Hollywood blockbusters to be edited entirely on Adobe’s Premiere Pro.

On the Saturday following the film’s release, Adobe hosted a panel on the Fox Studios lot that included Deadpool’s post supervisor Joan Bierman, first assistant editor Matt Carson and Adobe consultants Vashi Nedomansky and Mike Kanfer. Here are some takeaways…

Why Premiere Pro?
According to Bierman, much of the credit for choosing Premiere Pro for the edit goes to Tim Miller. “Even before we had a crew, Tim knew he wanted to do this,” she said. Miller, a first-time feature director, is no stranger to technology — he is co-founder of Culver City’s Blur Studio, which specializes in visual effects and animation.

Miller’s friend, director David Fincher, is a big advocate of Adobe Premiere. It’s likely that Fincher’s use of it to edit Gone Girl — the first feature cut with the product — inspired Miller. The rest of the credit goes to Ted Gagliano, president of post production at Fox, for giving the go-ahead for the road less taken.


Training and Storage
The first step in this undertaking was getting all the editors and assistants — who were used to editing on Media Composer and Final Cut — trained on Premiere. So they brought in editor Vashi Nedomansky — a Premiere Pro workflow consultant — who spent an initial three weeks training all five editors and established the workflow. He then returned for at least 12 days during the next nine months to further refine the workflow and answer questions both technical and editorial.

Additionally, he showed them features that are unique to Premiere, such as Dynamic Linking to After Effects projects and tapping the tilde (~) key to “full screen” the workspace section. “In our shared editing environment, because the editors were all coming from an Avid workflow, we treated Premiere Pro sequences as Avid bins,” explained Nedomansky. “Because Premiere Pro only allows one open project at a time… we shared sequences like you would share bins in Avid to allow all the editors access to the latest cuts.”

The next step was to get the multi-user editorial environment set up. They wanted to have several users, assistant editors and editors, get in and start working on the film simultaneously, without crashing into each other and corrupting files.

Jeff Brue’s Open Drives provided storage for the film via its product Velocity, which delivered 180TB of solid-state storage. With 5GB/s of “normal” throughput, the team had projects that would open in less than two minutes.


The solution to the multi-user access problem was much simpler and lower tech. When someone was working on a project file, they would move it to their named directory so nobody opened it mid-edit. Then, once they were done, they moved it back. So a little discipline went a long way in making sure that sharing media in a multi-user environment was stress-free.

When they needed a sequence in a project, they were able to link to it from another Premiere project without harming the source project. All of this allowed them to keep everything, as Nedomansky put it, “contained, safe and sharable.”
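
As a purely hypothetical sketch of that low-tech convention (paths and names invented for illustration; the Deadpool team did this by hand rather than with a script), the check-out/check-in idea looks like this:

```python
# Hypothetical sketch of the manual check-out convention described above.
# Paths and names are invented; on Deadpool this was done by hand, not
# scripted. Moving a project file into an editor's named folder signals
# "in use"; moving it back releases it.
from pathlib import Path
import shutil

SHARED = Path("/Volumes/velocity/projects/shared")       # assumed shared storage
CHECKOUTS = Path("/Volumes/velocity/projects/checkouts")

def check_out(project_name: str, user: str) -> Path:
    """Claim a project by moving it into the user's named directory."""
    src = SHARED / project_name
    if not src.exists():
        raise FileNotFoundError(f"{project_name} is already checked out or missing")
    dst_dir = CHECKOUTS / user
    dst_dir.mkdir(parents=True, exist_ok=True)
    return Path(shutil.move(str(src), str(dst_dir / project_name)))

def check_in(project_name: str, user: str) -> Path:
    """Release a project by moving it back to the shared directory."""
    src = CHECKOUTS / user / project_name
    return Path(shutil.move(str(src), str(SHARED / project_name)))

# Example usage:
# local_copy = check_out("reel_03_scene_12.prproj", user="matt")
# ...edit in Premiere Pro, save...
# check_in("reel_03_scene_12.prproj", user="matt")
```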

Re-Framing and Multi-Format Shooting
With all this in place, the team was ready to start cutting the wide array of footage the crew was producing. The film was shot primarily on the Arri Alexa at 3.2K RAW, but footage was also captured on 5K and 6K Red cameras and at least one Phantom. All of the footage was downsized to the common container format of 2048×1152 for the offline in Premiere and encoded in ProRes LT. This allowed them to do a center extraction, which gave the director and editor the ability to re-frame when they wanted to.

For the online, they went back to the Arri RAW, or other RAW formats, depending on their needs. The center extraction gave them a lot of creative freedom, so much so that they reframed the entire movie in the online. “If I had it to do over again, I would have done it [the reframing] in a cheaper room,” said Bierman.
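
For a sense of the arithmetic behind that center extraction, here is a small sketch: the 2048×1152 container comes from the article, while the 2.39:1 target aspect is an assumption for illustration. Whatever falls outside the extraction rectangle is the slack that allows a reframe.

```python
# Sketch of the arithmetic behind a center extraction / reframe. The
# 2048x1152 container matches the article; the 2.39:1 target aspect is an
# assumption for illustration. Pixels outside the extraction rectangle are
# what give the editor room to reposition the shot.
def extraction_window(container_w: int, container_h: int,
                      target_aspect: float,
                      offset_x: int = 0, offset_y: int = 0):
    """Return (x, y, w, h) of the extraction rectangle inside the container."""
    if container_w / container_h >= target_aspect:
        # container is wider than the target: full height, cropped width
        w, h = round(container_h * target_aspect), container_h
    else:
        # container is taller than the target: full width, cropped height
        w, h = container_w, round(container_w / target_aspect)
    # start centered, then apply the reframe offset, clamped to the container
    x = min(max((container_w - w) // 2 + offset_x, 0), container_w - w)
    y = min(max((container_h - h) // 2 + offset_y, 0), container_h - h)
    return x, y, w, h

print(extraction_window(2048, 1152, 2.39))               # centered scope extraction
print(extraction_window(2048, 1152, 2.39, offset_y=40))  # nudged down for a reframe
```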

Throughout the edit, the post team was burning its way through Mac Pros — the Macs were having an issue with the ATI D700 cards in OS X. In all, the team burned through 10 of the cards, which would occasionally melt down on renders.

“There were some incredibly complex reels on Deadpool,” says Kanfer. “At one point midway through the production, reel five was taking over 10 minutes to load. Our engineers quickly regrouped, and within a week they were able to optimize the situation; the same reel took only 2 1/2 minutes to load once the fix was made. Other less complex reels in the film loaded in a minute or less.”

Vashi Nedomansky, Matt Carson and Joan Bierman.

The sound team had to create a slight workaround for audio turnovers. In a traditional Avid workflow, the handoff to Avid Pro Tools is relatively seamless — as you would expect, since they are made by the same company — but going from Premiere required a little more effort. The package was the same as a normal sound turnover, including QuickTimes, guide tracks and EDLs, along with the AIFs. The trouble occurred when the conform wasn’t always in sync with what had been turned over.

Adobe looks at all of this as an opportunity to make their product even stronger. They said the engineers on the Adobe team “love to tackle problems and there is no better place to tackle those problems than live on an edit.”

Final Take Away
Even with training the editorial team to use a new program, working through audio conform hiccups and a pile of dead Mac towers, the team produced a polished film that had the best opening weekend for an R-rated film in history.

With improved sound turnover options hinted at for future versions of Premiere, we will very likely see more “Edited with Adobe Premiere Pro” logos in future film end credits.

FotoKem’s nextLab pushes envelope with Gone Girl’s 6K workflow

 The last piece in our Gone Girl workflow series.

By Daniel Restuccio

Back in the fall of 2013, FotoKem was prepping and packing up one of its nextLab data field systems and shipping it to a hotel in Cape Girardeau, Missouri. This particular hotel was the off-set digital asset processing hub for David Fincher’s Gone Girl, which is being released on Blu-ray/DVD on January 13 and has also garnered a considerable amount of Oscar buzz this season.

This movie represented a new chapter for the system — Gone Girl was about to become the first major feature film shot entirely on the Red Dragon at 6K and edited with Adobe Premiere Pro Creative Cloud.


Radical/Outpost’s Evan Schechtman talks latest FCP X updates, NLE trends

By Randi Altman

As you might have heard, Apple has updated its Final Cut Pro to version 10.1.4, with what they call “key stability improvements.”

That includes the Pro Video Formats 2.0 software update, which provides native support for importing, editing and exporting MXF files with Final Cut Pro X. While the system already supported import of MXF files from video cameras, this update extends the format support to a broader range of files and workflows.

In addition to the native MXF support, there is also an option to export AVC-Intra MXF files. There are fixes for past issues with automatic library backups. It also fixes a problem where clips…

‘Gone Girl’: Light Iron and David Fincher’s path to 6K

By Daniel Restuccio

Light Iron Post CEO Michael Cioni is an outspoken and passionate advocate of pushing the edge of post technology for the mission of getting the best images possible on screen. David Fincher’s Gone Girl represents the third movie, following The Social Network and The Girl With the Dragon Tattoo, that Cioni and his team have collaborated on with the director.

“If you are really doing something profound, you can’t solve it in one project,” shares Cioni, whose company has offices in LA and New York. With David Fincher and his team, key ideas don’t get put to the wayside after the movie finishes. “They store it, develop it and improve it, and then it’s re-deployed with more advanced technology on the next project.”


Pioneering 6K post workflows for ‘Gone Girl’

By Daniel Restuccio

David Fincher’s Gone Girl, shot on a Red Dragon in 6K, boasts an innovative editorial workflow that integrates top-end hardware and software into a seamless 6K editing, VFX and conforming system.

Gone Girl pioneered a tightly integrated SSD shared-storage system: the film was edited on 13 custom-built workstations based on Apple Mac Pros and HP Z820s with Nvidia GPU cards and Fusion-io storage, running customized versions of Adobe Premiere Pro Creative Cloud and After Effects Creative Cloud, with editorial media managed through an Open Drives storage system.

“Basically what happened,” explains assistant editor Tyler Nelson, “is we brought Jeff (Brue, co-…

Kirk Baxter on editing David Fincher’s ‘Gone Girl’

By Daniel Restuccio

When David Fincher took on the film Gone Girl, he immediately started to round up the “usual suspects,” his most trusted collaborators, including Kirk Baxter, who has two Editing Oscars on his mantle from previous work with Fincher (The Social Network and The Girl With the Dragon Tattoo).

Gone Girl is based on Gillian Flynn’s book of the same name; Flynn also wrote the screenplay. The film follows Nick Dunne (Ben Affleck), a husband who either knows or doesn’t know the whereabouts of his missing wife.

Baxter began cutting Gone Girl on September 16, 2013: day two of shooting. And for the first time in three movies with Fincher, Baxter cut this one alone and without his frequent co-…

Encore gives ‘House of Cards’ moody feel in 4K Ultra HD

The second season of Netflix’s House of Cards was made available to subscribers in 4K Ultra HD. LA’s Encore handled the show’s post production, with lead colorist Laura Jans-Fazio grading in uncompressed 4K.

The show was shot on Red cameras, some using the HDR functionality for extended contrast and color dynamic range. The floating-point processing in Baselight, Jans-Fazio’s tool of choice, offered her new creative options. Windows that appeared blown out, for example, could be graded to show detail then composited into the rest of the scene. She was able to achieve the color and composite in realtime so the clients could see the final results immediately.

Laura Jans-Fazio, lead colorist, Encore

Co-producer Peter Mavromates and post supervisor Hameed Shaukat worked directly with Jans-Fazio on the grade, with director David Fincher and DP Igor Martinovic providing feedback via the PIX digital collaboration tool.

“As episodes were completed, they were uploaded to PIX, which allowed the producer, director and DP to view content on calibrated Sony OLED monitors,” explained Morgan Strauss, Encore SVP, operations. “They returned their feedback, which we could extract directly into Baselight, and Jans-Fazio finalized the look and delivered the files to Netflix. It was essential to maximize this asynchronous collaborative process and, along with Baselight’s sophisticated toolset, it meant we could fully realize the creative needs of the producers and DP.”


The overall look of the series has a slightly moody feel, reflecting the tense, internal political intrigues of the story. The grade avoids over-saturated colors, maintaining the palette throughout — which was Fincher’s vision for the show.

“Baselight has so many features, and the fact that it works in floating-point processing gives me image quality for a pristine picture every time,” said Jans-Fazio. “We often used multiple shapes in a single shot, and being able to do that in one layer in Baselight was a real time-saver. We could also composite through VFX mattes, and do monitor replacements, in realtime.”