
Category Archives: Color Grading

Posting Life in Six Strings With Kylie Olsson

By Oliver Peters

Whether you’re a guitar nerd or just into rock ‘n’ roll history, learning what makes our music heroes tick is always entertaining. Music journalist and TV presenter Kylie Olsson started a YouTube channel during the pandemic lockdown, teaching herself how to play guitar and reaching out to famous guitarists she knew. This became the concept for a TV series called Life in Six Strings With Kylie Olsson, which airs on AXS TV. The show is in the style of Comedians in Cars Getting Coffee, with Olsson exploring the passions behind these guitarists and picking up a few guitar pointers along the way.

James Tonkin

I spoke with James Tonkin and Leigh Brooks about the post workflow for these episodes. Tonkin is founder of Hangman in London, which handled the post on the eight-part series. He was also the director of photography for the first two episodes and has handled the online edit and color grading for all of the episodes. Leigh Brooks of Firebelly Films was the offline (i.e. creative) editor on the series, starting with episode three. Together they have pioneered an offline-to-online post workflow.

Let’s find out more…

James, how did you get started on this project?
James Tonkin: Kylie approached us about shooting a pilot for the series. We filmed that in Nashville with Joe Bonamassa and it formed the creative style for the show. We didn’t want to just fixate on the technical side of the guitar and tone of these players, but their geographical base — we wanted to explore the city a little bit. We had to shoot it very documentary style but wrap it up into a 20-25 minute episode. No pre-lighting, just a tiny team following her around, interacting with these people.

Then we did a second one with Nuno Bettencourt, and that solidified the look of the show during those two initial episodes. She eventually got distribution through AXS TV in the States for the eight-part series. I shot the first two episodes, and the rest were shot by a US-based crew that followed the production workflow we had set up — not only the look and the documentary format, but also the highest production value we could give it within the time and budget we’re working with.

We chose to shoot anamorphic with a cinematic aspect ratio because it’s slightly different from the usual off-the-cuff reality TV look. Also, whenever possible, we record in a raw codec, because we (Hangman) were doing all of the post on it, with me specifically being the colorist.

I always advocate for a raw workflow, especially something in a documentary style. People are walking from daylight into somebody’s house and then down to a basement, basically following them around. And Kylie wants to keep interacting with whomever she’s interviewing without needing to wait for cameras to stop and rebalance. She wants to keep it flowing. So when it comes to posting that, you’ve got a much more robust digital negative to work with [if it was shot as camera raw].

Leigh Brooks

What was the workflow for the shows and were there any challenges?
Leigh Brooks: The series was shot mainly with Red and Canon cameras as 6K anamorphic files. Usually, the drive came to me, and I would transcode the rushes or create proxy files and then send the drive to James. The program is quite straightforward and narrative-based, without much scope for doing crazy things with it.

It’s about the nuts and bolts of guitars and the players that use them. But each episode definitely had its own little flavor and style. Once we locked the show, James took the sequence, got hold of the rushes and then got to work on the grade and the sound.

What Kylie’s pulled off on her own is no small feat. She’s a great producer, knows her stuff and really does the research. She’s so passionate about the music and the people that she’s interviewing and that really comes across. The Steve Vai episode was awesome. He’s very holistic. These people dictate the narrative and tell you where the edit is going to go. Mick Mars was also really good fun. That was the trickiest show to do because the A- and B-side camera set-up wasn’t quite working for us. We had to really get clever in the edit.

Resolve is known for its finishing and color grading tools, but you used it to edit the offline as well. Why?
Tonkin: I’ve been a longtime advocate of working inside of Resolve, not just from a grading perspective, but editorial. As soon as the Edit page started to offer me the feature set that we needed, it became a no-brainer that we should do all of our offline in Resolve whenever possible.

On a show like this, I’ve got about six hours of online time and I want to spend the majority being as creative as I can. So, focusing on color correction, looking at anything I need to stabilize, resize, any tracking, any kind of corrective work — rather than spending two or three hours conforming from one timeline into another.

The offline on this series was done in Resolve, except for the first episode, which was cut in Apple Final Cut Pro X. I’m trying to leave editors open to the choice of the application they like to use. My gentlemen’s agreement with Matt [Cronin], who cut the first pilot, was that he could cut it in whatever he liked, as long as he gave me back a .drp (DaVinci Resolve project) file. He loves Final Cut Pro X because that’s what he’s quickest at. But he also knows the pain that conforms can be. So he handled that on his side and just gave me back a .drp file. So it was quick and easy.

From Episode 3 onwards, I was delighted to learn that Resolve was Leigh’s primary workflow as well. Everything just transfers and translates really quickly. Knowing that we had six more episodes to work through together, I suggested things that would help us a lot, both on the picture side for me and on the audio side, which was also being handled here in our studio. We’re generating the 5.1 mix.

Brooks: I come from an Avid background. I was an engineer initially before ever starting to edit. When I started editing, I moved from Avid to Final Cut Pro 7 and then back to Avid, after which I made the push to go to Resolve. It’s a joy to edit on and does so many things really well. It’s become my absolute workhorse. Avid is fine in a multi-user operation, but now that doesn’t really matter. Resolve does it so well with the cloud management, and I own the two editor keyboards.

You mentioned cloud. Was any of that a factor in the post on Life in Six Strings?
Tonkin: Initially, when Leigh was reversioning the first two episodes for AXS TV, we were using his Blackmagic Cloud account. But for the rest of the episodes, we were just exchanging files. Rushes either came to me or would go straight to Leigh. He makes his offline cut and then the files come to me for finishing, so it was a linear progression.

However, I worked on a pilot for another project where every version was effectively a finished online version. And so we used Blackmagic Cloud for that all the way through. The editor worked offline with proxies in Resolve. We worked from the same cloud project and every time he had finished, I would log in and switch the files from proxy to camera originals with a single click. That was literally all we had to do in terms of an offline-to-online workflow.

Brooks: I’m working on delivering a feature-length documentary for [the band] Nickelback that’s coming out in cinemas later in March. I directed it, cut it in Avid, and then finished in Resolve. My grader is in Portsmouth, and I can sit here and watch that grade being done live, thanks to the cloud management. It definitely has a few snags, but they’re on it. I can phone up Blackmagic and get a voice — an actual person to talk to that really wants to fix my problem.

You’ve both worked with a variety of other nonlinear editing applications. How do you see the industry changing?
Tonkin: Being in post for a couple of decades now and using Final Cut Studio, Final Cut Pro X and a bit of Premiere Pro throughout the years, I find that the transition from offline to online starts to blur more and more these days. Clients watching their first pass want to get a good sense of what it should look like with a lot of finishing elements in place already. So you’re effectively doing these finishing things right at the beginning.

It’s really advantageous when you’re doing both in Resolve. When you offline in a different NLE, not all of that data is transferred or correctly converted between applications. By both of us working in Resolve, even simple things you wouldn’t think of, like timeline markers, come through. Maybe he’s had some clips that need extra work. He can leave a marker for me and that will translate through. You can fudge your way through one episode using different systems, but if you’re going to do at least six or eight of them — and we’re hopefully looking at a season two this year — then you want to really establish your workflow upfront just to make things more straightforward.

Brooks: Editing has changed so much over the years. When I became an engineer, it was linear and nonlinear, right? I was working on the James Bond film, The World Is Not Enough, around 1998. One side of the room was conventional — Steenbecks, bins, numbering machines. The other side was Avid Media Composer. We were viewing 2K rushes on film, because that’s what you can see on the screen. On Avid it was AVR-77. It’s really interesting to see it come full circle. Now with Resolve, you’re seeing what you need to see rather than something that’s subpar.

I’d say there are a lot of editors who are “Resolve curious.” If you’re in Premiere Pro you’re not moving [to a different system], because you’re too tied into the way Adobe’s apps work. If you know Premiere, you know After Effects and are not going to move to Resolve and relearn Fusion. I think more people would move from Avid to Resolve, because simple things in Resolve are very complicated in Avid — the effects tab, the 3D warp and so on.

Editors often have quite strange egos. I find the incessant arguing between platforms is just insane. It’s this playground kind of argument about bloody software! [laugh] After all, these tools are all there to tell stories.


Oliver Peters is an award-winning editor/colorist working in commercials, corporate communications, television shows and films.

Perrino

ColorNation Adds Colorists Mary Perrino and Ana Rita

Remote color service ColorNation has added two new colorists to its roster — Mary Perrino and Ana Rita.

Perrino is a veteran New York-based colorist who has worked out of her own studio, La Voglia, for nearly a decade. Initially trained as a cinematographer at NYU’s Tisch School of the Arts, she segued into color as her interest in post grew. Collaborating on everything from indie features to commercials, she saw her work as a color artist take on a life of its own, allowing her to elevate but not overpower the visuals with which she’s entrusted. With spots for brands like Tiffany, DKNY, Steve Madden, Pink, Canon and Google on her reel, she enters her first representation agreement with ColorNation.

Rita has worked in post production for almost a decade. Based in her native Portugal, she’s worked on short films and commercials, handling assignments from some of the largest agencies in the world and for some of the biggest global brands. During a stint in New York, she worked on several long-form projects, including an indie feature and the YouTube series Made in America.

With a strong representation in food and beverage work, Rita is also adept at lifestyle and fashion spots and has done work in the music video space, with evocative grading seen in videos for indie singer and guitarist Rorey and brightly lit work for the rising jazz fusion saxophone star Grace Kelly.

“Adding Mary and Ana to our roster is part of our plan to offer ColorNation clients access to a diverse talent pool, located in different regions around the world,” says founder/EP Reid Brody. “Both of these artists have amazing showreels, and their work fits perfectly with what the marketplace is looking for today – colorists with a point of view, with an understanding of how to enhance the work and with a wide range of experience in terms of content categories and visual styles.”

Perrino says she joined the roster because she views Brody’s approach to the business as being in step with the times. “What he’s doing with ColorNation is unique,” she observes. “I’ve never signed with anyone before because nothing has ever felt right. My independence is incredibly precious, and I was seeking a relationship that wouldn’t change how I do business, but rather build upon it.”

Perrino says her path to the color suite seems almost pre-ordained: “I enjoyed post-processing photographs and video from a young age,” she recalls. During her years at NYU studying cinematography, she adds, “peers appreciated my aesthetic, and soon realized a huge part of the look I was achieving was through color, so they started asking me to grade their projects.” Once she mastered Resolve, she says, “my aesthetic ideas could flow easily, and I fell even more in love with color as a craft.”

Rita came across ColorNation while researching independent color services and was already looking for a remote option that would allow her to expand her client base and the kinds of projects she was handling. A social media post from current ColorNation artist Vincent Taylor led her to Brody.

“What interests me most about color is its ability to shape the viewer’s emotions,” explains Rita. “It’s truly powerful how subtle adjustments can evoke such varied feelings. Additionally, I find the mathematical aspects fascinating, along with delving into the intricacies of different color spaces and discovering myriad tricks that can yield diverse and impactful results.”

Perrino and Rita join a ColorNation roster that includes colorists Gino Amadori, Cory Berendzen, Calvin Bellas, Yohance Brown, Ben Federman, Andrew Francis, Heather Hay, Lea Mercado, Mark Todd Osborne, Matthew Rosenblum, Vincent Taylor and Matt West.

 

 


Rodeo FX Adds Ana Escorse To Lead New Color Suite

VFX, post production, animation and experiential services provider Rodeo FX has added senior colorist Ana Escorse to lead its new color grading suite.

Escorse joins Rodeo FX from Alter Ego. Before that, she did stints at Studio Feather, Nice Shoes and Frame Discreet. She started her career in color grading as a color assistant at Sim Post (now part of Streamland Media). Escorse’s work on Lovezinho earned her the Music Video award at the 2022 FilmLight Colour Awards. Then she joined the 2023 jury panel alongside leading creatives and DPs such as Lawrence Sher, ASC; Greig Fraser, ACS, ASC; and Natasha Braier, ASC, ADF.

By adding color grading to its roster of services, Rodeo FX can now serve its clients’ projects from start to finish. The new suite, located in Toronto, is equipped with FilmLight Baselight. Escorse, who has been using Baselight for many years, will work either remotely or on-site in Toronto.

“Baselight is widely recognized and respected in the film and television industry and allows me to offer our clients and collaborators the most advanced features and highest quality image processing available in post,” Escorse says. “FilmLight’s commitment to continuously developing new technologies and features as well as Baselight’s customization and control make it a very efficient and reliable tool, allowing me to focus on the creative process and client collaboration.”


HPA Tech Retreat 2024: Networking and Tech in the Desert

By Randi Altman

Late last month, many of the smartest brains in production and post descended on the Westin Rancho Mirage Golf Resort & Spa in Palm Springs for the annual HPA Tech Retreat. This conference is built for learning and networking; it’s what it does best, and it starts early. The days begin with over 30 breakfast roundtables, where hosts dig into topics — such as “Using AI/ML for Media Content Creation” and “Apprenticeship and the Future of Post” — while the people at their table dig in to eggs and coffee.

Corridor Digital’s Niko Pueringer

The day then kicks further into gear with sessions; coffee breaks inserted for more mingling; more sessions; networking lunches; a small exhibit floor; drinks while checking out the tools; dinners, including Fiesta Night and food trucks; and, of course, a bowling party… all designed to get you to talk to people you might not know and build relationships.

It’s hard to explain just how valuable this event is for those who attend, speak and exhibit. Along with Corridor Digital’s Niko Pueringer talking AI as well as the panel of creatives who worked on Postcard from Earth for the Las Vegas Sphere, one of my personal favorites was the yearly Women in Post lunch. Introduced by Fox’s Payton List, the panel was moderated by Rosanna Marino of IDC LA and featured Daphne Dentz from Warner Bros. Discovery Content Creative Services, Katie Hinsen from Marvel and Kylee Peña from Adobe. The group talked about the changing “landscape of workplace dynamics influenced by #metoo, the arrival of Gen Z into the workforce and the ongoing impact of the COVID pandemic.” It was great. The panelists were open, honest and funny. A definite highlight of the conference.

We reached out to just a few folks to get their thoughts on the event:

Light Iron’s Liam Ford
My favorite session by far was the second half of the Tuesday Supersession. Getting an in-depth walk-through of how AI is currently being used to create content was truly eye-opening. Not only did we get exposed to a variety of tools that I’ve never even heard of before, but we were given insights on what the generative AI components were actually doing to create these images, and that shed a lot of light on where the potential growth and innovation in this process is likely to be concentrated.

I also want to give a shoutout to the great talk by Charles Poynton on what quantum dots actually are. I feel like we’ve been throwing this term around a lot over the last year or two, and few people, if any, knew how the technology was constructed at a base layer.

Charles Poynton

Finally, my general takeaway was that we’re heading into a bit of a Wild West over the next three years.  Not only is AI going to change a lot of workflows, and in ways we haven’t come close to predicting yet, but the basic business model of the film industry itself is on the ropes. Everyone’s going to have to start thinking outside the box very seriously to survive the coming disruption.

Imax’s Greg Ciaccio
Each year, the HPA Tech Retreat program features cutting-edge technology and related implementation. This year, the bench of immensely talented AI experts stole the show.  Year after year, I’m impressed with the practical use cases shown using these new technologies. AI benefits are far-reaching, but generative AI piqued my interest most, especially in the area of image enhancement. Instead of traditional pixel up-rezing, AI image enhancements can use learned images to embellish artists’ work, which can iteratively be sent back and forth to achieve the desired intent.

It’s all about networking at the Tech Retreat.

3 Ball Media Group’s Neil Coleman
While the concern about artificial intelligence was palpable in the room, it was the potential in the tools that was most exciting. We are already putting Topaz Labs Video AI into use in our post workflow, but the conversations are what spark the most discovery. Discussing needs and challenges with other attendees at lunch led to options we hadn’t considered when trying to get footage from the field back to post. It’s the people that make this conference so compelling.

IDC’s Rosanna Marino
It’s always a good idea to hear the invited professionals’ perspectives, knowledge and experience. However, I must say that the 2024 HPA Tech Retreat was outstanding. Every panel, every event was important and relevant. In addition to all the knowledge and information taken away, the networking and bonding was also exceptional.

Picture Shop colorist Tim Stipan talks about working on the Vegas Sphere.

I am grateful to have attended the entire event this year. I would have really missed out otherwise. The variation of topics and how they all came together was extraordinary. The number of attendees gave it a real community feel.

IDC’s Mike Tosti
The HPA Tech Retreat allows you to catch up on what your peers are doing in the industry and where the pitfalls may lie.

AI has come a long way in the last year, and it is time we start learning it and embracing it, as it is only going to get better and more prevalent. There were some really compelling demonstrations during the afternoon Supersession.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 25 years. 


GoPro Hero12

Review: GoPro Hero12 Black Action Camera

By Brady Betzel

The updated GoPro Hero12 Black introduces a few features that make it a must-buy for very specific professional-level users. I love it when GoPro releases updates to its cameras and software. It’s always a step forward in quality and features while keeping the familiar form factor that has made GoPro the go-to action camera for years. The GoPro Hero12 Black is no exception, with features like the new GP-Log color profile and wireless audio recording. It’s even better when you bundle it with the Max Lens Mod 2.0.

GoPro Hero12

Whether you are mounting dozens of GoPros on loaders and excavators with an eye toward syncing them in Avid Media Composer later, or you need to closely match color between the Hero12 and a Blackmagic RAW clip, the Hero12 Black is an upgrade you’ll want to consider if you are a pro looking to streamline your workflow. And if you haven’t already subscribed to the GoPro Premiere subscription service, grab yourself a year’s subscription for the sale price of $24.99.

GoPro Hero12 Black Edition Upgraded Specifications

  • Mounting – Built-in mounting with folding fingers, ¼-20 mount
  • Image sensor – 1/1.9″ CMOS, 27.6MP active pixels (5599×4927)
  • Lens Aperture – F2.5
  • FOV – 156° in 8:7 aspect ratio
  • 35mm-equivalent focal length
    • Min = 12mm
    • Max = 39mm
  • Video Resolutions and Frame Rates
    • 5.3K (8:7) 30/25/24 fps
    • 5.3K (16:9) 60/50/30/25/24 fps
    • 4K (8:7) 60/50/30/25/24 fps
    • 4K (9:16) 60/50/30/25 fps
    • 4K (16:9) 120/100/60/50/30/25/24 fps
    • 2.7K (4:3) 120/100/60/50 fps
    • 2.7K (16:9) 240/200 fps
    • 1080p (9:16) 60/50/30/25 fps
    • 1080p (16:9) 240/200/120/100/60/50/30/25/24 fps
  • Video stabilization – HyperSmooth 6.0
  • Aspect ratios – 16:9, 9:16, 4:3, 8:7
  • HDR video
    • 5.3K (16:9) 30/25/24 fps
    • 4K (8:7) 30/25/24 fps
    • 4K (16:9) 60/50/30/25/24 fps
  • Video compression standard – H.265 (HEVC)
  • Color video bit depth – 8-bit/10-bit (4K and higher)
  • Maximum video bit-rate – 120Mbps
  • Zoom (Video) – Up to 2x
  • Slo-Mo – 8x (2.7K, 1080p), 4x (4K), 2x (5.3K)
  • Live streaming – 1080p60 with HyperSmooth 4.0 + 1080p60 recording
  • Webcam mode – up to 1080p30
  • Timecode synchronization – Yes
  • Wireless audio – Support for AirPods and other Bluetooth headsets
  • GP-Log encoding with LUTs

There are a lot of specs packed into this tiny little GoPro. But as I mentioned earlier, the Hero12 Black has a few very specific features that pros and semi-pros should really love.

Let’s dig in…

GP-Log Color Profile
First up is the highly sought after (at least by me) GP-Log color profile. I am an online editor, so I deal with video finishing and color correction. From painting out camera crews to stabilizing to noise reduction, I try to make the end product as flawless as possible before it goes to air. So cameras with low noise floors, low moiré and natural-looking stabilization go a long way in my book.

GoPros have been a staple in docuseries and unscripted television shows for years. They can be easily hidden in cars for OTF interviews or discussions between cast members or even buried in the snow to catch a wild animal walking by. If the camera breaks, it’s not the end of the world because they are reasonably priced. The hard part has always been matching the look of an action-cam like a GoPro to that of a higher-end camera system that uses full-frame sensors and multi-thousand-dollar lenses. GoPro has attempted to make that a little easier with the newly added GP-Log color profile.

A Log color profile is a way for the camera to record more steps of dynamic range (think highlights that don’t blow out or shadows that retain detail). Log profiles are not meant for everyday filmmakers because, at times, they can be trickier to color-correct than footage recorded in the standard Rec. 709 color space or even in the GoPro HDR color profile. Pros use Log profiles to aid in camera color- and aesthetic-matching, with the hope of giving the audience a more filmic feel and more detail in shots with high contrast. This helps the audience not notice a change from, say, an ARRI Amira to a GoPro Hero12 Black.
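
To make the idea concrete, here is a minimal sketch of a generic log encode/decode pair — an illustration of the principle only, using made-up black and white points, not GoPro’s actual GP-Log math (GoPro pairs GP-Log with its own official LUTs for conversion back to Rec. 709):

```python
import numpy as np

def encode_log(linear, black=0.01, white=8.0):
    """Map linear scene light (black..white) into a 0..1 log signal."""
    linear = np.clip(linear, black, white)
    return (np.log2(linear) - np.log2(black)) / (np.log2(white) - np.log2(black))

def decode_log(signal, black=0.01, white=8.0):
    """Invert the curve back to linear light -- grading 'behind the LUT' happens here."""
    return 2.0 ** (signal * (np.log2(white) - np.log2(black)) + np.log2(black))

# Mid-gray (0.18) and a bright highlight (4.0) both land on usable code values,
# instead of the highlight slamming into the top of the recorded range.
stops = np.array([0.01, 0.18, 1.0, 4.0, 8.0])
encoded = encode_log(stops)
print(encoded.round(3))              # equal spacing per stop of exposure
print(decode_log(encoded).round(3))  # round-trips to the original light values
```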

As I was working with the GoPro Hero12 Black footage in Blackmagic’s DaVinci Resolve 18.6.5, I was monitoring the footage on a large OLED monitor through a Blackmagic DeckLink 4K Extreme over HDMI. Looking at GoPro footage on a phone or a small tablet does not give the entire story. It is essential to view your footage through proper I/O hardware on a professional monitor — preferably color-calibrated. Otherwise, you might miss crucial issues, like noise in the shadows.

GoPro Hero12

In addition, on the same computer but with a separate screen, I monitored the video signal using Nobe’s OmniScope 1.10.117. OmniScope is an amazing software-based scope that can be used in conjunction with your nonlinear editor or color-correcting software like Resolve. It is giving hardware scopes a huge run for their money these days, and I wouldn’t be surprised if these types of scopes took over. My base computer system includes an AMD Ryzen 9 5950X processor, an Asus ProArt motherboard, 64GB RAM and an Nvidia RTX 4090 Founder’s Edition GPU.

How well does the new GoPro Hero12 Black Edition’s GP-Log color profile work? When looking at footage shot in GP-Log through color scopes, there is more detail retained in the shadows and highlights, but it really isn’t enough to warrant the extra work to get there. Instead, if you turn down the sharpness in GoPro’s HDR mode, you can get to a similar starting point as something shot in GP-Log. Aside from that, one of the benefits of using GP-Log and applying the GoPro LUT is the ability to color “behind the LUT” to expand the highlights or dial in the shadows. But again, I didn’t see as much value as I had hoped, and I tested color in both DaVinci Wide Gamut and Rec. 709 color spaces. The biggest letdown for me was that the GP-Log footage appeared less detailed than HDR or a standard color profile. And it wasn’t as simple as just increasing the sharpness to match. There is something odd about it; the colors seemed “dense,” but the footage felt soft. I just don’t think the GoPro GP-Log color profile is the panacea I was hoping it would be. Maybe future updates will prove me wrong. For now, the HDR mode with low sharpness seems to be a sweet spot for my work.

Syncing Cameras Via Timecode
Another update to the GoPro Hero12 Black that I was excited to see is the ability to sync cameras via timecode. Maybe 10 or 12 years ago, one of the banes of my existence as an assistant editor was transcoding footage from MP4 to a more edit-friendly codec, like ProRes or DNxHD. This would not only help slower editing systems work with the hundreds of hours of footage we received, but it would also insert actual timecode and tape names/IDs into the clips.

This is a crucial step when working in a traditional offline-to-online workflow process. If you skip this step, it can quickly become a mess. The GoPro Hero12 Black inserts timecode into the file to help with syncing and auto-syncing cameras in your favorite NLE, like Adobe Premiere Pro, Media Composer, Apple FCPX or Resolve. You’ll still need to force a proper tape name/camera name/tape ID to clearly distinguish clips from differing dates/times, but with faster computers, the addition of actual timecode could help eliminate a lot of transcoding.
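
As a rough illustration of why embedded timecode matters, here is a small sketch of the arithmetic an NLE’s auto-sync effectively performs. The clip names and start timecodes are hypothetical, and the math assumes simple non-drop-frame timecode:

```python
# Convert each clip's start timecode to an absolute frame count; the sync
# offset between clips is then just a subtraction.

def timecode_to_frames(tc: str, fps: int) -> int:
    """'HH:MM:SS:FF' -> absolute frame count (non-drop-frame for simplicity)."""
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

clips = {
    "gopro_A.mp4": "14:32:10:05",   # hypothetical start timecodes
    "gopro_B.mp4": "14:32:12:17",
}

fps = 25
frames = {name: timecode_to_frames(tc, fps) for name, tc in clips.items()}
reference = min(frames.values())
offsets = {name: f - reference for name, f in frames.items()}
print(offsets)  # {'gopro_A.mp4': 0, 'gopro_B.mp4': 62} -> slide clip B 62 frames later
```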

What’s really smart about GoPro’s timecode sync is the workflow. Jump into the Quik app, find a Hero12 that you want to sync, click the three-dot drop-down menu and tap “Sync Timecode.” The app displays a QR code, which you show to the powered-on GoPro Hero12 Black. Once the code is recognized, you get a confirmation on the GoPro that it has been synced. And that’s it! While this feature is a long time coming, it is a welcome addition that will save tons of time for professional creators who run dozens of cameras simultaneously.

Other Updates
Finally, there are a couple of minor updates that also caught my eye. The addition of the ¼-20 mount between the GoPro folding finger mounts is a huge structural change. It’s something that should have been there from the beginning, and it’s nice not to have to purchase GoPro-specific mounts all the time.

Another great update is the ability to pair AirPods or other Bluetooth audio devices for wireless sound recording and voice control. Keep in mind that when using Bluetooth earbuds with built-in microphones, any noise reduction built into the headphones will be hard-coded into the recorded audio file. But hand it to GoPro for recording two channels of audio when using a Bluetooth earbud mic. This way, if your wireless mic signal drops out, you won’t be out of luck — the GoPro’s built-in mic will still be recording.

On the accessory front, if you purchase the newest Max Lens Mod 2.0 with the GoPro Hero12 Black, you’ll be able to take advantage of a few new features. Besides the larger 177-degree field of view when shooting 4K at 60fps, GoPro recently released a software update that allows the Max Lens Mod 2.0 to be used in Linear lens mode. This means no fish-eye look! So in addition to the HyperView and SuperView recording modes, you can get an even larger field of view than the standard GoPro Hero12 Black lens in Linear mode.

Something to keep in mind: You cannot record in the GP-Log color profile when using the Max Lens Mod 2.0. Hopefully GoPro will continue to lean into the GP-Log color profile, improve the quality and dynamic range, and add it to the recording ability with the Max Lens Mod 2.0. But for now, the Max Lens Mod 2.0 is a great accessory to put on your wish list.

If the GoPro Hero12 Black is above your price range, or you aren’t sure that you want to give it to your 6-year-old to throw around on the water slide like I did, then there are a few lower-priced options that get you pretty close. The Akaso Brave 7 is waterproof for up to 30 minutes and has up to 4K/30fps video, time lapse, hyperlapse and photo-taking abilities. The Akaso Brave 7 retails for $169.99 and not only comes packed with tons of GoPro-like accessories, but also a wireless shutter remote.

While the video recording quality isn’t at the same level as the Hero12 Black, if you’re looking for a well-rounded but not quite pro-level camera, the Brave 7 might be for you. In fact, I might actually prefer the color of the Brave 7, which feels a little more accurate as opposed to the heavily saturated GoPro. Keep in mind that with lower-priced cameras like the Brave 7, the physical quality can be a little lower, and options like frame rates can be minimal. For instance, the Brave 7 does not record in 24p, lacks 10-bit and does not have the GoPro-style fingers or a ¼-20 connection.

Summing Up
In the end, the GoPro Hero12 Black is a great update if you have an older-model GoPro… think Hero10 or earlier. And while the battery appears to last longer when recording in cold or imperfect conditions, in my tests I found that heat is still the enemy of the Hero12. Anything above 80 degrees in direct sunlight will limit your recording time. Running it for a couple of my son’s baseball games left me guessing whether I would actually be able to record full games because of the heat.

If you have a GoPro Hero11 Black, then I suggest you skip the Hero12 and grab the Media Mod accessory for your Hero11, which will add a higher-quality mic and external inputs. You could also add some sort of shade to keep your camera cool — there are a lot of interesting 3D-printed products on Etsy. The Hero12 Black no longer has a GPS, so if the graphic overlays or metadata were helpful to you, the Hero11 might be where you should stay for now.

However, if you need the new timecode sync, grab the Hero12 Black. That’s a solid feature for those of us who need to sync multiple GoPros at once. I love the Hero12 Black’s Quik QR code syncing feature. The wireless audio recording is a welcome addition as well, but in my testing, the audio didn’t come out as clean as I had wished for. I think using the built-in or a hard-wired mic is still best.

The GoPro Hero12 Black edition currently retails for $349.99, and the Hero12 Black with Max Lens Mod 2.0 currently retails for $429.98.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and Uninterrupted: The Shop. He is also a member of the PGA. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.


Masters of the Air: Directors and DP Talk Shoot, VFX and Grade

By Iain Blair

World War II drama Masters of the Air is a nine-episode Apple TV+ limited series that follows the men of the 100th Bomb Group as they conduct perilous bombing raids over Nazi Germany and grapple with the frigid conditions, the lack of oxygen and the sheer terror of combat at 25,000 feet in the air. Starring Austin Butler and Barry Keoghan, it’s the latest project from Steven Spielberg, Tom Hanks and Gary Goetzman, the producing team behind Band of Brothers and The Pacific.

Anna Boden and Ryan Fleck

Ranging in locations from the fields and villages of southeast England to the harsh deprivations of a German POW camp, Masters of the Air is enormous in both scale and scope. It took many years and an army of creatives to bring it to life — among them directors Anna Boden and Ryan Fleck and DP Jac Fitzgerald.

Here, Boden and Fleck (Captain Marvel) talk about the challenges of shooting, editing and posting the ambitious show. In a sidebar, Fitzgerald (True Detective) talks about integrating the extensive VFX and the DI.

After doing Captain Marvel, I guess you guys could handle anything, but this was still a massive project. What were the main challenges?
Anna Boden: We did episodes 5 and 6. I’d say for us, Episode 5 was a big challenge in terms of wrapping our heads around it all. Some of the prep challenges were very big because it’s really a long air battle sequence that takes up almost the entire episode, and we had limited prep and not a ton of time to do previz and work everything out ahead of time. Also, simultaneously, we were prepping Episode 6, which was going to take us on location and to a whole bunch of new spaces that the show had never been to before. Finding those new locations and doing both of those things at once required so much planning, so it was challenging.

How did you handle the big air battle sequence and working with the volume stage?
Boden: You don’t want to show up on the day and wing it. As filmmakers, sometimes it’s really fun to get on-set and block the sequence based on what the actors want to do. But you can’t do that when you’re shooting on a volume stage, where you’re projecting a lot of imagery on the wall around you. You have to plan out so much of what’s going to be there. That was new for us. Even though we’d worked on Captain Marvel and used greenscreen, we’d never used those big-volume LED stages before. It was a really cool learning experience. We learned a lot on the fly and ultimately had fun crafting a pretty exciting sequence.

I assume director Cary Joji Fukunaga and his DP, Adam Arkapaw, set the template in the first four episodes for the look of the whole show, and then you had to carry that across your episodes.
Boden: Yeah. They’d obviously started shooting before us, and so we were studying their dailies and getting a sense of their camera movements and the color palettes and the vibe for the show. It was really helpful. And our DP, Jac Fitzgerald, knows Adam pretty well, so I think that they had a close working relationship. Also, we were able to visit the set while Cary was shooting to get a sense of the vibe. Once we incorporated that, then we were on our own to do our thing. It’s not like we suddenly changed the entire look of the show, but we had the freedom to put our personalities into it.

And one of the great things about the point where we took over is that Episode 5 is its own little capsule episode. We tried to shoot some of the stuff on the base in a similar tone to how they were shooting it. But then, once we got to that monster mission, it became its own thing, and we shot it in our own way. Then, with Episode 6, we were in completely different spaces. It’s a real break from the previous episodes because it’s the midpoint of the season, we’re away from the base, and there’s a big shift in terms of where the story is going. That gave us a little bit of freedom to very consciously shift how we were going to approach the visual language with Jac. It was an organic way to make that change without it feeling like a weird break in the season.

Give us some sense of how integrating all the post and visual effects worked.
Ryan Fleck: We were using the volume stage, so we did have images, and for the aerial battles, we had stuff for the actors to respond to, but they were not dialed in completely. A lot of that happened after the shooting. In fact, most of it did. (Jac can probably help elaborate on that because she’s still involved with the post process for the whole show.) It wasn’t like Mandalorian levels of dialed-in visual effects, where they were almost finished, and the actors could see. In this show, it was more like the actors were responding to previz, but I think that was hugely helpful.

On Captain Marvel, so often actors are just responding to tennis balls and an AD running around the set for eyelines. In this case, it was nice for the actors to see an actual airplane on fire outside their window for their performances to feel fresh.

Did you do a lot of previz?
Fleck: Yeah, we did a lot for those battle sequences in the air, and we worked closely with visual effects supervisor Stephen Rosenbaum, who was integral in pulling all that stuff together.

What did Jac bring to the mix? You hadn’t worked together before, right?
Fleck: No, and we like her energy. She has experience on big movies and small movies, which we appreciate — and so do we, so we share those sensibilities. But I think she just has a nice, calm energy. She likes to have fun when she’s working, and so do we, but she’s also very focused on executing the plan. She’s an organized and creative brain that we really appreciated.

Boden: I think that we had a lot of the same reference points when we first started talking, like The Cold Blue, an amazing documentary with a lot of footage that was taken up in the planes during World War II. Filmmakers actually were shooting up there with the young men who were on missions in these bomber planes. That was a really important reference point for us in terms of determining where the cameras can be mounted inside one of these planes. We tried as much as possible to keep those very real camera positions on the missions so that it felt as reality-based and as visceral as possible and not like a Marvel movie. We used some of the color palette from that documentary as well.

It was also Jac’s working style to go to the set and think about how to block things in the shot list… not that we need to stick to that. Once we get in there and work it through with the actors, we all become very flexible, and she’s very flexible as well. Our work styles are very similar, and we got on really well. We like our sets to be very calm and happy instead of chaotic, and she has a very calm personality on-set. We immediately hired her to shoot our next feature after this show, so we’re big fans.

Was it a really tough shoot?
Boden: Yeah. We started shooting in July and finished in October. That’s pretty long for two episodes, but COVID slowed it all down.

Fleck: I’d never shot in London or the UK before, but I loved it. I loved the crews; I loved the locations. We got to spend time in Oxford, and I fell in love with the place. I really loved exploring the locations. But yes, there were challenges. I think the most tedious stuff was the aerial sequences because we had mounted cameras, and it was just slow. We like to get momentum and move as quickly as we can when shooting.

Even though this is TV, you guys were involved in post to some degree, yes? 
Ryan Fleck: Yes, we did our director’s cuts, and then Gary kept us involved as the cuts progressed. We were able to get back into the edit room even after we delivered our cuts, and we continued to give our feedback to guide the cuts. Typically, TV directors give over their cuts, and then it’s “Adios.” But because we worked so long on it and we had a good relationship with Gary and the actors, we wanted to see this through to the end. So we stayed involved for much longer than I think is typical for episodic directing.

Typically, on our films, we’re involved in all the other post departments, visual effects and sound, every step of the way. But on this series, we were less involved, although we gave notes. Then Jac did all the grading and the rest of the show. She kind of took over and was very involved. She’ll have a lot of insights into the whole DI process. (See Sidebar)

Anna, I assume you love post, and especially editing, as you edited your first four features.
Boden: I love post because it feels like you’ve made all your compromises, and now all you can do is make it better. Now your only job is to make it the best version of itself. It’s like this puzzle, and you have all the time in the world to do the writing again. I absolutely love editing and the process of putting your writing/editing brain back on. You’re forgetting what happened as a director on-set and rethinking how to shape things.

Give us some idea of how the editing worked. Did you also cut your episodes?
Boden: No, we hired an editor named Spencer Averick, who worked on our director’s cut with us. Every director was able to work on their director’s cut with a specific editor, and then there was Mark Czyzewski, the producer’s editor, who worked on the whole series after that. We worked with him after our director’s cut period. We went back into the room, and he was really awesome. We edited in New York for a couple of weeks on the director’s cut, and then we were editing in LA after that in the Playtone offices in Santa Monica.

What were the big editing challenges for both episodes? Just walk us through it a bit.
Boden: I’d say that one of the biggest challenges, at least in terms of the director’s cut, was finding the rhythm of that Episode 5 mission. When you have a long action sequence like that, the challenge is finding the rhythm so that it has the right pace without feeling like it’s barraging you the whole time. It needs places to breathe and places for emotional and character moments, but it still has to keep moving.

Another challenge is making sure viewers know where they are in every plane and every battle throughout the series. That ends up being a big challenge in the edit. You don’t realize it as much when you’re reading a script, but you realize it a lot when you’re in the edit room.

Then, for Episode 6, it was about connecting the stories because in that episode, we have three main characters — Crosby, Rosenthal and Egan — and they’re in three different places on three very separate journeys, in a way. Egan is in a very dark place, and Rosenthal is in a dark place as well, but he finds himself in this kind of palatial place, trying to have a rest. And then Crosby’s having a much lighter kind of experience with a potential love interest. The intercutting between those stories was challenging, just making sure that the tones were connecting and not colliding with each other, or if they were colliding, colliding in a way that was interesting and intentional.

How hands on were Spielberg and Hanks, or did they let you do your own thing?
Fleck: We mostly interacted with Gary Goetzman, who is Tom Hanks’ partner at Playtone. I think those guys [Spielberg and Hanks] were involved with early days of prep and probably late days of post. But in terms of the day-to-day operations, Gary was really the one that we interacted with the most.

Boden: One of the most wonderful things about working with Gary as a producer — and he really is the producer who oversaw this series — is that he’s worked with so many directors in his career and really loves giving them the freedom and support to do what they do best. He gave us so much trust and support to really make the episodes what we wanted them to be.

Looking back now, how would you sum up the whole experience?
Fleck: All of it was challenging, but I think the biggest challenge for us was shooting during COVID. We kept losing crew members day by day, and it got down to the point where everybody had to test every day and wait for their results. We would have crew members waiting three to four hours before they could join us on-set, so that really cut the amount of shooting time we had every day from 11 hours down to six.

Boden: Some days we’d show up and suddenly find out an hour into the day that we weren’t going to get an actor that we were planning to shoot with, so we’d have to rearrange the day and try to shoot without that actor. That was a big challenge.

Fleck: The great thing for me was how much I learned. Back in history class, you get all the big plot points of World War II, but they don’t tell you about how big these B-17s were, how violent it was up in the air for these guys. You think of the D-Day invasion when you think of the great milestones of World War II, but these aerial battles were unbelievably intense, and they were up there in these tin cans; they were so tight and so cold. I just couldn’t believe that these kids were sent into these situations. It was mind-boggling.

Boden: I also learned a lot through the process of reading the material and the research about the history of these specific people in the stories. But I’d say that one of the things that really sticks with me from the experience was working with this group of actors. That felt very special.

DP Jac Fitzgerald on Shooting Masters of the Air

Jac, integrating all the VFX with visual effects supervisor Stephen Rosenbaum must have been crucial.
Yes. When I started the show, I imagined that the majority of the VFX work would be done on the volume stage. But then I realized that he had a whole World War II airfield to create on location. Obviously, we had the tower structure for the airfield, and we had two planes, one of which was being towed. And it was all so cobbled together from the outside.

Jac Fitzgerald

The planes looked like they were complete, but they weren’t moving by themselves. They didn’t have engines in them or anything. What was interesting to me was the extent of the visual effects that Stephen had to do on the exteriors. We only had two plane bodies, but at any one time when you see the airstrip, there are 12 planes there or more. So there was a huge amount of work for him to do in that exterior world, which was actually as important as the VFX in the volume.

What about the DI? Where did you do all the grading?
It was predominantly in LA at Picture Shop with colorist Steven Bodner, who did the whole show. And because of the enormous amount of VFX, it was obvious early on that things were going to need to be done out of order in the DI.

At first, they thought that my two episodes [5 and 6] would be the first ones to have the DI, as Adam Arkapaw was unavailable to do his episodes [1 through 4] because he was working on another film. At the time they thought they would go in and do my episodes and start prepping and setting the look for episodes 1 through 4 as well. Then it became clear that the DI schedule would have to adjust because of the enormity of the VFX.

Stephen Rosenbaum spent a lot of time making the footage we’d shot and all the VFX worlds collide. I think he had an extraordinary number of people from vendors around the world involved in the project, so there was certainly a lot of cleaning up to do. We all did a lot of work on the look in the DI, trying to make it as seamless as possible. And then again, because episodes 1 through 4 needed so much VFX work, we did my episodes and then we did 7, 8 and 9, and then we went back to 1 through 4. It was certainly a lot of jumping around. I wish that we could have mapped it all from beginning to end, but it wasn’t to be.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.


Colorfront’s New SDR-to-Dolby Vision HDR Conversion Process

At the 2024 HPA Tech Retreat at the end of last month, Colorfront demo’d the Colorfront Engine’s new Dolby Vision conversion capability. The conversion process not only transitions SDR to Dolby Vision HDR but also produces unique Dolby Vision metadata, guaranteeing that the Dolby-derived SDR output visually matches the original SDR content. This round-tripping method presents a unified, streamlined, single-source workflow for mastering and distribution.

The Colorfront Engine now allows users to seamlessly upgrade extensive SDR content libraries to the Dolby Vision HDR format, addressing the surge in HDR-ready displays and devices with a straightforward, time-efficient and cost-effective solution.

“This Dolby-specific version of the Colorfront Engine has been developed to facilitate a seamless conversion from SDR to Dolby Vision HDR with perfect round-tripping,” says Colorfront’s Mark Jaszberenyi. “It’s already shipping and has received feedback from content owners, studios, OTTs and streamers for its ability to maintain fidelity to the original SDR content while offering a premium HDR viewing experience.”

Mark Jaszberenyi

Why is this important for our industry? Jaszberenyi says, “The transition from SDR to HDR aims to enhance visual experiences with improved brightness, contrast and colors. Despite this shift, a significant volume of content and many viewing environments remain SDR-based. The Dolby Vision SDR round-trip solution is vital, as it enables the conversion of original SDR libraries to HDR, incorporating Dolby Vision metadata that aligns with the original content.” He says this process ensures that content is remastered for Dolby Vision HDR viewing while preserving the integrity of the SDR original, all within a single Dolby Vision master file. Importantly, this solution helps content owners and distributors maximize the value of their existing SDR libraries by making them accessible to a wider audience with HDR-capable devices.

Content owners and distributors can use this solution to produce and deliver content across various devices and viewing conditions. “It facilitates the display of stunning HDR content on HDR-capable devices, ensuring an optimal viewing experience,” according to Jaszberenyi, adding that it also guarantees that the SDR version, derived from HDR content through the Dolby Vision round-tripping process, closely matches the original SDR master.

How does it work from a user perspective? The conversion process balances automation with the option for manual intervention, starting with the transformation of original SDR content into HDR. “This is followed by generating unique Dolby Vision metadata for a seamless SDR conversion,” says Jaszberenyi. “Mastering professionals have the flexibility to fine-tune the Dolby Vision conversion tool based on the specific attributes of the content, ensuring a workflow that not only respects but enhances the creative vision. Importantly, this process is designed to be scalable; it can automatically convert vast amounts of content with ease, whether on-premises or in the cloud, making it a versatile solution for content libraries of any size.”
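
As a purely conceptual sketch of what “round-tripping” means — emphatically not Colorfront’s or Dolby’s actual math, and with invented peak-luminance and gamma values — the following example expands an SDR signal to a wider-range HDR representation with an invertible curve, maps it back down and verifies that the derived SDR matches the source:

```python
import numpy as np

SDR_PEAK_NITS = 100.0    # assumed reference white for the SDR master
HDR_PEAK_NITS = 1000.0   # assumed peak luminance for the HDR version
GAMMA = 2.4              # assumed SDR display gamma

def sdr_to_hdr(sdr_code):
    """Expand normalized SDR code values (0..1) into absolute HDR nits."""
    linear = np.clip(sdr_code, 0.0, 1.0) ** GAMMA   # display-linear light
    # An invertible expansion that progressively stretches brighter values toward the HDR peak.
    return SDR_PEAK_NITS * linear * (HDR_PEAK_NITS / SDR_PEAK_NITS) ** linear

def hdr_to_sdr(nits):
    """Numerically invert sdr_to_hdr() so the derived SDR matches the source."""
    lo, hi = np.zeros_like(nits), np.ones_like(nits)
    for _ in range(60):          # bisection; a real pipeline carries trim metadata instead
        mid = 0.5 * (lo + hi)
        too_bright = sdr_to_hdr(mid) > nits
        hi = np.where(too_bright, mid, hi)
        lo = np.where(too_bright, lo, mid)
    return 0.5 * (lo + hi)

sdr_frame = np.random.rand(4, 4)                 # stand-in SDR frame
round_trip = hdr_to_sdr(sdr_to_hdr(sdr_frame))
print(np.max(np.abs(round_trip - sdr_frame)))    # ~0: the round trip holds
```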

 

 

 

 

Scott Klein

The Saw Adds Senior Colorist Scott Klein

LA-based creative editorial, color and finishing studio The Saw has added senior colorist Scott Klein. Known for his work on Outer Banks, Empire, Unsolved Mysteries and Bosch, Klein will continue to tackle episodic and long-form narrative assignments while expanding into unscripted content, including awards shows, reality TV and music specials — genres where The Saw has roots. Upcoming projects include Season 2 of Outer Banks.

Klein has compiled more than 100 credits in episodic and long-form television over a career that includes tenures at Warner Bros, Technicolor and, most recently, Light Iron. Highlights include such classics as The Sopranos, Deadwood, The Vampire Diaries, JAG, Empire, The Following, Entourage and Nash Bridges. He won a Monitor Award for his work on the miniseries From the Earth to the Moon and was nominated for an HPA Award for True Blood.

Klein, who uses Blackmagic DaVinci Resolve, says that he is excited to join this new venture and for the opportunity to team up with The Saw founder Bill DeRonde and senior VP of sales Christina Ferreira, whom he has known for years. “I have the utmost respect for Bill and what he has accomplished as a facility owner and craftsman,” he states. “This opportunity came along at the perfect time. What Bill and Christina are building here is brilliant. It’s going to be wonderful for my clients.”

On his success, Klein says that he is always excited to collaborate with filmmakers and enhance their projects through color. “I love the story element. When I’m doing an unsupervised pass, I immerse myself in the story and try to sense its mood and emotions. It’s fun when the director arrives to see how close I’ve come. I enjoy seeing how color complements the editorial, music and visual effects. I love the team aspect of the process.”

 

Maxine Gervais

Senior Colorist Maxine Gervais Joins Harbor in Los Angeles

Harbor in Los Angeles has added senior colorist Maxine Gervais to its team. Gervais brings with her a wealth of experience working on more than 50 feature films over the course of her career.

She has been recognized for her work by the Hollywood Professional Association (HPA), earning two nominations for Outstanding Color Grading – Feature Film: one for her work on the Hughes brothers’ The Book of Eli and the other for Guillermo del Toro’s Pacific Rim. In addition to the nominations, Gervais made Hollywood history as the first woman to receive an HPA award for Outstanding Color Grading for Albert Hughes’ Alpha.

Her credits include Ryan Coogler’s Black Panther, Clint Eastwood’s Cry Macho, Peacock’s The Continental: From the World of John Wick, Netflix’s The Brothers Sun, Universal Pictures’ Strays and the Sundance Grand Jury Prize winner A Thousand and One, directed by A.V. Rockwell.

With a background in classical arts, Gervais earned her bachelor’s in visual art from Laval University in Quebec, demonstrating her passion for creative expression through color and technology. She furthered her education by obtaining a post-graduate certificate in computer technology for cinema and television. Gervais is also an Associate Member of the American Society of Cinematographers.

Upon starting her new role, Gervais said, “Like John Alton’s book ‘Painting with Light’ describes, a colorist paints with colors, shadows and contrast to help define and spotlight the beauty of the captured images by the cinematographer. This is done to enhance the desired mood, feel and direction of the film. I’ve been fortunate to have collaborated with amazing filmmakers for a long time now and am excited to continue doing so at Harbor.”

 

 

Writer/Director Celine Song Talks Post on Oscar-Nominated Past Lives

By Iain Blair

In her directorial film debut, Past Lives, South Korean-born playwright Celine Song has made a romantic and deceptively simple film that is intensely personal and autobiographical yet universal, with its themes of love, loss and what might have been. Past Lives is broken into three parts spanning countries and decades. First we see Nora as a young girl in South Korea, developing an early bond with her best friend, Hae Sung, before moving with her family to Toronto. Then we see Nora in her early 20s as she reconnects virtually with Hae Sung. Finally, more than a decade later, Hae Sung visits Nora, now a married playwright living in New York. It stars Greta Lee, Teo Yoo and John Magaro.

Celine Song directing Greta Lee

I spoke with Song about the post workflow and making the A24 film, which is Oscar-nominated for Best Picture and Best Original Screenplay. It also just won Best Director and Best Feature at the Independent Spirit Awards.

How did you prep to direct your first film? Did you talk to other directors?
I talked to some amazing directors, but what they all said is that because only I know the film that I’m making, the way it’s going to be prepped is a process that only I can really know. You need really strong producers and department heads, which I was so lucky to have. I was able to draw on their experience and advice every step of the way.

You shot in Seoul and New York. Was it the same sort of experience or was it different going back to Seoul?
The filmmaking culture is very different in both places. In New York, there is a very strong union, and in Korea there isn’t one. Also, the way that you secure locations is different. In New York, if you want to shoot somewhere, the mayor’s office knows about it. Korea is still a little bit like guerrilla filmmaking. You show up to a location and try to get it right. You can’t really get permits for things in Korea.

The story takes place over three separate timeframes. Did you shoot chronologically?
No. We shot everything in New York City, and then we had a set built for the Skype section. Then we went to Korea, prepped it for another month and shot there for 10 days.

You and your DP, Shabier Kirchner, shot 35mm. What led you to that decision?
It was my very first movie, so I didn’t know how hard it was going to be. I don’t have experience shooting on digital or film. I don’t know anything. I think part of it was first-timer bravery. I don’t know enough to be afraid. That’s where the fearlessness came from. But it was also informed by the conversations I was having with my DP. We talked about the story and how the philosophy of shooting on film is connected to the philosophy of the movie, which is that the movie is about time made tangible and time made visible. It just made sense for it to be shot on film.

Celine Song on-set

You come from the theater, where there is obviously no post production. Was that a steep learning curve for you?
Yes, but you do have a preview period in theater, when you see it in front of an audience, and you keep editing in that way. But more importantly, I’m a writer. So part of post is that I don’t think of the movie as just what I see on screen and all the sound design and every piece of it. To me, it is a piece of text. So just as I would edit a piece of my own writing, I feel like I was looking at the editing process very much like editing text.

Then of course in film, it’s not just the writing on the page. It’s also sound, color, visuals, timing… So in that way, I really felt that editing was about composing a piece of music. I think of film as a piece of music, with its own rhythm and its own beat that it has to move through. So in that way, I think that that’s also a part of the work that I would do as a playwright in the theater, create a world that works like a piece of music from beginning to end.

With all that in mind, I honestly felt like I was the most equipped to do post. I had an entire world to learn; I had never done it before. But with post, I was in my domain. The other thing I really love about editing and VFX in film is that you can control a lot. Let’s say there’s a pole in the middle of the theater space. You have to accept that pole. But in film, you can just delete the pole with VFX. It’s amazing.

Did editor Keith Fraase, who is based in New York, come on-set at all in Korea, or did you send him dailies?
We sent dailies. He couldn’t come on-set because of COVID.

What were the biggest editing challenges on this?
I think the film’s not so far from the way I had written it, so the bigger editing choices were already scripted. The harder bits were things that are like shoe leather — the scenes that hold the movie together but are not the center of the emotion or the center of the story.

One example is when Nora is traveling to Montauk, where we know that she’s going to eventually meet Arthur (who becomes her husband). We were dealing with how much time is required and how to convey time so that when we meet Arthur, it seems like it is an organic meeting and not such a jarring one. I had scripted all this shoe-leather stuff that we had shot – every beat of her journey to Montauk. We had a subway beat; we had a bus beat. We had so many pieces of her traveling to Montauk because I was nervous about it, feeling it was not long enough. But then, of course, when we actually got into the edit, we realized we only needed a few pieces. You just realize that again, the rhythm of it dictates that you don’t need all of it.

Where did you do all the sound mix?
We did it all at Goldcrest in New York.

Are you very involved in that?
You have no idea. I think that’s the only place where I needed more time. We went over budget… that’s a nicer way to say it. That’s the only part of the post process where I really was demanding so much. I was so obsessed with it. The sound designer’s nickname for me was Ms. Dog Ears. I know different directors have very different processes around sound, but for me, I was in that room with my sound designer Jacob Ribicoff for 14 hours a day, five days a week, and sometimes overtime, for weeks. I wouldn’t leave.

I would stay there because I just know that sound is one of those things that holds the film together. Also, with this movie, the sound design of the cities and how different they are and how it’s going to play with the compositions — I had such a specific idea of how I wanted those things to move. Because again, I do think of a film as a piece of music. So I was pretty crazy about it. But I don’t want people to notice the sound design. I want people to be able to feel like they’re actually just standing in Madison Square Park. I want them to be fully immersed.

Obviously, it’s not a big effects movie, but you have some. How did that go?
I think it’s a bit of a subjective thing. Actually, looking at it, I’m like, “Well, does that seem good to you?” I’m showing it to my production designer and my DP and I’m like, “This looks OK to me, but I wonder if it can be better. Would you look at it?” So I relied on many eyes.

I give credit to Keith, but also to my assistant editor, Shannon Fitzpatrick, who was a total genius at catching any problems with VFX and having such a detailed eye. I think she’s one of the only people who really noticed things that I didn’t notice in the VFX. I’m like, I think that looks fine, and then she would point to this one thing in the corner that’s not working. There are people at A24 who are also amazing at catching sound and visuals because that’s their job. They’ll point out what sounds strange or what looks strange. So you have so many people who are part of the process.

Who was the colorist, and how involved were you with the grading?
It was Tom Poole at Company 3, which is where we edited and did color and everything. I love the process because I showed up after Shabier and Tom had already gone through the whole film and graded it. They did amazing, beautiful work. Then I would come in and give notes about certain scenes and then we’d do them. Of course, while they were grading it, they’d send me stills, and I’d give notes on the stills before going into the suite. Also, Shabier and Tom have worked together a lot, so they already kind of had a rhythm for how they wanted to color the film.

What sort of film did you set out to make?
Since this was the first film I’d directed, I felt like the main goal was to discover the language of my movie. It was beyond just trying to tell the story the best way I could, from the script stage to the post. I think that was the goal throughout. But the truth is that I really wanted the language of the film to be my own language, and I wanted to learn and have a revelation for myself of what my movie is.

I know it is partly autobiographical. How much of you is in Nora?
It really was inspired by a true event of sitting between my childhood sweetheart, who had come to visit me from Korea, and my husband who I live with in New York City. So this is very autobiographical, and the feeling that I had in that very personal moment is the inspiration for the whole film. But then once you turn it into a script, which is an objectification process, and then you turn it into a film with hundreds of people — and especially with the cast members who have to play the characters — by that time it has become very much an object. Then with post, it’s about the chiseling. It’s about putting together an object that is to be shared with the world.

A film is so different from writing a play. Was it a big adjustment for you?
I know theater because I was in it for a decade, probably more, so I knew the very fundamental difference between the way a play is made versus how a film is made. For example, I was taught that in theater, time and space is figurative, while time and space in film is literal. So that means there are different kinds of strengths and weaknesses in both mediums when it comes to telling a story that spans decades and continents. And, in this case, because my joke is always that the villains of the story are 24 years and the Pacific Ocean, it actually needs the time and space to be seen literally… because there needs to be a reason why these two lovers are not together. So the children have to be literally there, and Korea and New York City have to feel tangible and literal.

I assume you can’t wait to direct again?
Oh, I can’t wait. I want to wake up and just go to set tomorrow. That’s how I feel. I’m trying to shoot another movie as soon as I can.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Review: HP ZBook Fury 16 G10 Mobile Workstation

By Brady Betzel

HP has been at the forefront of computer workstations that target M&E for multiple decades. To keep up with high-pressure workloads, HP offers enterprise-level workstations with components that will run 24 hours a day, 7 days a week, 365 days a year. And if they don’t, HP will replace the parts and/or system fast. That 24/7/365 uptime is what makes “workstations” unique when compared to off-the-shelf, consumer-grade computer systems.

To ensure the smoothest experience while using apps, HP tests many of today’s pro applications from ISVs (independent software vendors) — from Autodesk to Avid — with its workstations. The HP ZBook Fury 16 G10 is a mobile workstation that combines power and portability without sacrificing either.

The HP ZBook Fury 16 G10 that I was sent to review includes the following specs:

  • CPU: Intel Core i9-13950HX (up to 5.5 GHz with Intel Turbo Boost technology, 36MB L3 cache, 24 cores, 32 threads)
  • GPU: Nvidia RTX 5000 Ada Generation laptop GPU (pro-grade graphics)
  • Display: 16-inch DreamColor WQUXGA (3840×2400) or WUXGA (1920×1200), IPS, anti-glare, 400 nits, 100% sRGB
  • RAM: 64GB DDR5-5600 across two DIMMs (four total DIMM slots)
  • Storage: 1TB SSD

In the latest HP ZBook Fury 16 G10, there are quite a few updates. Besides speed/hardware improvements, the most interesting updates include the full-size keyboard with a 10-key number pad. I am a sucker for a 10-key. When I was trying to pay for my own car as a teenager, I worked at Best Buy fixing computers and eventually installing car stereos. One of the things I learned from that job was getting fast at using a 10-key number pad. You know how that helped me in editing? Timecode input. So I love that HP includes the 10-key pad even on a mobile workstation.

The next impressive feature is the RGB backlit keyboard. Sure, you can use it just to show off some fancy rainbow effects, but you can also tie the RGB lights to specific applications, like Adobe’s Premiere Pro and After Effects. To adjust the RGB colors, you need to open an inconveniently titled app called Z Light Space. I would have preferred for HP to have called the app “HP RGB Keyboard” or something easily searchable, but what can you do? The lighting is fully customizable and comes preloaded with layouts for apps like Premiere and After Effects. The default Premiere layout has keys such as “j, k and l” labeled in a nice teal color.

Physically, the HP ZBook Fury 16 G10 is thick. The keyboard feels like it sits an inch above the desk. Even so, it isn’t uncomfortable. The dimensions are 14.29 inches by 9.87 inches by 1.13 inches, and it weighs just over 5lbs. The power supply is large and kind of cumbersome, although it delivers a hefty 230W. I really wish workstation laptops would come with streamlined power supplies… maybe one day. HP includes a one-year parts/labor warranty (not on-site unless you pay extra).

Around the outside of the workstation, there are a lot of useful ports:

  • Right side:
    • one RJ-45
    • one headphone/microphone combo
    • one SuperSpeed USB Type-A 5Gbps signaling rate (charging)
    • one SuperSpeed USB Type-A 5Gbps signaling rate

  • Left side:
    • one power connector
    • two Thunderbolt 4 with USB4 Type-C 40Gbps signaling rate (USB Power Delivery, DisplayPort 1.4, HP Sleep and Charge)
    • one HDMI 2.1
    • one Mini DisplayPort 1.4a

Now on to what really matters… Does the HP ZBook Fury 16 G10 really chew through media in Blackmagic Resolve and Premiere Pro? Yes, it does, and when it is running hard, the fans turn on. The Nvidia RTX 5000 Ada laptop GPU is really impressive considering that it’s stuffed inside such a small form factor. Resolve continually embraces GPU acceleration more than Adobe, in my opinion, and the results of my testing bear that out.

Blackmagic Resolve
Up first is Resolve 18.6.4. Keep in mind that when comparing workstations or GPUs, increased speeds are not always tied to new hardware. Advancements in underlying software efficiency, drivers, firmware updates, etc. will also improve speeds. That said, based on a UHD, 3840×2160 timeline, I edited the following clips together and put a basic color grade on them:

  • ARRI RAW: 3840×2160 24fps – 7 seconds, 12 frames
  • ARRI RAW: 4448×1856 24fps – 7 seconds, 12 frames
  • BMD RAW: 6144×3456 24fps – 15 seconds
  • Red RAW: 6144×3072 23.976fps – 7 seconds, 12 frames
  • Red RAW: 6144×3160 23.976fps – 7 seconds, 12 frames
  • Sony a7S III: 3840×2160 23.976fps – 15 seconds

I then duplicated that timeline but added Blackmagic’s noise reduction. Then I duplicated the timeline again and added sharpening and grain. Finally, I replaced the built-in Resolve noise reduction with a third-party noise reduction plugin from Neat Video. From there, I exported multiple versions: DNxHR 444 10-bit OP1a MXF, DNxHR 444 10-bit MOV, H.264 MP4, H.265 MP4, AV1 MP4 (Nvidia GPUs only) and then an IMF package using the default settings.
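If you want to repeat a timing test like this after a driver or software update, the export loop can also be scripted against DaVinci Resolve’s built-in Python scripting API rather than run by hand. The sketch below is only an illustration of that idea, not the procedure used for this review; the timeline names, output folder and format/codec strings are placeholder assumptions (valid codec strings for your build can be listed with GetRenderCodecs()).

```python
# Minimal sketch: timing Resolve render jobs via the scripting API.
# Assumes Resolve Studio is running with the benchmark project open and
# that the test timelines already exist under the names below.
import time
import DaVinciResolveScript as dvr  # installed with Resolve's scripting support

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()

# Map timeline names to timeline objects for easy lookup.
timelines = {project.GetTimelineByIndex(i + 1).GetName(): project.GetTimelineByIndex(i + 1)
             for i in range(int(project.GetTimelineCount()))}

TEST_TIMELINES = ["CC Only", "CC + NR", "CC + NR + Sharpen + Grain", "CC + Neat Video"]
CODECS = [("mov", "DNxHR444"), ("mp4", "H264"), ("mp4", "H265")]  # illustrative strings

for name in TEST_TIMELINES:
    project.SetCurrentTimeline(timelines[name])
    for fmt, codec in CODECS:
        project.DeleteAllRenderJobs()
        project.SetCurrentRenderFormatAndCodec(fmt, codec)
        project.SetRenderSettings({"TargetDir": "/tmp/zbook_bench",
                                   "CustomName": f"{name}_{codec}"})
        project.AddRenderJob()
        start = time.time()
        project.StartRendering()                # returns once the queue starts
        while project.IsRenderingInProgress():  # poll until the render finishes
            time.sleep(1)
        print(f"{name} -> {fmt}/{codec}: {time.time() - start:0.1f}s")
```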

Here are my results:

HP ZBook Fury 16 G10: DaVinci Resolve 18.6.4 export times (mm:ss)

                                     DNxHR 444   DNxHR 444
                                     10-bit MXF  10-bit MOV  H.264 MP4  H.265 MP4  AV1 MP4  IMF
Color Correction Only                00:53       00:48       00:31      00:30      00:33    01:19
CC + Resolve Noise Reduction         02:13       02:13       02:02      02:02      02:02    02:19
CC, Resolve NR, Sharpening, Grain    02:57       02:56       02:48      02:48      02:48    02:58
CC + Neat Video Noise Reduction      03:59       03:59       03:47      03:48      03:51    04:03

Adobe Premiere Pro
I ran similar tests inside Premiere Pro 2024 (24.1), exporting using Adobe Media Encoder. The video assets are the same as the ones I used in Resolve, but I used Adobe’s noise reduction, sharpening and grain filters instead of Resolve’s and Neat Video.

Here are the Premiere Pro Results:

HP ZBook Fury 16 G10: Adobe Premiere Pro 2024 (Individual Exports in Media Encoder)

                                     DNxHR 444   DNxHR 444
                                     10-bit MXF  10-bit MOV  H.264 MP4  H.265 MP4
Color Correction Only                01:27       01:26       00:45      00:48
CC + NR, Sharpening, Grain           25:47       57:17       46:46      59:21

HP ZBook Fury 16 G10: Premiere Pro 2024 (Simultaneous Exports in Media Encoder)

                                     DNxHR 444   DNxHR 444
                                     10-bit MXF  10-bit MOV  H.264 MP4  H.265 MP4
Color Correction Only                02:15       03:47       03:22      03:22
CC + NR, Sharpening, Grain           30:52       01:08:16    01:03:30   01:03:30

These results are definitely competitive with desktop-size workstations. What makes laptop-size components difficult to design? Heat dissipation and size. HP labels its heat dissipation technology as Vaporforce Thermals. That’s a fancy way of saying that HP takes pride in how it designs its fans and heat spreaders to keep the system as cool as possible, even when rendering hours of content in multimedia apps like Resolve.

HP does a great job at keeping the HP ZBook Fury 16 G10 cool to the touch, which isn’t always the case for workstations. Also, the tool-less design of the HP ZBook Fury 16 G10 is amazing. With one switch, you can remove the bottom panel and begin diagnosing, replacing or upgrading components with little technical know-how. The ease of disassembly is what keeps me loving HP’s workstations. The quickest way to leave a bad taste in my mouth is to block self-repair and upgrades, or to make them extremely difficult; it just feels wrong. Luckily, HP makes it easy.

With such an impressively powerful mobile workstation comes a large price tag: the HP ZBook Fury 16 G10 I tested retails for just over $9,000 before taxes and shipping. Yikes. But for what’s under the hood of the HP ZBook Fury 16 G10, you are essentially getting desktop power in a small form factor. The battery that comes with the Fury is great: I turned off any power-saving settings to ensure I was running at full speed, and I was able to get about 2.5 hours of run time while running the PugetBench for Creators benchmark utility on a loop. That is essentially constant video editing and rendering.

While that runtime might seem short, it is actually pretty long when running at full speed. But obviously, staying plugged in is your best option when doing multimedia work. If security is important to you, and we know it is, then HP’s Wolf Security is loaded with protections. You can find out more here.

Summing Up
In the end, the HP ZBook Fury 16 G10 is a pricey but powerful mobile workstation that won’t leave you wishing for a desktop. Add a little docking setup with a couple monitors, and you’ll be flying through your color correction in Resolve, noise reduction with Neat Video or video editing in Premiere Pro.

Honestly, the backlit RGB keyboard seemed like a novelty at first, but I found that I really enjoyed it. Definitely check out the MIL-STD 810H-tested HP ZBook Fury 16 G10 if you are in the market for the highest of high-end mobile workstations, one that can play RAW 4K media with little interruption.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and Uninterrupted: The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Colorist Chat: Company 3 Colorist Yoomin Lee

Yoomin Lee is a colorist at Company 3 London, a global company providing post services across feature films, episodic television, commercials, gaming and more. “The freedom to work on all sorts of projects, big or small, is a major thing that attracts talent to work for them,” Lee says.

We reached out to Lee to find out more about how she works and what inspires her.

As a colorist, what would surprise people the most about what falls under that title?
Most people think I’m a hairdresser when I say I’m a colorist, so it surprises them that my job exists.

Are you sometimes asked to do more than just color on projects?
Alongside color, sometimes I’m asked to do some compositing. Beauty work, cleaning and giving some texture to images are also common in color grading. Our tools have become so powerful that they allow us to do so much with them.

Yoomin Lee

Louis Vuitton Speakers

What are some recent projects you’ve worked on?
The Louis Vuitton Speakers and Earphones films for Jacob Sutton, Anton Corbijn’s feature film Squaring the Circle, L’Occitane’s global campaign “Art of Gifting” and the 2023 Waitrose Christmas campaign directed by Autumn De Wilde.

How do you prefer to work with the DP/director?
Ideally, I like to work with them in person, as it allows us to build relationships and interact in real time. However, as times have changed, remote working has become more popular and is another tool in our arsenal to collaborate with filmmakers.

How do you prefer the DP or director to describe the look they want? Physical examples, film to emulate, etc.?
Rather than verbal recommendations, visual references help me understand the base of what the client is looking for. From there, we can work together to build out the piece of work.

After working together on several projects, it becomes easier over time as you understand their vision and become familiar with their taste.

Do you have any suggestions for getting the most out of a project from a color perspective?
It’s always helpful when clients are clear about what they want; however, I think leaving some room to explore in color-grading sessions is good. Sometimes, we get a shot-by-shot reference from clients, which could limit what you can achieve because it’s hard to judge if that is the best approach until you see something different or better.

Do you provide LUTs for on-set?
Sometimes, if requested, but we tend to keep them more generic rather than extreme so that they work for most shots.

How does your process change when working on a film versus episodics or commercials?
With long-form, I tend to spend more time creating the basic overall look rather than finessing shot by shot too much, whereas with commercials, we have more time to finesse the details.

What system do you work on?
I use both FilmLight Baselight and Blackmagic Resolve.

What’s your favorite part of color grading?
That would be creating beautiful images, taking an ordinary picture and bringing life to it. No two days are the same.

Why did you choose this profession?
One of the main reasons is that I love bringing life to images. However, I also like that it’s technical as well as creative. It’s such a satisfying thing to see the transformation of the projects after color grading.

I only found out the role of a colorist existed when I started working as a junior in a post company, and I have been fascinated since then.

What would you be doing if you didn’t have this job?
I would likely have become an architect. Shapes and forms of buildings tell stories, and architecture can be visually pleasing. There are synergies with being a colorist.

U2 “Joshua Tree”

What is the project that you are most proud of?
“The Joshua Tree” 30th anniversary concert visuals for U2. It was a joy to participate in this project and to see the visuals live on the 200-foot-wide screen behind the legendary U2 in 39 cities worldwide.

Where do you find inspiration? Art? Photography? Instagram?
Everywhere! Anything visual has been my inspiration: photography, paintings and films. Over the years, since I became a colorist, I have observed more how light changes at different times of the day all over the world.

Is there a film or show that sticks out to you as an example of amazing color?
There are so many, but most recently, I saw Poor Things, directed by Yorgos Lanthimos and color-graded by my colleague at Company 3, Greg Fisher, and it looked amazing.

Can you name some technology you can’t live without?
Like many others, my phone allows me to take snapshots of inspiration and look through different social media channels to find inspiration. Still, I’m pretty good at spending only a little time on Instagram and social media.

What do you do to de-stress from it all?
I don’t have a television at home, mainly because I’m scared to see things I graded look different on a domestic monitor. Still, I’m in front of a monitor all day long, so while I’m at home, I try to avoid the environment that surrounds me at work. I’m trying to live an analog life as much as I can.

Getting the Right Look for Oscar-Nominated Anatomy of a Fall

Securing the Palme d’Or at the Cannes Film Festival and clinching five Oscar nominations, Anatomy of a Fall is a gripping family saga unraveling the startling collapse of an ordinary household. Justine Triet’s fourth directorial venture paints a dizzying portrayal of a woman accused of her husband’s murder, set amid a suffocating ambiance. The film was graded at M141 by colorist Magali Léonard of Chroma Shapers, who shares her workflow here, discussing both the artistic and technical details.

“Justine and director of photography Simon Beaufils reached out to me early on, even before the filming commenced, during the camera trials. I had previously collaborated on the grade for Justine’s Sibyl, a project where Simon also served as the lensman. This marked my second project with Justine and sixth with Simon,” says Léonard.

The director and DP worked closely with Léonard, who worked on Blackmagic DaVinci Resolve Studio, throughout the entire post process, making sure the film’s feel translated to the screen.

“Justine envisioned a raw, contrasting narrative embracing imperfections and flaws, aiming to create something visceral and sensual,” explains Léonard. “This vision particularly manifested in the trial sequences, characterized by flushed skin tones, sweat and tangible fatigue.

“I translated that vision alongside Simon’s directives into the visuals, meticulously attending to facial expressions and skin tones,” she continues. “We closely collaborated in crafting a visual identity, starting with extensive camera trials during preproduction involving hair, makeup and costumes.”

During the initial phases, Beaufils conducted tests on 2-perf 35mm film, allowing Léonard to emulate the film’s appearance when calibrating the digital camera tests. “This served as the cornerstone to unearth the film’s ambiance and visual identity,” she says.

Triet and Beaufils opted for a large-format camera paired with Hawk V-Lite anamorphic lenses, despite the film’s 1.85 aspect ratio. “The anamorphic lenses infused a richness of colors, flares and distinct blurs, softening the digital sharpness of the sensor. Simon was a pleasure to collaborate with, crafting exquisite imagery encapsulating intricate emotions,” she adds.

“My approach to the visuals was iterative, manipulating contrast through DaVinci Resolve’s custom curves, followed by adjustments in colors, saturation, and highlights. Subsequently, I introduced grain to impart a more pronounced aesthetic, a process initiated from the rushes onwards, laying the groundwork for the film’s overarching mood,” Léonard shares.

Refinement and Collaborative Efforts
In the later stages of the digital intermediate process, Léonard revisited the nodes used to establish the visual identity for fine-tuning. “I ventured into more daring suggestions, striving to refine highlights and specular lights while infusing subtle diffusion. For instance, we enhanced the saturation in the blues while preserving the rawness inherent in the set design and costumes,” she elaborates.

For the courtroom sequences, the grade underwent an evolution mirroring the unfolding of the trial toward a denser, golden atmosphere. “It was crucial to accentuate the actors’ facial expressions while retaining the initial appearance of a slightly rugged and textured visual, a tangible and vibrant material,” says Léonard. “I embraced the notion of allowing the visuals to unfold their utmost potential as the narrative progresses.

“Throughout the grading process, we frequented the Max Linder Cinema to screen the film under theatrical conditions, gaining insights into the visuals and enabling me to make finer adjustments to the final look. For instance, through these screenings, we discerned that certain scenes would benefit from heightened saturation or contrast,” she concludes.

Poor Things’ Oscar-Nominated Cinematographer and Editor

By Iain Blair

Lavish, audacious and visually stunning, Yorgos Lanthimos’ Poor Things tells the fantastical story of Bella Baxter (Emma Stone), a young woman brought back to life by the brilliant, daring scientist Dr. Godwin Baxter (Willem Dafoe). The film just won five BAFTAs and scored an impressive 11 Oscar nominations, including nods for cinematographer Robbie Ryan, BSC, ISC, and editor Yorgos Mavropsaridis, ACE, who were both previously nominated for their work on Lanthimos’ The Favourite.

I spoke with Ryan and Mavropsaridis (aka “Blackfish”) about making the film and collaborating with Lanthimos, who also got an Oscar nom.

Robbie Ryan

You’ve both worked with Yorgos Lanthimos before. Was this process very different, or was it pretty much the same way he always works?
Robbie Ryan: It was my second time working with Yorgos, and I felt like the approach was similar. But I was a bit more tuned in to his thinking process, which is quite loose from a filming perspective. He likes to get the things he needs in place and then elaborate or experiment, maybe search for something new. We’re not too set or stringent in our approach. It’s pretty loose.

What about you, Blackfish?
Yorgos “Blackfish” Mavropsaridis: We have been working together for almost 25 years. I know his approach, and I know that during the assembly I need to put things in order according to the script. That’s not the main work… it’s just for me to understand the material. Then, when he comes back, we start looking at the sequences and trying things.

Yorgos “Blackfish” Mavropsaridis

I’d say it was an easier process for Poor Things in the sense that I know him so well. And we had to focus on a specific character, which gave us the path to follow. Of course, having said that, films are not easy in that sense. We always try to deconstruct the script to take it off the paper and make it more interesting. We also involve the viewer in different ways than a classical Hollywood film does.

Robbie, is it true you shot this whole thing in a studio in Budapest?
Ryan: Yes, that’s correct. We shot 35mm celluloid and used a bunch of stocks. We shot some old VistaVision as well, which is a lovely format, and black-and-white and Ektachrome.

What were the main challenges of shooting a film like this, where nearly everything was constructed? I assume you were quite involved with set design and the like, which is unusual?
Ryan: Yes, we basically did 12 weeks of prep on this film. That was so we could build the sets and watch their design. The production designers, Shona Heath and James Price, built five or six big sets and used Unreal Engine to create them in 3D. Yorgos [Lanthimos] and I would look at what they were building in a 3D world. It was amazing to walk onto that set a few weeks later and see it for real. It looked exactly like those sets built in 3D. That was the world we prepared, and it was amazing.

So you were able to previz it all like that?
Ryan: Exactly, as Unreal Engine is previz in a way. It’s a 3D program that lets you look around at every angle. For instance, for the ship, you could really look at every corner, and if one corridor on the ship was a little bit too skinny, we could make it a bit bigger so we could fit a dolly on it. There was lots of that sort of preparation, and it helped a lot.

How did that affect your lighting approach?
Ryan: It took a little bit more work from a lighting perspective. I had to light it a bit more because we were indoors and in studios. We took the same approach that we did on The Favourite, which was not to use lights on the set. We lit the studio sets with a sky outside all the buildings, and that gave me confidence that I was going in the right direction because that is the way Yorgos [Lanthimos] likes to work. We just had to create something that resembled a real environment in an interior space.

Blackfish, were you on the set at all?
Blackfish: No, I was in Athens from the beginning. The only time I went on-set was for The Lobster, and it wasn’t a pleasant experience for me. I prefer to see it objectively as it comes in rather than go to the set and get influenced by the atmosphere, the actors and all these things. So I stayed home in Athens, and they sent me dailies. I’d get the negatives and black-and-white the next day, but the Ektachrome took a couple of days more since it had to go to Andec, a lab in Berlin, and then to Athens. But it was a really fast process. Yorgos and I don’t talk at all when he’s on-set, or very rarely.

Robbie, I know you’ve worked with colorist Greg Fisher at Company 3 before. How early on did you start working with him on LUTs and the look for this?
Ryan: We didn’t do that actually, and that was a bit of a mistake. We had a dailies grader in Budapest, which was driving Yorgos [Lanthimos] a bit mad. Film doesn’t need so much grading now. But Greg did a lot of work with us from the very beginning, helping us out with early tests about nine months before we shot. But then, when we went to Hungary, we used a Hungarian lab and a Berlin lab. They were doing the dailies for us there, and it wasn’t quite what Yorgos [Lanthimos] was expecting, so he got a little bit frustrated by it.

The bottom line is that we should have done a show LUT, but we didn’t know that’s what was meant to be done, so we kind of learned the hard way. The film still looked nice in the rushes, but it just wasn’t quite what we thought it would be. When we went into the final grade with Greg at the beginning of last year, we spent three weeks grading the film. He’s got quite a thorough process. We went through every sequence one by one and didn’t review the whole film until we got through all the sequences. We spent a week and a half going through everything, then we watched the whole film back, and then we went deeper again. Yorgos likes to go quite deep into the color grading. Just recently, we made a 35mm print of the film. That was an interesting coda to the whole grade process, and it was quite a lot of work as well.

Robbie, how long was the shoot, and what was the most difficult scene to shoot and why?
Ryan: We shot for about 50 days. The scene when Bella comes out of the hotel and walks around Lisbon was pretty difficult because it was a big lighting kind of environment. The set was great, and it was amazing to walk around, but it was difficult to photograph, so I think we struggled a bit on that.

Blackfish, walk us through the editing process when you sit down together with Yorgos.
Blackfish: Two weeks after shooting finished, I had an assembly ready, but it was so long there was no point watching it as a whole. We just went through sequences and then refined the scenes exactly as they are in the script order. We took care of the actors’ performances — which one Yorgos liked best and how the emotions were interpreted in each scene. When we have a good assembly or first cut, then we start experimenting, somehow deconstructing what we have done, discussing, “What if we start with this scene, not the other one, and then what does that give us for the next scene?”

Then there are points where the exposition takes many scenes to develop. For example, there’s a dinner scene with Max, Godwin and Bella. In the assembly, the scenes appear in linear order in the continuity of time and space, so we found ways in the edit to go to previous scenes and then cut them in, or go to later scenes to create a sequence. We developed this method of intermixing scenes and making them a sequence on Dogtooth and have been using it ever since — and very interesting ideas arise. For example, you can say the same thing in the scene — or say it even more forcefully with a thought — if it’s combined with dialogue from another scene. Of course, that’s quite difficult.

Sometimes you have to go through a lot of edits to find it, to refine it, but in the end, we get it to where we want it. We try to get around problem areas and keep the phrases or the moments that we need and then cut them with other things to pick up the pace. That editing technique also creates internal combustion. It provides momentum so the viewer doesn’t get ahead of us. We sometimes need to surprise viewers, and it has to do with how we think or how we want the viewer to feel or think at that moment. So it’s a whole procedure.

How long was the edit in the end?
Blackfish: It took about eight months.

What was the most difficult scene for you to get just right?
Blackfish: Technically, it was the dinner-and-dance scene. The actors had done a lot of movements. The camera was moving all the time, and there were also some static shots. The difficulty was to keep the eyeline correct in the 180-degree space so as not to lose the audience. It was difficult to find the best performance moments. All the other things, of course they’re difficult, but it’s different to edit a difficult scene like a dance. It’s also more fun and more satisfying.

Did you use a lot of temp sound?
Blackfish: Not at all. The music was done much earlier than the filming, and composer Jerskin Fendrix had written the theme of the film. Of course, it was not the final music, but we had the same thing played differently or with a single instrument or with a big orchestra, and we had a lot of options to try. We could cut the music if we wanted to speed it up, or at other times we could edit the film according to the music. So having the music gives you a lot of good opportunities. As for the sound design, my assistant always uses external sounds. We need to have that for me and Yorgos to see how it works, to make sure there are no gaps or anything. We cut on Avids with Nexis storage, and we had about 12 terabytes.

Robbie, I assume you had to coordinate with visual effects on-set, as there’s quite a lot of VFX.
Ryan: Yes, we had an on-set supervisor from Union VFX, which did all the VFX [and picked up a BAFTA for their work], and he would let us do what we wanted. For instance, Yorgos [Lanthimos] didn’t want to use greenscreen, so when we were filming the hybrid animals, even though the VFX guy liked the idea of using greenscreen, we didn’t do it because we could just rotoscope it. We shot it twice, one animal at a time, and then the two passes were comped together.

Yorgos was trying to do it with older cinema technology, like backdrops and moving-image backdrops, and we had LED walls as a backdrop and painted backdrops. I think that was really a nice way to do it because the actors felt like they were a little bit more immersed and didn’t have to worry about getting it right for VFX, which sometimes happens.

That was a really nice atmosphere to work in. Yorgos has such a knowledge of cinematography and what you can do VFX-wise; he was confident that he would get it in post, and they did, indeed, get it in post. They went through quite a long process of trying to perfect all that, there were quite a few incarnations, and it was a very VFX-heavy job, but they got there in the end. Union did a great job.

How would you each sum up the experience?
Ryan: It’s been a long journey, and it was never in any way boring. It’s always been fun. Yorgos likes a fun film set to work in. He doesn’t like to have any sort of tension at all, so we have a crew around us that are very relaxed, and I really enjoyed it. We worked hard, and sometimes it didn’t go right, but we always found a way, and it was a really exciting film to work on.

Blackfish: It’s new all the time and interesting working with Yorgos. We’ve almost finished editing the last film he and Robbie shot in New Orleans, and I guess he’s planning the next one in May. So I’m going to continue the experience.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Puget Systems Debuts Custom Laptops and SDS Storage

Puget Systems has expanded its product offerings beyond custom desktop workstations into the mobile computing market with the introduction of an entirely new category of custom mobile workstations.

Debuting at this year’s HPA Tech Retreat in Palm Springs, the new Puget Mobile 17-inch will feature high-performance hardware with Intel’s Core i9-14900HX CPU and Nvidia’s GeForce RTX 4090 mobile GPU, all built into a notebook chassis. The 17.3-inch QHD screen has a 240Hz refresh rate and high color accuracy. This combination of high-performance components makes the Puget Mobile 17-inch a good solution for content creators who demand performance, reliability, quality and ultra-smooth workflows in a mobile form factor.

According to Puget Systems, this move signals the expansion of its strategy to provide broader, more comprehensive solutions for its users’ workflow and performance requirements as they continually seek more flexible, reliable and powerful systems. Based on customer feedback, Puget is looking to partner with companies its users trust for white-glove service, support and industry-specific expertise.

Throughout the early development process of the new Puget Mobile 17-inch, the Puget Labs and R&D teams worked closely with select users from multiple industries to collect feedback and ensure they were on track.

“This laptop is about as close as you can get to the performance of a PC tower while actually having something that still works as a laptop,” reports Niko Pueringer, the co-founder of Corridor Digital, who has been using Puget computers for years. “And it provided all the qualities I’d expect out of a Puget system. Oh, and I also like that it’s not loaded up with promotional bloatware…

“There are a lot of machines out there with high specs. Anyone (with enough money) can buy a 4090 and sling it in a case,” continues Pueringer. “What makes Puget special is that all the supporting pieces get the attention they deserve. With Puget, I know that I don’t have any hidden compromises or bottlenecks. All my USB ports will work at the same time. The heat management is capable of handling 100% loads for extended time. I know that all the pipes between the shiny GPUs and CPUs are big and beefy and ready to handle anything I throw at it. This laptop was no exception.”

The Puget Mobile 17-inch custom laptops will be available for configuration for a wide range of applications beginning in Q2.

Embracing Storage, MAM and Archiving
At HPA, Puget has also debuted a new family of custom software-defined storage (SDS) solutions. The new Puget Storage solution, built in partnership with OSNexus, uses OSNexus’ QuantaStor platform to provide scalable and agile media asset storage for both on-site and remote users.

Available in a 12-bay and a 24-bay 2U form factor, Puget Storage solutions are capable of up to 1.5TB of RAM and provide growing and established studios with simple, flexible storage with end-to-end security. These scalable, agile media asset storage solutions are ideal for post workflows, media asset management applications and archival services with stringent requirements for the ideal combination of capacity, performance, security and scalability.

Partnering with OSNexus to integrate its QuantaStor platform provides Puget Storage users with a number of key benefits, including:

  • Storage grid technology: Grid technology unifies management of QuantaStor systems across racks, sites and clouds.
  • Security: Advanced RBAC and end-to-end encryption support; complies with NIST 800-53, 800-171, HIPAA and CJIS; FIPS 140-2 L1 certified.
  • Hardware integration: QuantaStor is integrated with a broad range of systems and storage expansion units, including Seagate, Supermicro and Puget Systems rackmount storage platforms for media and entertainment.
  • Scalability: Integrated with enterprise-grade open storage technologies (Ceph and ZFS).
  • Unified file, block and object: All major storage protocols are supported, including NFS/SMB, iSCSI/FC/NVMeoF and S3 (see the sketch after this list).
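Because QuantaStor exposes an S3-compatible object interface alongside its file and block protocols, familiar cloud-style tooling can talk to a Puget Storage system directly. The snippet below is a minimal, hypothetical sketch of archiving a finished render to such a bucket with Python’s boto3; the endpoint URL, credentials, bucket and file names are placeholders, not anything Puget or OSNexus has published.

```python
import boto3

# Placeholder endpoint and credentials for an S3-compatible QuantaStor bucket.
s3 = boto3.client(
    "s3",
    endpoint_url="https://storage.example.local:8443",  # assumed on-prem endpoint
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Push a finished render into an archive bucket, keyed by episode.
s3.upload_file("renders/ep101_final.mov", "archive", "ep101/ep101_final.mov")
```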

The new Puget Storage SDS solutions will be available for configuration for a wide range of applications beginning in Q2.

 

 

 

Behind the Title: Treatment TD and Post Pro Brandon Kraemer

Brandon Kraemer is a technical director at Treatment, a creative content design studio that specializes in live music and large-scale multimedia experiences.

While his primary role is technical director on larger projects like U2:UV at MSG Sphere, he is also part of the creative team and works in the disciplines of animation, editorial and color.

What does your job as TD entail?
For most projects, being the technical director means understanding all the technical details of the project and overseeing how the tech dovetails with the creative intent.

What would surprise people the most about what falls under that title?
It’s everything technical that we might have direct control over and everything else that we don’t. There are so many moving parts, both big and small, and it’s crucial that we’re aware of and able to communicate about all of them. This includes hardware, network administration, LED processing, Disguise media server specification and endless software variables. It’s also about the logistical and human side of all this tech, which covers a wide range of circumstances that require diplomacy and tact.

Can you name some recent projects you have worked on?
The latest project Treatment has completed is U2:UV at the Las Vegas Sphere. This was a monumental undertaking that occupied the last 12 months of my life. I’m very grateful to U2’s creative director, Willie Williams, for the opportunity, and I’m extremely proud of the work that producer Lizzie Pocock and the Treatment team delivered. There is no other show like it.

Treatment also produced the content for Weekends with Adele, which my friend and colleague Matt Askem directs. That was quite a technical challenge as well and is one of the most beautiful shows I’ve been a part of.

How has your section of the industry changed since COVID? The good and the bad?
There has been a huge amount of growth in virtual production, which I’ve been fortunate to be involved with, and there is a lot of development from that sector that has benefited other parts of the live entertainment industry.

There may have been a painful break from live shows, but the innovation certainly didn’t pause.

Do you see some of these workflow changes remaining with us going forward?
Absolutely. I think what is no longer valuable from the pandemic era has already been left by the wayside, and the better workflows and tools developed in that era are with us and improving every day.

If you didn’t have this job, what would you be doing instead?
I work in a lot of different disciplines. I’m also a colorist with an Academy Award-nominated film on my resume. I’ve been the director of post production and senior editor for a post studio [called Lightborne]. I see my career as an evolving journey. If I wasn’t doing this right now, I’d be doing something that runs parallel to it.

Why did you choose this profession?
The right tools and the right opportunities came along, and I really applied myself. I wanted to do something artistic, so I just kept looking for challenges and kept saying yes to most opportunities, even if they were indirectly creative. I’ve ended up blending my creative side with my technical side, and it’s a path that seems to be undeniably “me.”

What’s your favorite part of the job?
At heart, I am a creative problem-solver, and I love it when I’m able to chart a course for the wider team that proves to be the right path.

What is your least favorite?
Repeating work.

Name some technology you can’t live without.
Sadly, I’d say my iPhone. I think we’re all in that boat.

I don’t think we’d be able to bring big visions like U2:UV to life without tools like Disguise. Disguise’s d3 media server platform is what Treatment trusts, and without that toolset, we would have been creatively hamstrung.

Last, something fun… my Hammond organ and vintage Leslie 145 speaker! Life isn’t all work, after all.

Do you listen to music while you work? Care to share your favorite music to work to?
Yes, I do sometimes, and it varies with what task is at hand. Usually it’s ambient pieces by composers like Harold Budd, Brian Eno or Nils Frahm if I need to think strategically and focus. If I am in the creative execution mode, where I need energy and focus, then it’s something more intense, like early prog-rock Genesis or math-rock artists like Battles.

What do you do to de-stress from it all, aside from your Hammond?
Music and hiking. I’m a multi-instrumentalist, and music is a mentally stimulating activity with immediate feedback. Hiking… because it’s all about turning off the mind and experiencing nature in real time.

Both are really important to me, and each has its own space.

Would you have done anything different along your path?
I’m incredibly fortunate to have had the opportunities I’ve had. While the journey hasn’t always been easy, I have no reason to think retrospectively about changing things.

Finally, any tips for others who are just starting out?
My advice is to listen, work hard and be prepared to take risks. Surprise yourself and be a positive influence no matter what your role.