
Category Archives: dailies

DP Linus Sandgren on Saltburn’s Shoot, Dailies and Color

By Iain Blair

Swedish cinematographer Linus Sandgren, ASC, has multiple award noms and wins under his belt, including an Oscar for his work on the retro-glamorous musical La La Land. His new film, Saltburn, couldn’t be more different.

Written and directed by Emerald Fennell, Saltburn is a dark, psychosexual thriller about desire, obsession and murder. It follows Oxford University student Oliver Quick (Barry Keoghan) who finds himself drawn into the world of the charming and aristocratic Felix Catton (Jacob Elordi), who invites him to Saltburn, his family’s sprawling estate, for a summer never to be forgotten.

Linus Sandgren

I spoke with Sandgren, whose credits include First Man, Babylon and American Hustle, about making the film and his workflow.

What was the appeal of doing this film?
It was two things. The script was brilliant; it was very suspenseful and exciting. I was drawn in by the buildup, how Emerald had it constructed, and I couldn’t stop reading. It was also very exciting for me because I hadn’t really done this type of film before. It was a unique story with a unique approach to this sort of psychopathic character — how you feel an affection for him, a sort of sympathy. It’s also so dark and funny.

I was also excited to talk to Emerald because of her work on Promising Young Woman, which I loved. Her directing of that film was excellent, and she was making very bold decisions. Then we had a call, and I was very impressed by her. She’s just so brilliant when she explains her vision, and you’re really drawn into her storytelling.

Tell us a bit about how you collaborated on finding the right look.
I typically don’t find the look [based on] different films. It’s more abstract than that, and a good approach is to just talk about it and see what words come up. Emerald said things like, “Desire or unachievable desire. Beauty and ugliness. Love and hate.” Suddenly you get images in your head, and one was of vampires. The family are like vampires, and Oliver is obviously a vampire who loves them so much he just wants to creep inside their skin and become them.

So there was some sort of metaphorical layer I was attracted to, and Emerald had a lot of vision already in terms of visual references — from Hitchcock movies about voyeurism to silent horror movies and Caravaggio paintings. We grounded it in some sort of gothic vampire core, but the story couldn’t just start there. We had to fool the audience a little bit and not explain that right away but have imagery that could be in that vein. The language was basically that the days could be sunny and bright and romantic, while the nights would be dangerous and dark and sexy. It was these discussions we had early on that inspired the lighting style and the compositions.

Tell us more about the composition.
Emerald wanted it to feel like the house was a dollhouse that we could peek into, and she wanted it to have a square format. It all made sense to me with that in mind, as well as the voyeuristic approach, where you focus on one singular thing more than if you go scope. It feels like you can see much more that way, so that allowed us to do things in a more painterly style. As soon as we started shooting that way, we knew we were right using an aspect ratio of 1.33:1 because we felt that we could be more expressive.

So compositions were a little bit as if you're looking at an oil painting, a classic type of composition, and we'd block the scenes within a frame like that without really cutting, or we'd go in really tight on something. It was sort of that "play with it a little bit" thing. Also, the approach is slightly more artful than cinematic. I feel like we thought of the shot list in another way here. It would be more, "How can we tell this story in a single shot, and do we need another shot, and if so, what is that?" Probably that's just a really tight close-up. So we had a slightly different way of blocking the scenes compared to what I've done before. It was about creating that language, and the more you nail it before you shoot, the more it solves itself once you start working on each scene.

What camera setup did you use, and what lenses?
We shot Super 35mm film in a 1.33:1 aspect ratio, which is the silent aspect ratio. We used Panavision Panaflex Millennium XL2s. It's the same as silent movies, basically, for perf, and we used Panavision Primo prime lenses.

Did you work with your usual colorist Matt Wallach in prep?
Yes, the team was Matt Wallach (Company 3 LA) and dailies colorist Doychin Margoevski (Company 3 London). The dailies software was Colorfront’s On-Set Dailies. I have worked with Matt on dailies for many movies and lately in the DI. We set this up together, but he wasn’t able to come over to London to do the dailies, so he was involved remotely and was watching stills from the dailies Doychin did.

Tell us about your workflow and how it impacts your work on the shoot.
My workflow is always that the film gets developed and then scanned in 4K, in this case at Cinelab in London. So it's a final scan from the beginning, and we don't touch the negative again. Then it goes to Company 3 for dailies. But before the dailies are distributed, the colorist sends me stills from his dailies grading suite so I can look at the color. It's just a few stills from the different scenes, and it takes a week or two for us to dial it in. Matt gets the footage; he uses his instinct, and we apply a Kodak print emulation LUT. Then he works with the printer lights to see where he has the footage, and he does what he feels is right, with perhaps contrast or lower blacks.

He then tells me what he did, and we look at it on the stills he sends me. That’s when I’ll say go a little colder or darker or brighter or whatever. But usually after a few days we dial it in and get the look down. But, as I said, we spend a little more time in the beginning to make sure we have it right, and it also has to do with me knowing that we’re doing the right thing with the lighting — perhaps I’ll need to add more light for the next scene.

This has been our way of working since Joy in 2015, which was the first thing Matt and I worked on together with dailies. That process is really good because nowadays the iPad is like P3 color space, and it looks really good when you have it at a certain exposure, and that becomes our look. That’s why it’s so important to set that in the dailies because once we’re in the DI, I don’t want to change it. I just want to adjust things, like match the shots to each other or fix a face or do something else without changing this sort of look. The look should be there already.

That's what I like about film too; it adds something to it. I feel like I know exactly how it's going to look, but it looks 5% better or different with film because it gives me things that surprise me when I see it. It's like, oh, look at the halation there, or look at those blue shadows. There's something always going on that's hard to actually imagine, as you don't see it with your eyes, even if you know it's going to be there. So that's a nice thing. Basically, if you looked at the dailies on any of my previous films, I didn't touch them much. That's why I usually like having the same colorist do the dailies as the DI, but it couldn't be helped on this one.

Dailies colorist Doychin Margoevski was great. He's also got a great eye for darkness, and he's not afraid of letting it be dark. So as I noted, the three of us dialed it in together initially, and then he sent stills to me and Matt, and we looked at them. That way, Matt was very familiar with the footage when we came to the DI, and he's used to being with a timer and keeping track of the whole project. Matt also did the trailers, so all of that is solid control.
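The printer-lights-over-a-print-LUT approach Sandgren describes can be sketched numerically. This is a minimal illustration only: the one-point step size and the gamma curve standing in for the Kodak print emulation LUT are assumptions, not the production values (a real print LUT is a measured 3D table).

```python
import numpy as np

# Assumption: one printer point ~0.025 log-exposure units (roughly 1/12 stop);
# real lab calibrations vary.
POINT = 0.025

def apply_printer_lights(log_image, r_pts=0, g_pts=0, b_pts=0):
    """Offset each channel of a log-encoded scan by whole printer points,
    mimicking a lab-style best-light dailies adjustment."""
    offsets = np.array([r_pts, g_pts, b_pts], dtype=np.float64) * POINT
    return np.clip(log_image + offsets, 0.0, 1.0)

def print_emulation(log_image, gamma=1.7):
    """Crude stand-in for a film print emulation LUT: a simple contrast
    increase applied after the printer-light offsets."""
    return np.clip(log_image, 0.0, 1.0) ** gamma

# Dummy mid-grey log frame, warmed two points red and pulled one point blue.
frame = np.full((2, 2, 3), 0.5)
graded = print_emulation(apply_printer_lights(frame, r_pts=2, b_pts=-1))
```

The key property is that the creative adjustment lives in a handful of per-channel offsets applied before a fixed look, which is why the dailies grade can be described in a few numbers and reproduced exactly in the DI.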

I heard that you shot all the stately home interiors on location at just one house?
Yes, it was a 47-day shoot, all done in the one country house and in a nearby country estate for some exteriors, like the bridge scene. Otherwise, all the exteriors and interiors are at the same house. Then we shot at Oxford and near Oxford for some interiors, and then London. We built only one set, which was the bathroom. That was built inside of a room, and the two rooms next to it were Oliver’s and Felix’s bedrooms. They were completely painted and dressed and made up as their rooms, as they didn’t look that way at all when we came in. It’s the red corridor that was important going into the bathroom, and then the bathroom and then the rooms.

 

I assume the huge maze was mostly all VFX?
Yes, the whole maze is visual effects combined with the practical. When we’re down there walking around, it’s all practical, and we had these hedge walls that were moved around so we could get through. The center of the maze with that big statue in the middle was built by production designer Suzie Davies and her team. It was all VFX for the big, wide exterior overhead maze shot and the wide shot from the windows. VFX supervisor Dillan Nicholls and Union did all the effects.

What was the most difficult scene to shoot and why?
That’s a good question. I think the scenes of Oliver’s party. We had to be careful with the property, so we couldn’t drive around too many condors or cherry pickers, and we had to shoot different scenes over a few nights all over the place — from one end of the house to another end of the garden. We would be inside of the maze and outside at the discotheque or inside at the red staircase. And all of that had to be prelit to work 360, basically.

It was daunting to light, but we could eventually position lights and condors and sneak them in from other angles. So it was a little complicated. We had to plan it out, but thanks to the really good special effects department, we could fog it all up. Suzie Davies helped with fire flames so we could send practical lights in there to make it all look like a big party.

Are you happy with the way it turned out?
Yes, I’m really proud of it. It’s a special film for sure, and it was a really fun shoot… and different. It’s so refreshing to have a director that dares to do what you think is right, just the way you want to, so you don’t have to restrict yourself. I love working with Emerald. She’s very fun and, I think, brilliant.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Scoop

FutureWorks Uses Next-Gen Color Workflow on Netflix’s Scoop

Mumbai, India's FutureWorks created a new pipeline for Netflix India courtroom drama Scoop, taking imaging data all the way from the set to the edit suite. As well as creating a new workflow that boosts efficiency while ensuring quality visuals, FutureWorks covered the entire picture post process and rental services on Scoop – including dailies, the online, grade, VFX and finish.

Produced by Matchbox Shots, the Hindi-language series was directed by Hansal Mehta, with Pratham Mehta serving as director of photography and Michele Ricossa as lead colorist. The show follows a prominent crime journalist implicated in the murder of a rival reporter.

FutureWorks began to develop this workflow following its live color grading work on the 2022 film Jersey. Based on the challenges experienced on Jersey and taking advantage of the lull in productions during the pandemic, the studio started to evolve its DIT process to enable the team to work more efficiently. This is particularly important on-set. The aim was to empower the colorist to work with the DP and director while they're still on set, so that any issues could be flagged before reaching the edit suite.

“We needed a process that would support everybody,” says Rahul Purav, head of color at FutureWorks. “So, we started to think about extending the imaging process beyond just color. We focused on creating an on-set monitoring process, as well as QC.”

"With Rahul, we managed the workflow of the DIT setup prior to the shooting," explains Ricossa. "I was on set the first two or three days of the shoot to check with the DP and the DIT team on how to work on the dailies and if the look was working as intended. After that, I had a short session at FutureWorks to review some of the footage on the HDR setup. A few weeks later I went on-set one more time to a different location to check if everything was holding up properly look-wise. Then, I reviewed the dailies on the private cloud streaming service, giving minor notes to the DP and to the DIT team. Everything went smoothly."

Ricossa graded Scoop using FilmLight's T-Log/E-Gamut color space. Only VFX shots were converted to ACES to facilitate the VFX workflow. "The look of the show, and the grade of individual clips later on, was shared with the VFX team thanks to Baselight's BLG system. When possible, we had a few back-and-forths between the DI and the VFX to fix issues and get the best out of it," explains Ricossa. "The most challenging part of the show was to match some of the stock footage. Baselight's tools and color management system helped a lot to achieve the grade I had in mind."

In FutureWorks’ workflow, everything is done remotely. During shooting, the systems record and monitor video signals wirelessly. This means that there’s no distraction or interruption for the cinematographer, but when required, they can talk to FutureWorks’ on-set DIT who has a studio-grade monitor under their control. This helps the directorial team to verify that everything they’re shooting is correct, while they’re still on-set. This includes color, but also extends to other areas like lenses, focus, and exposure levels.

Transcoding is monitored throughout the process to highlight any areas of concern. If there are issues, such as reflections or unwanted props in the shot, these can be dealt with at the time of shooting or flagged for fixing by the VFX team. Everything is captured as a movie file with embedded metadata so that all of the data from the shoot ends up with editorial. "It's like there's a third eye watching you and helping you while you're shooting and editing," says Purav. "As a colorist, I think it's imperative that everybody in the chain is aware of what's happening on the shoot, right from the beginning to the very end. This makes communication much more efficient, as notes from the cinematographer can be embedded into the metadata of the particular shot, which is very helpful later on in the process."
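The idea of carrying a cinematographer's note with each shot can be illustrated with a simple per-clip record. Everything here is hypothetical (clip names, field names, and the JSON-sidecar form); FutureWorks embeds such data directly in the movie file's metadata, which this sketch only approximates.

```python
import json

def make_clip_record(clip_name, camera, lens, dp_note):
    """Hypothetical per-clip record. Real pipelines embed fields like these
    in the media file's metadata; a JSON sidecar is shown for clarity."""
    return {
        "clip": clip_name,
        "camera": camera,
        "lens": lens,
        "notes": {"cinematographer": dp_note},
    }

record = make_clip_record(
    "A001_C003",                      # hypothetical clip name
    "VENICE_1",
    "32mm",
    "flag window reflection for VFX", # travels with the shot into editorial
)
sidecar = json.dumps(record, indent=2)
```

Because the note is keyed to the individual shot, it survives transcoding and reaches editorial and VFX without a separate paper trail.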

This new process was absolutely key for the shoot on Scoop, which lasted for 100 days. A team of four people tested the system first, with FutureWorks having since streamlined the crew to three — one experienced DIT technician for on-set QC and another two for data management. All team members are very experienced and have trained for a long time so that they can integrate with each other on the shoot, ensuring that all the necessary data is captured and transcoded.

“When you take that experience on location, it’s an asset to the cinematographer, the director, and the production as a whole,” explains Purav. “Throughout the shoot on Scoop, the director and cinematographer continually came over to verify shots on the imaging cart, demonstrating that our new pipeline is already proving to be useful for the directorial team.”

The revamped pipeline — which had to meet the specifications required by Netflix productions — includes Livegrade Studio, Codex and Silverstack for transcoding, and FilmLight Daylight for rendering dailies. One of the key challenges in implementing the new workflow was understanding the protocols of each camera. If certain protocols didn’t work with the new system, the team had to find different ways to sync the data. FutureWorks also collaborated closely with manufacturers and vendors, including Sony and Codex, to troubleshoot any problems.

“While we had a few teething problems initially, we were able to work them out within a couple of days, and it was smooth sailing from that point on,” says Purav.

Since its successful debut on Scoop, FutureWorks has rolled out the new imaging process on several other projects.

 


Molinare Adds Brands and Services, Forms Molinare Creative Group

UK-based post facility Molinare has formally become Molinare Creative Group. In line with Molinare’s expansion and recent diversification into games and advertising, Molinare Creative Group has been formed to streamline Molinare’s business lines and provide an end-to-end solution to the post industry.

As part of the new structure, Molinare Creative Group has acquired digital dailies and digital imaging technology company Notorious DIT, which will operate as a separate division of the group. Founded by Michael Pentney, who will remain as the company’s managing director, Notorious DIT’s credits include The Gentlemen, Peaky Blinders and Heartstopper.

Additionally, Molinare has launched two brands, Sound Warriors and Voice Molinare, that provide external audio services for the games, advertising, TV and film communities. Sound Warriors provides Foley, sound editorial, field records and re-recording mixing services for games and for film and TV clients outside of those undertaking full post at the facility.

The Sound Warriors team is made up of audio engineers and artists whose credits include the AAA games God of War: Ragnarok and Ghost of Tsushima as well as films Fantastic Beasts: The Secrets of Dumbledore and Netflix’s The House. Led by chief creative officer/sound supervisor Glen Gathard, the company is currently carrying out field records in Finland for an animated feature film as well as car records for an AAA game in development.

Voice Molinare (Voice) streamlines the audio recording services under one brand, providing ADR, voiceover, dialogue direction, head-mounted capture (HMC), performance capture, crowd records and casting across the games, film, TV and advertising industries.

Voice’s HMC and performance capture services are supported by industry partner Centroid out of the purpose-built studio in Molinare’s HQ on Foubert’s Place, Soho. Sound Warriors and Voice will both be located between Molinare’s primary premises on Foubert’s Place and its Poland Street building a short walk away.

Molinare TV & Film will continue to focus on delivering post services for the film and scripted and unscripted TV community, including dedicated offline facilities, picture and audio finishing, QC, mastering and delivery.

Finally, Molinare is officially bringing affiliated audio post facility Pip Studios into the group, providing clients with localization mixing and services for the film, TV and gaming communities. It was founded in 2020 by Ally Curran, Mark Sheffield and Molinare CEO Nigel Bennett, who is currently chairman of Pip Studios. Sheffield will retain his role as managing director alongside Curran, who remains operations director.


NAB 2023: Colorfront Demos 8K/HDR Streaming Features

During NAB, Colorfront was showing its Transkoder and its streaming and dailies systems at the Conrad Hotel. The company was also presenting "live" streaming of 8K footage using only hotel broadband in a setup featuring Colorfront Transkoder running in AWS. The solution was reading, debayering and processing 8K Sony Venice 2 and Red RAW camera footage directly from AWS S3 cloud object storage and streaming 8K video to Colorfront Streaming Player, running on a Mac Mini with an HDMI 2.1 video connection and displaying the results in HDR on an 8K LG 77-inch OLED TV.

Streaming
Colorfront Streaming Player is video receiver software for critical remote viewing, with support for professional video I/O devices from AJA and Blackmagic Design. Streaming Player has undergone significant improvements, including the introduction of a shared pointer feature with minimal latency for real-time collaboration and a new browser-based viewer option for optimized user experience.

Additionally, there are two streaming server products available: the compact, 1RU Colorfront Streaming Server appliance and the software-only Colorfront Streaming Server Mini for Mac/PC laptops and workstations. The Advanced Streaming Gateway and Stream Manager ensure secure, predictable streaming to multiple destinations. With streaming built into all Colorfront products, the company’s On-Set Dailies, Transkoder and QC Player applications can all stream in 8K, whether locally or from the cloud.

Colorfront streaming products have been audited for security and certified by several major Hollywood studios to securely stream prerelease content, using AES 256-bit encryption and enterprise single sign-on user authentication, with the ability to eject individual viewers. Security features also include forensic audio/video watermarking, visible spoilers, and burn-ins of individual IP addresses and session IDs.

Along with 8K/HDR capability, the latest enhancements include AWS Cloud Digital Interface (CDI) input for AWS cloud workflows; an NDI interface that allows creative digital artists to live-stream material directly from popular NLE, compositing and color grading systems without needing additional hardware; built-in Dolby Vision content-mapping with 4.0 and 5.1 metadata; and a host of new tools to manage invitations and assign users to streams.

Colorfront Streaming Player was recently deployed in The Culver Theater, the first US theater to showcase a Samsung 8K HDR Micro LED IWA wall.

“We are thrilled to now have the ability to play out 8K camera RAW files to our 8K HDR cinema screens directly from our Virtual Private Cloud in AWS,” says Jonathon Lee, head of media engineering and innovation at Amazon Studios. “This has created a new paradigm for us and helps close the loop with our global, pure-cloud production environment. We can now post our shows anywhere in the world in full fidelity.”

At NAB 2023, Colorfront is showing its Transkoder mastering, QC and deliverables solution running on a Mac Studio desktop with dual 32-inch Retina 6K Pro Display XDR monitors and featuring Colorfront’s new second-head Analyzer. In this powerful, cost-effective configuration, Colorfront Transkoder optimizes DCP/IMF mastering workflows, accelerates HEVC H.264/H.265 and ProRes read/write, and delivers rapid handling of all new RAW camera formats.

The newest version of Transkoder also supports Dolby Vision 5.1, with a focus on metadata creation and trims for cinema targets as well as Dolby Vision validation. Additional capabilities include Dolby L1 check, side-by-side split view for SDR and HDR, and additional markers for 48-nit checks.

Colorfront is highlighting a host of additional productivity and user-experience enhancements to Transkoder, such as support for the latest DCP/IMF packages, high-throughput JPEG 2000 (HTJ2K), the OpenTimelineIO interchange format, an API for editorial cut information, new playhead controls, improved image analysis and HDR reporting, and auto-detection of blanking issues.

On the audio side, Transkoder supports Dolby Atmos and comes with new dialogue-gated loudness measurement and PDF reporting tools, greater abilities to manipulate individual channels in multi-channel audio tracks, as well as timeline editing and enhancements.

Transkoder integrates automatic speech detection, enabling AI to decipher spoken content and extract the corresponding text. This allows users to overlay text as subtitles or export it for other purposes, ultimately streamlining the workflow. For cloud-based workflows, Transkoder provides ultra-low-latency uncompressed video output via AWS CDI as well as NDI.
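The speech-detection step ultimately feeds a standard subtitle export. As a generic sketch, independent of Colorfront's actual implementation, segment timings from any ASR engine can be rendered into the SubRip (.srt) format like this:

```python
def to_srt_timestamp(seconds):
    """Convert seconds to the SRT 'HH:MM:SS,mmm' timestamp form."""
    ms = round(seconds * 1000)
    h, ms = divmod(ms, 3_600_000)
    m, ms = divmod(ms, 60_000)
    s, ms = divmod(ms, 1_000)
    return f"{h:02}:{m:02}:{s:02},{ms:03}"

def build_srt(segments):
    """segments: iterable of (start_sec, end_sec, text) tuples, e.g. from
    any speech-recognition engine's output."""
    lines = []
    for i, (start, end, text) in enumerate(segments, 1):
        lines += [
            str(i),
            f"{to_srt_timestamp(start)} --> {to_srt_timestamp(end)}",
            text,
            "",  # blank line terminates each cue
        ]
    return "\n".join(lines)

srt_text = build_srt([(0.0, 1.5, "Hello."), (1.5, 3.2, "Welcome to dailies.")])
```

The format is plain text, which is why the extracted dialogue can be overlaid as subtitles or exported for other downstream uses.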

Colorfront On-Set Dailies and Express Dailies
Colorfront's On-Set Dailies and Express Dailies are both updated to support the latest digital camera formats, including ARRI Alexa 35, Red V-Raptor 8K VV, Sony Venice 2 8.6K and Blackmagic Design 12K RAW. Both also support ACES 1.3, the latest Academy Color Encoding System, including ACES Metadata Files (AMF).

Support for third-party OpenFX has been extended to embrace Sapphire VFX plugins, which, along with Invizigrain and LiveGrain plugins, enable users to add textural qualities to content. There are also new tools for handling/retiming high-frame-rate clips, improved rendering and enhanced custom burn-in.


Image processing

Colorfront Adds Connectivity for Transkoder, Dailies Tools

Colorfront was showing new connectivity and image-processing capabilities across its Express Dailies, On-Set Dailies and Transkoder lines at IBC 2022. These include integration into NDI and IMF component-based media workflows, support for AWS cloud storage and digital camera/audio formats, plus timeline editorial/user-interface tools.

Express Dailies, On-Set Dailies and Transkoder all feature 16-bit NDI video output. Network Device Interface (NDI) is a royalty-free software specification, developed by NewTek, that enables video-compatible products to deliver and receive low-latency, frame-accurate video over a local area network.

For cloud-based operations, all products offer faster import/load of media on AWS S3 cloud storage. Colorfront also continues working with Adobe company Frame.io to integrate its Dailies and Transkoder products into Frame.io's Camera to Cloud initiative, including improved connectivity and support for the latest OAuth 2 secure authentication.

For IMF component-based media workflows, validation tools for detecting incorrect Dolby Vision metadata, audio loudness problems and inconsistent letterboxing/frame-line configurations will provide information during the packaging or QC process. By reviewing auto-generated markers created for specific issue types, users can catch problems with the audio, image and metadata content of the composition. Additionally, when checking localized or supplemental VF CPLs, QC operators can isolate inserts from assets that have gone through the QC process and compare them to the base version.

Colorfront systems feature support for ARRI Alexa 35 and Red V-Raptor XL 8K VV cameras, with Mac Studio M1 Ultra-based Express Dailies and On-Set Dailies systems performing with Apple Metal optimization. Other upgrades include handling of camera lens metadata and High-Throughput JPEG 2000 (HTJ2K) for IMF. Color pipeline capabilities are powered by Colorfront Engine, plus full support for ACES viewing pipelines via ACES 1.3, the latest Academy Color Encoding System, and interoperability with third-party applications via ACES Metadata Files (AMF).

Colorfront also showed a host of user-experience enhancements across its product line, including timeline improvements for multi-channel audio editing, spatial audio workflows, a hot-key editor and usability improvements, each designed for preparation and assembly of assets. Support for third-party OpenFX enables users to add textural qualities to content using Invizgrain and LiveGrain plugins.

Additionally, Transkoder can now export Dolby Vision HEVC (Profiles 5 and 8.1) using the Dolby Encoding Engine, together with Dolby Digital Plus with Dolby Atmos content as deliverable options for HEVC encoders, in addition to quality-assurance tools for Dolby Atmos spatial audio technology.


Colorfront Boosts Live-Streaming Capabilities for Remote Post

Colorfront — a developer of high-performance dailies/transcoding systems for motion pictures, OTT, broadcast and commercials — has boosted its live-streaming capabilities for remote collaboration between creative digital artists and their filmmaking partners. This includes adding new features to its Streaming Server hardware appliance and Streaming Server Mini and Streaming Player software solutions, which live-stream secure, reference-quality pictures and audio with subsecond latency for remote collaborative review. Colorfront has also launched the Advanced Streaming Gateway for secure, predictable live-streaming traffic to multiple destinations when using Colorfront systems.

Colorfront demo’d the new features and capabilities at NAB 2022.

Colorfront Streaming Server
The Streaming Server is a 1-RU server appliance that uses Colorfront Engine to provide color-managed pipelines and color integrity on SDR/HDR materials. The Streaming Server can simultaneously stream up to four channels of 4K 4:4:4 or 4:2:0, 256-bit AES-encrypted, reference-quality video (including Dolby Vision and high-frame-rate 3D Stereo) plus up to 16 channels of 24-bit AAC or uncompressed PCM audio to remote clients anywhere in the world over readily available public internet.

Colorfront Streaming Server Mini
The Streaming Server Mini is a new software-only product for Mac/PC workstations that digital artists can use to encode and live-stream frame-/color-accurate, reference-quality work in progress over the public internet to multiple production stakeholders anywhere in the world. Artists can do this directly from their NLE, compositing and color grading systems, such as Adobe Premiere, Apple Final Cut, Blackmagic Resolve, Autodesk Flame and Foundry Nuke.

Streaming Server Mini is easily installed on either PC or Mac host workstations with AJA or Blackmagic video I/O. It works in real time with HD/2K content in SDR/HDR and supports Dolby Vision and multi-channel audio. The system uses SRT (Secure Reliable Transport protocol) and 256-bit AES encryption to ensure content remains protected. Colorfront Streaming Server Mini software is available for download now from Colorfront’s website.
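SRT connections like those described above are parameterized through URL options: `passphrase` sets the shared secret and `pbkeylen=32` selects a 256-bit AES session key. As a generic illustration (the hostname and passphrase are placeholders, and this is the open SRT protocol's addressing scheme, not Colorfront's internal API):

```python
from urllib.parse import urlencode

def srt_url(host, port, passphrase, latency_ms=200):
    """Build an SRT URL. pbkeylen=32 requests a 256-bit AES key in SRT's
    encryption handshake; latency sets the receive buffer in milliseconds."""
    params = urlencode({
        "passphrase": passphrase,
        "pbkeylen": 32,
        "latency": latency_ms,
    })
    return f"srt://{host}:{port}?{params}"

# Placeholder endpoint; a URL like this can be fed to any SRT-capable
# sender, e.g.: ffmpeg -re -i reel.mov -c copy -f mpegts "<url>"
url = srt_url("review.example.com", 9000, "correct-horse-battery")
```

Because the passphrase and key length are negotiated per connection, each remote viewer's stream is independently encrypted end to end.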

Colorfront Streaming Player
Streaming Player is Colorfront’s video receiver software for critical remote viewing and QC. It decodes the encrypted HEVC live stream from Streaming Server and Streaming Server Mini (as well as Colorfront’s Transkoder and Dailies products) and supports color-accurate, professional video output at target luminance levels on different HDR-capable consumer and prosumer displays and professional monitors — from smartphones, notebooks, tablets and Apple TV 4K to Dolby Cinema and LED cinema displays. Security includes Touch ID and Face ID biometric authentication, enterprise-level single sign-on and multi-factor authentication.

Streaming Player is available as an app from the Apple App Store for iPhone, iPad and Apple TV. It also runs on Windows and Mac platforms, for which a range of monitoring options are available: AJA’s T-Tap Pro allows users to connect to prosumer OLEDs and review in Dolby Vision, PQ, Hybrid Log Gamma and HDR10; Blackmagic’s UltraStudio 4K Mini enables 4K HDR output via both SDI and HDMI; and Apple Mac Studio, M1 Mac Mini and MacBook Pro allow connection to an Apple XDR Pro display directly via Thunderbolt 3 or to a prosumer OLED monitor via the HDMI output.

Colorfront Advanced Streaming Gateway
To optimize the streaming experience, Colorfront has developed Advanced Streaming Gateway, a private, user-owned/operated, high-performance SRT streaming gateway available on AWS cloud or in a data center of choice. It supports secure, robust and predictable streaming to many destinations simultaneously when using Colorfront systems or third-party SRT streaming servers or clients. Along with advanced security and frame- and color-accurate reliability, it provides usage reporting and service telemetry and fully integrates with the Colorfront Broker global scheduling and management service.

 


Colourlab Ai 2.0

Colourlab Ai 2.0 Adds Premiere, FCP Integration, New UI

Color Intelligence has released the public beta version of Colourlab Ai 2.0, with new features and an entirely new user interface created with the goal of making the color-grading process faster, easier and more accessible. In addition to the existing integration with DaVinci Resolve, Colourlab Ai v2.0 features support for both Adobe Premiere and Apple Final Cut Pro. There are also new subscription pricing options, including Colourlab Ai Pro and the more affordable Colourlab Ai Creator.

Colourlab Ai 2.0

In addition to the new key features, there are many general improvements, including optimizations to the Cinematic Neural Engine and color-matching functionality as well as overall speed and performance improvements that build on capabilities of Apple’s M1X processor.

“Our mission has always been to make the color-grading process more creative and accessible for everyone. Legacy color-grading tools are complex — requiring significant experience and skill to get Hollywood-quality results,” explains CEO Dado Valentic, who co-founded the company with Mark L. Pederson and Steve Bayes. “Artificial intelligence enables creators to spend more time on the creative aspects of their craft. With this 2.0 release, we have focused on making an app that empowers creators — regardless of their level of experience with color.”

New Features:
–    Timeline Intelligence
Dynamically sort shots based on their similar image characteristics using AI-based analysis from the Cinematic Neural Engine. This enables creators to work exponentially faster by “auto-sorting” shots together that require the same color grade.
–      Adobe Premiere and Apple Final Cut Pro Integration
Seamlessly round-trip timelines directly from Apple and Adobe editing systems. Users can now non-destructively color grade with Colourlab Ai even while they are still editing. Combined with traditional controls like printer lights and lift/gamma/gain, Premiere and Final Cut Pro users can now benefit from a fully color-managed pipeline.

–      Ai Powered Auto Color
Leverage the AI-based image analysis of Colourlab Ai’s cinematic neural network to facilitate a “one-click” color adjustment with consistent and superior results across your entire project. This feature can reduce the work of a dailies colorist to a single click and save hours on projects with extreme shooting ratios like reality-based content.

–      Color Tune
This intuitive toolset gives users visual grading options that enable them to fine-tune color more quickly. Color adjustments are simple, fast and easy.

–      Show Look Library
Colourlab Ai 2 ships with a new Show Look Library. Show Looks combine 3D LUTs with the parametric metadata of Color Intelligence’s Look Designer technology, allowing users to easily modify and create variations and leverage the film stock emulations included in Look Designer. You can also import any 3D LUTs, and they will be converted into a Show Look that can be further edited in Look Designer.
–      Smart LUTs
Colourlab Ai 2 introduces Smart LUTs, which answer the questions: What if a color “preset” or “filter” was intelligent? What if it was parametric and could be adjusted and changed? What if it was content-aware? Beyond a 3D LUT, Smart LUTs contain parametric values that can be adjusted in Look Designer. They also contain a unique content fingerprint created from reference frames for use with Colourlab Ai’s cinematic neural network. Colourlab Ai 2 comes preloaded with a selection of stunning Smart LUTs to get users started. Users can also modify these Smart LUTs in Look Designer and can even create a Smart LUT from any still image reference.

–      Improved Camera Matching
Instantly color-match shots from different cameras in a fully color-managed pipeline. When working with multicam projects, Colourlab Ai’s built-in camera profiles combined with its AI-based color matching will save hours of manual image-tweaking.
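The Show Look and Smart LUT features above are built on standard 3D LUT machinery. As a rough illustration of what applying a 3D LUT involves (a generic sketch, not Colourlab’s actual implementation), here is a minimal trilinear-interpolation lookup; the 2-point identity LUT at the end is a stand-in:

```python
# Illustrative only -- not Colourlab Ai's code. Shows the core 3D LUT
# operation: look up an RGB triple on a lattice and trilinearly
# interpolate between the 8 surrounding lattice points.

def apply_3d_lut(rgb, lut, size):
    """Map an RGB triple (components in [0, 1]) through a size^3 LUT.

    `lut[r][g][b]` holds an output RGB triple; in-between inputs are
    trilinearly interpolated.
    """
    def lerp(a, b, t):
        return tuple(a[i] + (b[i] - a[i]) * t for i in range(3))

    coords, fracs = [], []
    for c in rgb:
        x = min(c, 1.0) * (size - 1)
        i = min(int(x), size - 2)   # clamp so i + 1 stays in range
        coords.append(i)
        fracs.append(x - i)
    ri, gi, bi = coords
    rf, gf, bf = fracs

    def node(dr, dg, db):
        return lut[ri + dr][gi + dg][bi + db]

    # Interpolate along blue, then green, then red.
    c00 = lerp(node(0, 0, 0), node(0, 0, 1), bf)
    c01 = lerp(node(0, 1, 0), node(0, 1, 1), bf)
    c10 = lerp(node(1, 0, 0), node(1, 0, 1), bf)
    c11 = lerp(node(1, 1, 0), node(1, 1, 1), bf)
    return lerp(lerp(c00, c01, gf), lerp(c10, c11, gf), rf)

# A 2-point identity LUT: output equals input.
identity = [[[(r, g, b) for b in (0.0, 1.0)] for g in (0.0, 1.0)]
            for r in (0.0, 1.0)]
```

A Smart LUT, as described, would pair a lattice like this with parametric look metadata so the same look can be re-derived and edited rather than baked in.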

Pricing and Availability
With the Version 2.0 release, Colourlab Ai is now offered at two subscription levels:

Colourlab Creator includes full Ai functionality: $129 per year.

Colourlab Pro includes Look Designer, Tangent device control, SDI video output and DaVinci Resolve integration: $39 per month, $99 per quarter or $299 per year, with a perpetual license available for $599.

Both versions come with a fully functional seven-day trial. While it is currently available only for macOS and fully optimized for Mac computers with Apple silicon, a Windows version is coming later this year.

Podcast 12.4
Post Studio

Harbor to Open New Post Studio in London

Harbor has broken ground on a new post production studio that will be spread across three floors in the Turnmill building in London’s Farringdon. The studio will offer end-to-end post — dailies, offline editorial, picture finishing, sound post, ADR and screening theaters. At the same time, Harbor will continue to offer dailies and screening services from its Windsor locations to support clients based at the studios west of London.

The original building was a warehouse that became the iconic Turnmills nightclub venue during the 1990s and early 2000s and gained a reputation as a hub for modern dance music culture. The new studio is set to open this summer.

Harbor’s UK facilities have offered dailies to productions including The Midnight Sky, Venom: Let There Be Carnage, The Great Season 2, The Northman and Invasion.

“When we started our dailies services in the UK, we already had a roadmap to reach this point, and despite the many challenges presented in the last two years, our commitment as a business to the UK creative industries has never wavered,” says Harbor commercial director James Corless. “We have always believed that Harbor could bring something unique to London’s post production offerings, and I can’t wait for the clients to see what we have planned at Turnmills.”

“We have always wanted to offer artists and filmmakers the opportunity to collaborate across disciplines under one roof – and now that roof spans geographies,” explains Harbor founder/CEO Zak Tucker. “We’ve seen it already with our current footprint — dailies in London, color finishing in North America — where we offered our clients a streamlined workflow for color grading and sound post from set to screen.”


Autodesk Acquires Moxion’s Cloud Platform for Dailies

Autodesk has acquired Moxion, the New Zealand-based developer of a cloud-based platform for digital dailies. The solution has been used on such productions as The Midnight Sky, The Marvelous Mrs. Maisel and The Matrix Resurrections. According to Autodesk, the acquisition of Moxion’s talent and technology will expand Autodesk’s own cloud platform for media and entertainment, “moving it beyond post into production, bringing new users to Autodesk while helping better integrate processes across the entire content production chain.”

Moxion’s platform enables professionals to collaborate and review camera footage on-set and remotely with the immediacy required to make creative decisions during principal photography in 4K HDR quality and with studio-grade security. Moxion ensures data security with features like MPAA compliance, multi-factor authentication, visible and invisible forensic watermarking and full digital rights management.

Founded in 2015, Moxion has been awarded with an Engineering Excellence Award from the Hollywood Professional Association (HPA), a Workflow Systems Medal from the Society of Motion Picture and Television Engineers (SMPTE) and a Lumiere Award from the Advanced Imaging Society.

“As the content demand continues to boom with pressure on creators to do more for less, this acquisition helps us facilitate broader collaboration and communication and drive greater efficiencies in the production process, saving time and money,” says Diana Colella, SVP Media and Entertainment, Autodesk. “Moxion accelerates our vision for production in the cloud, building on our recent acquisition of Tangent Labs.”

Aaron Morton, a cinematographer who has worked on projects including Orphan Black, Black Mirror, American Gods and Amazon’s new The Lord of the Rings series, used Moxion for several projects. “It’s never fun when decisions are being formed about your work if the dailies aren’t the way you wanted them to look,” explains Morton, NZCS. “With Moxion, it’s what I see on the set, and the decisions I make with the dailies colorist always play out so that production people and producers are seeing what I want them to see. The images are very true to what we see while we’re shooting.”


Picture Shop Opens Dailies Facility at UK’s Pinewood

Picture Shop has officially opened its doors on a new dailies facility at West London’s Pinewood Studios. The facility offers complete dailies services with four grading rooms equipped with Colorfront Express Dailies systems. The studio-approved lab securely processes and delivers protected media and content with high-speed connections for remote workflows. Streamland Media’s Pulse technology gives filmmakers a comprehensive cloud-based content management and collaboration system.

The new lab is being led by Picture Shop head of front-end services (UK) Chris Barrios, who previously led dailies operations for Picture Shop in Los Angeles. Barrios brings more than 20 years of post production experience to his new role, from dailies processing and color to engineering and facility management.

“We’re thrilled to bring the global reach of Picture Shop directly to filmmakers on the legendary Pinewood lot,” says Picture Shop president Cara Sheppard. “Under the leadership of Chris Barrios, having Picture Shop based at Pinewood gives filmmakers access to our A-list talent and technology with unparalleled service at a best-in-class facility.”

Picture Shop’s nearly 4,000-square-foot space at Pinewood can run up to 10 projects concurrently under studio-level security protocols. The facility will be securely connected to Picture Shop locations around the globe, offering access to the company’s talent pool.

Production, operations and support teams are based at the facility, offering 24/7 client assistance. Maintaining the focus on talent and client services, Picture Shop at Pinewood offers a roster of colorists from around the world, and support from set to screen.


Light Iron

Light Iron Ups Leadership Team: New Hires, Promotions

Light Iron, a Panavision company that provides post creative services, has bolstered its executive leadership team with recent promotions and hires.

In an evolution of her role with the company’s leadership team, Light Iron cofounder Katie Fellion has been promoted to SVP, business development and post production strategy. She has been with the company since it began operations in 2009 and was instrumental in establishing Light Iron’s Outpost mobile dailies systems. In her time with Light Iron, Fellion has also produced several firsts in file-based finishing, including the first 6K DI, the first studio feature cut on Final Cut Pro X and Amazon’s first HDR series. In her new role, she is responsible for global sales and business development, new market strategies, and strategic alignment with Panavision so clients can maximize the value proposition of the companies’ shared production-to-post offerings.

In addition, Megan Marquis has been promoted to VP of operations for Light Iron Los Angeles. She joined Light Iron in 2013 as a senior producer at the company’s New York location where she handled budgeting and oversaw projects from dailies through final delivery. After Light Iron’s acquisition by Panavision, she began to integrate resources between the two companies, assisting cinematographers in camera tests and joining conversations about production choices to smoothly translate decisions to the post side. With her latest promotion, Marquis is responsible for managing the overall operations of Light Iron’s two facilities in Los Angeles.

With more than 20 years of experience developing file-based workflows, best practices and future roadmaps for digital motion imaging, Eric Camp joins the company as director of operations for Light Iron New York. Camp brings wide-ranging expertise to the position and firsthand understanding of the client’s point of view, having worked in both production and post in a variety of creative and operational roles, including crew positions on 15 features and over 250 television episodes. With Light Iron, Camp oversees all of the New York facility’s operations, including the location’s on-site and remote offline editorial rental offerings.

Ken Lebre joins Light Iron as director of dailies, bringing experience in streamlining workflows with creative and technical talent. Through his background with boutique shops and larger post facilities — where he’s held positions including director of operations, senior director of client services and GM — he’s become an expert in dailies workflows while simultaneously interfacing with studios and production companies. At Light Iron, Lebre oversees the company’s global dailies operations, including at its regional facilities and its primary hubs in Los Angeles and New York, and for all near-set and remote deployments domestically and internationally.

In addition to Light Iron’s New York and Los Angeles facilities, each of which offers the full breadth of creative finishing services, the company also has locations in Atlanta, Albuquerque, Chicago, New Orleans, Toronto and Vancouver, offering dailies services and remote sessions. Light Iron’s unparalleled remote capabilities — including solutions for dailies, offline editorial rentals, and color and finishing — open the doors for filmmakers working anywhere in the world to partner with the company.


Marvel's WandaVision

Color and Post Pipeline: The Many Looks of Marvel’s WandaVision

By Randi Altman

If you’ve watched Marvel’s WandaVision, you can imagine that the production and post workflows were challenging, to say the least. The series, streaming on Disney+, features multiple time periods, black-and-white footage, color footage, lots of complex visual effects and the list goes on.

Colorist Matt Watson

Colorist Matt Watson and the in-house Marvel post team had their hands full and creative hats on. To help with any challenges that might come up, Watson got involved from the very beginning — during camera tests — and that, along with being on set, proved to be invaluable.

“I’ve set up several Marvel shows with a DI toolset and mindset,” reports Watson. “Having that resource on set at the start of a production really helps to find answers to color pipelines and workflow direction, and WandaVision had many challenges.”

WandaVision was a complex show to set up from both a color and pipeline perspective, so we reached out to Watson to find out how he and the Marvel team prepared and tackled the show.

Can you talk about the many post challenges on WandaVision?
Not only was this a show with looks from different eras and with multiple aspect ratios, but it was also Marvel’s first “HDR first” show — meaning HDR monitoring on set, in dailies and finishing — so it was really important to nail down a pipeline and workflow for all departments.

The Brady Bunch look

Of course, the success of this kind of undertaking comes down to communication. Camera, DIT, dailies, editorial, Marvel’s plates lab, VFX and finishing… everyone had particular wants and needs, so being there and having these conversations was incredibly important. Communication is one of the ingredients that aids the success of Marvel productions.

Post supervisor Jen Bergman was great at getting all the teams together and talking through all potential problems. Evan Jacobs, Marvel’s creative finishing supervisor, was there during the setup to offer his vast experience with Marvel productions, as was Mike Maloney, Marvel’s imaging guru. The knowledge pool available for the show setup was inspiring.

How did you work with WandaVision DP Jess Hall (ASC, BSC)?
I spent time with Jess working on dialing in the LUTs for use. We had seven different looks in total, which had to be converted to both HDR and SDR. For me, being there in person really helps streamline and evolve this initial creative starting point, as I can see exactly what Jess was lighting on set and what his intentions were.

The Brady Bunch look

There was one instance testing a Brady Bunch look — Jess was trying to light with a blue ambience to create a color contrast with a warm key, but the first iteration of this LUT did not fully realize the subtleties of the light. So thanks to being there on set and being able to look at the nuances of Jess’ lighting with my eyes, I was able to retreat to my DI room and dissect the LUT. I added more filmic crosstalk in the tone curves and color gamut with some deeper, saturated primary colors to further separate the cooler shadows and warmer highlights, something very synonymous with 35mm film color reproduction. Jess was happy with the revised LUT, and that’s something that would have been so difficult to dial in had I not been there in person.
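The crosstalk tweak Watson describes can be pictured with a simple model: bleeding a small fraction of each channel into its neighbors via a 3x3 matrix. This is a generic sketch of the idea, not the show’s actual LUT math, and the 0.06 bleed amount is an arbitrary assumption:

```python
# Illustrative sketch only -- not the WandaVision LUT. Film-style channel
# crosstalk modeled as a 3x3 mix whose rows each sum to 1, so neutral
# grays pass through unchanged while saturated colors desaturate slightly.

def crosstalk(rgb, bleed=0.06):
    """Mix each channel with its neighbors, preserving overall level."""
    r, g, b = rgb
    keep = 1.0 - 2 * bleed
    return (keep * r + bleed * g + bleed * b,
            bleed * r + keep * g + bleed * b,
            bleed * r + bleed * g + keep * b)
```

Because each row sums to one, a gray input stays gray; only color separation changes, which is roughly the behavior described for 35mm-style color reproduction.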

From my experience, dialing in the look early is so beneficial, particularly with VFX-heavy shows. VFX will live with these decisions for a long time and fine-tune their work. Any big swings later can risk breaking the compositing work.

It helps a lot to be on the ground in the early stages of production. There are lots of other things I typically get involved with. I tend to help shape the dailies pipeline, in collaboration with Marvel plates lab, and I’ll also try to establish the color grading environment so that the DP and anyone visiting the dailies room will get the best experience with our dailies colorists.

The Family Ties look

As you mentioned earlier, this show has a few different looks. Can you talk about the feel of the modern-day scenes versus the period ones?
Creatively, there’s a warmer palette inside Wanda’s world, where she tries to hold on to her happiness. This is in contrast to the cooler, more tragic reality outside in the “real world,” which is really based on the look and feel of the larger Marvel Cinematic Universe (MCU). Much of that is achieved by design in lighting, production design and costumes.

To further separate them visually, the modern-day scenes maintain a clean, natural feel, leaning into the photography. In contrast, the period looks lean into a stylization and texture, which included varying image degradation, such as grain, defocus, bloom, chroma misalignment, gate weave, etc. We reserved all this image degradation work to the DI so we could dial it in to taste. But again, this was all to complement the already amazing work done in camera. The lighting, lens choice and production design were all so well-researched and recreated on set that we had to make sure we were enhancing what was there in the “negative.”

Marvel's WandaVision

The classic MCU look

HDR also played a part in the design of these different worlds. HDR is this huge sandbox of brightness and color gamut, so we spent time trying to figure out how much of this HDR canvas we wanted to use for these period looks. To maintain the photographic quality of a 1950s print or an NTSC telecine video transfer from the 1980s, we had to limit the available dynamic range HDR can allow. But for the modern-day MCU material, we could lift the lid and expand on what HDR can offer.

In the end, our period looks were limited to between 150 and 300 nits of peak brightness, and for the MCU look, we settled on 600 nits. Jess felt that 600 nits represented a highlight level that still felt filmic but really showcased the photography. Really, all this exploration was vital so that Jess could light successfully in HDR and have complete control and representation of what the final HDR image would look like.
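One simple way to picture the peak-brightness limiting described above is a knee that passes luminance through untouched below a threshold and then rolls it off toward the chosen ceiling. This is a hedged sketch; real pipelines implement this inside the color transform, and the knee shape here is an assumption:

```python
# Illustrative only -- not Marvel's transform. Roll scene luminance off
# toward a chosen peak (e.g. 600 nits for the MCU look, 150-300 for the
# period looks) instead of hard-clipping highlights.

def limit_peak(nits, peak=600.0, knee=0.75):
    """Pass luminance through below knee*peak, then roll off
    asymptotically toward `peak` above it (continuous in value
    and slope at the knee)."""
    start = knee * peak
    if nits <= start:
        return nits
    overshoot = nits - start
    headroom = peak - start
    # Rational rolloff: approaches `peak` as input grows without bound.
    return start + headroom * overshoot / (overshoot + headroom)
```

Lowering `peak` for the period looks would compress highlights the way a 1950s print or NTSC telecine transfer does, while 600 nits leaves far more of the HDR canvas available.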

Can you talk about working with the period looks, which resemble classic TV sitcoms?
They are all based on classic TV shows that had been a part of Wanda’s life, shows like The Dick Van Dyke Show, Bewitched, The Brady Bunch, Family Ties, Malcolm in the Middle and Modern Family. But visually for us, they were just guides; we wanted to lift elements of each to help tell Wanda’s story. For some episodes, we leaned more into the degraded quality of the original, but less for others.

The transition to color is complete

For instance, Episode 1 has quite a heavy degradation treatment to really set the audience up in this alternate TV fantasy world. But we also didn’t want the audience to become fatigued from heavy, soft, grainy images, so Episode 3 really pushes the color separation of shows like The Brady Bunch, but with minimal textural degradation. A lot of these decisions were made in DI, when we were able to see multiple episodes back to back and get a sense of how the flow of the series worked.

Most of these decisions were made collaboratively by Jess Hall, Tara DeMarco and Evan Jacobs and presented to our director, Matt Shakman, before finally being presented to the studio. This way, everyone had a chance to offer their opinion and have a balanced, thoroughly considered image.

You were essentially dealing with two different shows in one.
It was more like 10 shows in one! We not only juggled the HDR and SDR element of each look, but some episodes had multiple looks, with transitions from one to another. To make this work, we used the ACES framework to manage our technical transforms (to HDR and SDR, for instance) and converted all our creative looks to LMTs. This avoided locking us under a single LUT for an episode and gave us the most flexibility.

Marvel's WandaVision

The Dick Van Dyke Show look

But even with this framework, we still had so many complex shots. Two that spring to mind are the closing shots of Episodes 1 and 3. In Episode 1, we have Wanda and Vision sitting down in their 1950s look that pulls back to reveal our modern day. In Episode 3, we have that classic black-and-white-reveal-into-color shot. Both were fantastic shots, but for them to work, we had pretty complex node structures in our coloring software, Blackmagic DaVinci Resolve. We used mattes holding out one LMT from one part of the shot, while another LMT was activated using another matte, while also carrying degradation in part of the shot. The resulting node graph in our software looked horrifying. But the implementation worked so well and gave us full flexibility.
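The matte-driven blending described can be sketched per pixel as a weighted mix of two look transforms. The “looks” below are hypothetical stand-ins, not ACES LMTs, and the warm/neutral functions are invented for illustration:

```python
# Illustrative sketch of matte-blended looks: two transforms applied to
# the same pixel, mixed by a per-pixel matte value, as in the Episode 1
# and 3 transition shots (these stand-in looks are not the show's LMTs).

def blend_looks(pixel, matte, lmt_a, lmt_b):
    """matte = 1.0 -> fully look A; matte = 0.0 -> fully look B."""
    a = lmt_a(pixel)
    b = lmt_b(pixel)
    return tuple(matte * x + (1.0 - matte) * y for x, y in zip(a, b))

def warm(p):
    """Hypothetical warm, 1950s-ish look: lift red, pull down blue."""
    return (min(p[0] * 1.1, 1.0), p[1], p[2] * 0.9)

def neutral(p):
    """Hypothetical pass-through look."""
    return p
```

In a real node graph, the matte would itself be an image, so the blend weight varies across the frame, which is what lets one part of a shot live in a period look while another reveals the modern-day grade.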

What was WandaVision shot on? Can you talk more about how much of the look was established on set/in dailies?
WandaVision was shot almost entirely on an ARRI Alexa LF. Being on set during the beginning of the shoot gave me the opportunity to work with the DIT, Kyle Spicer, and our dailies colorist, Cory Pennington. We all worked together to calibrate what tools and controls to use at the front end so that I could take that work and continue in DI.

Marvel's WandaVision

The Dick Van Dyke Show look

We tried to limit the CDL controls to offset only, and slope when needed, as it’s pretty much the same as printer points in my finishing world. I find that if a CDL is aggressively handled, it’s hard to integrate into a structured color pipeline in the finishing suite, so I spend time remaking it. Those guys did a great job. We communicated regularly, and in the end, they laid fantastic groundwork, which meant we could focus on more of the details in the DI.
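The offset and slope controls Watson mentions are two of the three primary ASC CDL parameters. The standard CDL transfer function is, per channel, out = (in × slope + offset)^power, optionally followed by a saturation blend against Rec. 709 luma. A minimal version:

```python
# The ASC CDL transfer function: per-channel slope/offset/power, then an
# overall saturation mix against Rec. 709 luma. Offset behaves like
# printer points; slope behaves like gain.

def apply_cdl(rgb, slope=(1, 1, 1), offset=(0, 0, 0), power=(1, 1, 1),
              sat=1.0):
    graded = [max(c * s + o, 0.0) ** p
              for c, s, o, p in zip(rgb, slope, offset, power)]
    luma = 0.2126 * graded[0] + 0.7152 * graded[1] + 0.0722 * graded[2]
    return tuple(luma + sat * (c - luma) for c in graded)
```

Restricting on-set grades to offset (and occasionally slope), as described, keeps the CDL close to a printer-lights adjustment, which is why it folds cleanly back into a structured finishing pipeline.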

What was your reference for the B&W segments? I assume it was all shot in color and made B&W in your suite.
Yes, it was shot all in color — it had to be because there were a lot of visual effects that had to take advantage of a color negative for color keys, etc. This is something a black-and-white negative would have turned into a massive rotoscoping exercise.

In Episode 1, the reference for the black-and-white look was The Dick Van Dyke Show. For Episode 2, it was Bewitched. Having the color negative was vital to the look. The production design used the same period colors that were used on The Dick Van Dyke Show. From there, we were able to build two desaturation matrices that mixed the color channels of the negative to the finished monochrome image.
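A desaturation matrix of this kind can be sketched as a weighted mix of the three color channels down to a single gray value. The weights below are the standard Rec. 709 luma coefficients, used as stand-ins for the show’s actual (undisclosed) matrices:

```python
# Illustrative channel-mix monochrome conversion -- not WandaVision's
# matrices. Changing the weights changes how each on-set color renders
# in black and white (e.g. weighting red higher lightens skin tones),
# which is why shooting a color negative mattered.

def channel_mix_mono(rgb, weights=(0.2126, 0.7152, 0.0722)):
    """Collapse an RGB triple to one gray value, returned as R=G=B."""
    y = sum(c * w for c, w in zip(rgb, weights))
    return (y, y, y)
```

With a color negative in hand, building two different matrices (one per black-and-white era) is just two different weight sets, something a monochrome negative could never offer.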

The Bewitched look

From there, we emulated a print process that included a film tone curve, a warmer D55 white point, film print defocus and film grain. We then further added some telecine/analog video transfer degradation. We went pretty heavy with Episode 1 and treaded much more lightly on Episode 2 to separate the two black-and-white eras. That episode had a cooler D60 white point and a grain and defocus more akin to an interpositive film print.

How does being in-house at Marvel help in the overall process?
It offers so many advantages to the finishing process. I think the greatest advantage is that we’re working on the show exclusively, so we can dedicate so much time to really developing the look and color grading, whether it’s at the start during photography, during the edit to explore how the grade can help the storytelling, in conjunction with VFX to find the best solutions to the numerous challenges or, obviously, finishing the show at the end. We can really help the whole process alongside the production, rather than jamming everything in at the end.

Marvel has also developed an incredible internal tool, appropriately called JARVIS, that connects the VFX, editorial and finishing databases. It can perform some incredibly advanced data wrangling and heavy lifting that would ordinarily take a lot of time and manual work to complete.

Marvel's WandaVision

Bridging two worlds

While that alone sounds cool (at least to me it does), the real benefit is the speed with which we can create and update cuts and sequences. We can have VFX send a list of shots they want to see and seconds later have a timeline built, complete with CDLs, previous VFX versions, underlying main plates, etc. We can go from shot requests to DI in minutes; it’s amazing.

You’ve touched on the VFX a bit, but can you discuss how you worked with that team?
The collaboration between departments on this show was so amazing. Our lead VFX supervisor, Tara DeMarco, had an insane amount of work, and we really wanted to help out where we could. We ended up holding regular grading sessions with the VFX team. They could come to us at a moment’s notice to look at shots in DI and to establish what they needed to work on and what could be left to DI.

Marvel's WandaVision

Vision VFX

A great example was the balancing of Vision’s skin color throughout the series. Traditionally, this would have fallen to VFX to meticulously go through and balance the color of Vision’s skin while creating the CGI head — a real challenge when multiple VFX vendors were working on Vision. So instead, we had VFX generate mattes for Vision’s head, crown and infinity stone, and we took on that responsibility in DI. Being able to relieve VFX of that task so they could focus on other, cooler stuff only helped to benefit the show.

It’s sometimes difficult for VFX supervisors and their vendors to interpret and turn around specific color notes. So we would often create “sketches” for the VFX supervisors, a DI grade where they can play with color live to sketch in the direction they want their vendor to go. This was particularly true for the hex wall. We would often review VFX in the DI so we could more fully explore what that story point might look like. We’ve found it has been very useful for VFX to have their own color previz ability, and it saved valuable time when communicating their intent to the multiple VFX vendors they work with.

Vision VFX

In addition to the traditional challenges on a show of this scope, you worked during COVID as well, correct?
The finish took place during the pandemic, so the entire DI was completed remotely. Marvel sent out calibrated LG C9s to Matt Shakman, Jess Hall and Tara DeMarco, and I have one here accompanying my Sony BVM-HX310. This gave me and the creative teams the flexibility to start or join a DI at any point and the confidence that we were seeing comparable images.

For the bulk of the grade, I would work here in Disney Studios in Burbank, and Jess would join me remotely when he was available. We could have a 5-minute DI or a 5-hour DI, whatever was required at the time. The flexibility was fantastic.

We also ran sessions like this for editorial and VFX. Out of the COVID restrictions have come this great ability to be flexible, which has turned into a fantastic, valuable tool.

What If …?

What have you and the Marvel team been working on recently?
I’ve since finished Marvel’s Loki series. Travis Flynn, our other colorist, finished The Falcon and the Winter Soldier, and we’ve also collectively finished the What If …? animation series.

There’s a lot more coming down the pipe. We’re still a small department, but because of all the efficiencies that have been made internally, we’re able to achieve so much here. Since WandaVision, we can now run live HDR sessions between the dailies hub in Atlanta and our suite here on the Disney lot in Burbank. With a few button clicks, I can be looking and advising our dailies colorists in Atlanta, or I can be running sessions from here for filmmakers in Atlanta. The technical tools that are being built and deployed here create more time and paths to be creative. It’s a really exciting future.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

Assimilate DIT Pack+ Includes Hedge Backup Software

Assimilate’s new DIT Pack+ bundle is now available. It includes Assimilate Live Looks for live grading, Hedge for camera-media offload and backup, and Assimilate’s Scratch for dailies transcoding and reporting. All run on macOS and Windows.

The workflow starts with Live Looks for live-grading content from any number of cameras in real time, with the ability to add advanced effects such as greenscreen background replacement and texture effects, which can be carried over to dailies transcoding later in Scratch. Each time a grade is saved, all the metadata — either input by the user or delivered via the live SDI signal from the cameras — is saved as a readable text document and as an XML suited for further pipeline scripting in VFX and post. All grades and metadata are stored in an easy-to-navigate folder structure that can be delivered to VFX and post.

After a secure backup of camera media with Hedge, the media automatically shows up in Scratch, including checksum data from the offload. The user can then proceed with automated look matching, as Scratch has full access to the look library and all metadata created in Live Looks during live grading. Audio sync and QC can be done in a fast-to-operate UI, with extensive PDF reports and metadata-rich dailies exports at the end of the chain.
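The checksum step mentioned here can be illustrated with a minimal offload sketch: hash the source, copy it, hash the copy and compare. This is a generic example, not Hedge’s implementation; it uses MD5 for simplicity, while production offload tools often prefer faster hashes such as xxHash, and the file paths in any real use would come from the camera media:

```python
# Illustrative checksummed offload -- not Hedge's code. Hash before and
# after the copy so downstream tools (like Scratch) can verify the media
# was transferred bit-for-bit.
import hashlib
import shutil

def checksummed_copy(src, dst, chunk=1 << 20):
    """Copy src to dst; return the hex digest both sides must share."""
    def digest(path):
        h = hashlib.md5()
        with open(path, "rb") as f:
            # Read in 1 MB chunks so large camera files don't fill RAM.
            for block in iter(lambda: f.read(chunk), b""):
                h.update(block)
        return h.hexdigest()

    before = digest(src)
    shutil.copyfile(src, dst)
    after = digest(dst)
    if before != after:
        raise IOError(f"Checksum mismatch for {dst}")
    return after
```

Passing the returned digest along with the media is what lets a dailies tool re-verify the offload later without re-reading the original card.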

“When the Assimilate team reached out to us about an integration, it was a no-brainer,” explains Paul Matthijs Lombert, CEO at Hedge. “With Hedge as the starting point for many productions, it now elegantly hands off the media and metadata to dailies transcoding, greatly simplifying the process for those that use the tools day to day.”

Available immediately, DIT Pack+ starts at $149 monthly and $899 annually, with a permanent license priced at $1,499.


Director Ron Howard Talks Hillbilly Elegy and Remote Post

By Iain Blair

Oscar-winner Ron Howard is one of Hollywood’s most beloved and versatile directors and producers. Since his directorial debut in 1977 with Grand Theft Auto, he’s made an eclectic group of films about boxers, astronauts, mermaids, politicians, mathematicians and more, so it was probably only a matter of time before he directed his attention to hillbillies.

Based on the J.D. Vance memoir of the same name, Howard’s latest film, Hillbilly Elegy, is a modern exploration of the American dream — and the nightmare of poverty and opioid addiction. A former Marine and current Yale Law student, J.D. (Gabriel Basso) is on the verge of landing his dream job when a family crisis forces him to return home to his family in Appalachia. He has to contend with his volatile mother, Bev (Amy Adams), who’s struggling with addiction, while being fueled by the memories of his grandmother Mamaw (Glenn Close), the whip-smart woman who raised him.

Howard, who assembled a crew that included DP Maryse Alberti; editor James Wilcox, ACE; and composers Hans Zimmer and David Fleming, recently talked with me about directing the Netflix film and his love of post.

What was the appeal of doing this, and what sort of film did you set out to make?
I was drawn to the characters and region because in many ways, my own family heritage aligned with J.D.’s — small town, rural Oklahoma for me — and I’d been looking for a contemporary family story for a long time, one based in these areas with that particular cultural sensibility. I found much of that in J.D.’s book, and as we talked more and I met his family, that promise deepened.

“A” camera/Steadicam operator Christopher TJ McGuire and DP Maryse Alberti

How tough was the shoot?
It was very hot and very fast. We shot a lot on location in Georgia and Ohio, then shot interiors at Screen Gems in Atlanta.

Can you talk about the look you and DP Maryse Alberti went for?
We wanted a very naturalistic look with a vibrant palette and an unselfconscious approach. Maryse is very fast. We did a lot of setups every day and covered a lot of emotional territory, and I think her documentary background was important.

James Wilcox cut this. How did you work together, and what were the main editing challenges?
James, who’s done a lot of TV, cut Genius for me. My longtime editors Dan Hanley and Mike Hill had decided to retire, so James came back for this, and he’s a great fit. He’s also going to cut my next film.

He was based in New York, and we cut traditionally. He didn’t come to Georgia on location, but we stayed in touch and he got dailies and sent me scenes as he assembled them. He’s very strong with performance and very cinematic and creative. In fact, it was his idea to use the preacher at the beginning. It wasn’t scripted. It was something he found and tried, and we all loved it.

The idea of using family photos at the start was something he also just ran with, and like everyone else on the project, he forged this personal relationship with the material and the challenges of family life, which made me realize just how universal this story is.

I know you like to screen a lot during editing. I assume you did that with this film as well?
A lot — and lots of small screenings, just 20, 30 people — so we could really drill down and find out what was connecting and what wasn’t. We listened to audience members talk about it, and the Netflix team was so supportive when I wanted to go back and shoot a couple of new scenes and refine some others.

We shot another four or five days in Atlanta, and when we wrapped in mid-March, the whole film industry basically shut down due to COVID. And by some miracle, no one on the set got it.

Post must have been a big challenge?
Yeah, as it was all remote. I really love post, but I’ve never done a post like this before, and I wasn’t there in person as usual for any of the ADR, and that was a bit unsettling.

But luckily the movie was in good shape before we began post, even with that new footage we had to deal with. We did it all in New York. James and his assistants all worked out of home. Harbor Post did the editorial and mix, and sound designer Grant Elder and supervising sound editors Bob Hein and Josh Berger did a lot of very detailed work on the sound.

Company 3 did all the dailies for us and the DI – all remote. Tim Stipan was the colorist, and the sessions were kind of weird, with everyone masked and staying 15 feet apart — same with the mix. We also did one or two big test screenings with a regular audience, and I did another focus group, but fewer than normal, and COVID made it all very difficult.

This is your ninth collaboration with composer Hans Zimmer, who was joined by co-composer David Fleming. Can you talk about the importance of the music?
It’s always such a crucial part of post, and I really wanted a score that took the audience on a journey and that was really about these characters and not one with the obvious regional sounds you might expect from the title.

I didn’t want it to sound anything like Deliverance, for instance, and they did an amazing job, although Hans got COVID and we had to do it all remotely. And appropriately we mixed at Hans’ Remote Control in Santa Monica.

There are quite a few VFX by Crafty Apes. What did that entail?
The big one was the factory where we shot because it wasn’t a steel mill, so we used it as the bones, and then we digitally designed the whole steel mill. We also extended the town a bit, and that helped with the different eras. And we didn’t use real fire with the kids in that frightening flashback and, of course, we had a fair amount of cleanup and period fixes, and Crafty Apes did a great job.

You’ve made several films based on true life stories, so you know translating any book to a visual medium is always tricky. How challenging was it?
You’re right, it’s always a challenge, and when you condense a book like this, there’s always some creative license and changes you have to make to tell the story in a compelling way. But my goal’s always to portray what I saw as the truth of the story, and as a director, you have to have control of all that.

I discussed all that with J.D., his mother and sister and other family members, and their greatest concern was that we got Mamaw right, with all her inconsistencies, warts and all. Her memory was very meaningful to their lives, and they were blown away when they saw what Glenn Close was doing with the character.

Ron Howard and writer Iain Blair

All Glenn had to work with was the conversations she had with them and a few home movies, most of which we used in the title sequence, along with photos, some audio tapes and so on. But the family was so appreciative and moved — and emotionally shattered by Glenn’s performance. It was almost like Mamaw was back for a minute.

There’s been quite a lot of criticism about the film’s approach to some of the controversial and more political aspects of the issues. How do you respond?
Well, even the book’s title was immediately controversial and thought-provoking due to another aspect of the memoir, which is more geopolitical — and that’s not what I felt a film adaptation could really service and address.

What really interested me was focusing on the family and staying true and authentic about this family, their world and their journey, and then allowing that to be a window into some of these issues, instead of dealing with them in a more academic way.

What’s next?
I’m doing Thirteen Lives, about the famous Thai cave rescue of the soccer team kids. We’ll shoot in Australia and maybe in Thailand, and we plan to start at the end of March, assuming the pandemic’s more under control by then.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and The Boston Globe.

Assimilate Scratch 9.3: Remote Workflows, Calibration, More

Assimilate has announced Scratch 9.3 and Play Pro 9.3, targeting production and post workflows. Both new versions integrate NewTek NDI, Light Illusion ColourSpace CMS and Time in Pixels Nobe OmniScope to provide more efficiency in the camera-to-post pipeline while working in the “new normal” of distributed/remote workflows.

Working Remotely
For remote workflows, Assimilate has embraced support for NewTek’s NDI, which lets users connect Scratch directly over the network to apps such as SetStream.io, OBS, Skype and Zoom, with no extra cables or capture hardware required.

“Scratch 9.3’s native NDI integration adds speed and flexibility to my streaming sessions without the need to purchase I/O hardware or LUT boxes,” says colorist Jeff Sousa from NYC’s Dungeon Beach. “It allows Scratch to uplink directly to SetStream.io via the WebRTC protocol, giving me latency-free interactions with my clients and great image quality. And since NDI is ‘just another output’ to Scratch, I can control exactly what the client sees, toggling masks on and off, and even apply a stream-only display LUT to calibrate their viewing device remotely. When streaming via NDI, I can also round-trip to After Effects for rotoscoping or Avid Media Composer for conform and the client won’t lose signal on their program.”

Calibration
Light Illusion’s ColourSpace patch generator is now also integrated natively in Scratch and Play Pro 9.3 to provide 100% accuracy in 32-bit patch generation for monitor profiling and a consistent color pipeline from camera to post.

“The ability to directly integrate ColourSpace in Scratch enables direct profiling and therefore accurate calibration of any attached display,” reports Steve Shaw from Light Illusion.

Scopes
Time in Pixels’ new scoping software, Nobe OmniScope, is now integrated into the Scratch tools so that both can run in parallel on the same workstation without the need for an external image-analyzer device or additional video I/O hardware.

“Because of [Scratch’s] well-thought-out interface for third-party solutions, we were also able to easily add support for Assimilate’s other tools, such as their live-grading software Live Looks and their reference player Play Pro, which is used widely in video QC,” says Tom Huczek, founder and lead developer at Time in Pixels.

Version 9.3
Scratch now includes a new CIE plot diagram in its internal scopes and has improved ProRes Raw decoding capabilities. DITs can precisely define how Raw footage should be processed in Scratch by setting Raw media defaults upfront in the project settings.

Additionally, Assimilate has rebranded its online review tool Scratch Web to Dailies Online and has doubled the available cloud storage for all users at no additional charge. By using Scratch’s recently introduced on-screen annotation features, DITs and finishing houses can share dailies and increase their productivity in VFX review sessions.

Especially helpful in VFX workflows is Scratch’s ability to read and forward per-frame (lens) metadata, with support recently extended beyond ARRI Raw and ProRes MXF to cover Red Raw as well. Along with native support for OpenColorIO, this makes for an ideal VFX dailies-and-review package.

Creative Reboot
Following on its Creative Boost program, which provided free access to all Assimilate products for the past six months, Assimilate has announced Creative Reboot, which provides discounts of up to 50% on Assimilate products. The new prices are effective immediately and through December 31 from the Assimilate web store.

With Creative Reboot, Scratch 9.3 starts at $59 monthly and $495 annually. Play Pro 9.3 starts at $10 monthly and $99 annually.

Moxion Offers Dolby Vision for Remote HDR Reviews

Moxion’s Immediates instant-dailies platform now integrates Dolby Vision, Dolby’s advanced imaging technology that combines high dynamic range (HDR) with wide color gamut (WCG) capabilities.

With the ability to monitor in Dolby Vision for remote cuts, color, dailies and VFX review, creatives can make decisions based on the image the director, colorist and cinematographer envisioned. Dolby Vision support will be available to users with compatible devices through the Moxion app on iOS, iPadOS and tvOS.

Increasingly, productions are shooting and monitoring on set in HDR, which is fueled by demand from leading content providers and streaming services. With many films and TV shows being shot simultaneously across multiple units and countries, being able to monitor in Dolby Vision enables robust feedback and certainty of creative intent across all units.

“Your Dolby Vision assets can be ingested, transcoded and played back with the secure, high-speed Moxion ecosystem,” says Hugh Calveley, CEO of Moxion. “Dolby Vision, plus Moxion’s ability to make frame-accurate annotation and comments, gives certainty to the colorist, editor and VFX team that feedback is against their original creation.”

Moxion’s integration of Dolby Vision — which offers lifelike picture quality with highlights up to 40 times brighter and blacks 10 times darker than a standard picture — will provide the production, post and visual effects communities with unmatched control over the image. Unlike HDR10, which applies static metadata to an entire program, Dolby Vision allows DPs and colorists to adjust details for individual scenes, down to the exact frame.
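That extra headroom comes from the SMPTE ST 2084 perceptual quantizer (PQ) transfer function used by Dolby Vision, which maps a 0–1 code value to absolute luminance up to 10,000 nits, far beyond the roughly 100 nits of a standard-dynamic-range display. As a rough illustration, here is the PQ EOTF with the constants from the standard:

```python
def pq_eotf(signal):
    """SMPTE ST 2084 (PQ) EOTF: map a nonlinear code value in [0, 1]
    to absolute display luminance in cd/m^2 (nits)."""
    m1 = 2610 / 16384       # 0.1593017578125
    m2 = 2523 / 4096 * 128  # 78.84375
    c1 = 3424 / 4096        # 0.8359375
    c2 = 2413 / 4096 * 32   # 18.8515625
    c3 = 2392 / 4096 * 32   # 18.6875
    e = signal ** (1 / m2)
    y = max(e - c1, 0.0) / (c2 - c3 * e)
    return 10000.0 * y ** (1 / m1)

print(pq_eotf(1.0))  # full code value maps to 10000.0 nits
```

A calibrated HDR review pipeline like Moxion’s matters precisely because every code value corresponds to an absolute brightness: if two viewers’ displays interpret the signal differently, they are no longer judging the same image.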

The need for such a solution is even more urgent during production in the current pandemic. Reduced numbers on set and remote distributed workflows are the new normal, and Moxion’s support for Dolby Vision helps ensure that all participants are seeing identical images.

Calveley adds, “COVID has forced the industry into innovating ways to sign off on cuts, VFX and color grades in scenarios where you can’t bring people into the same room. Dolby Vision guarantees that everyone is making creative decisions based on the same image.”