By Randi Altman
Color workflows continue to evolve, starting on-set and migrating to the grading suite. Other trends include AI being used for repetitive tasks and migration to the cloud. For this virtual roundtable, we reached out to colorists from studios such as Warner Bros., Marvel and more to find out what their color pipelines are like, how they prefer to work, and what films and shows inspire them. We also spoke to those who make the tools those pros use, asking about their offerings and what’s on the horizon.
Marvel Studios’ Senior Finishing Colorist Matt Watson
Marvel Finishing supports all of Marvel’s streaming content for Disney+. “We design looks and color pipelines for on-set, provide color support for editorial, VFX and marketing throughout post before producing the final grade for Disney+.” The team’s most recent projects include Werewolf by Night, The Guardians of the Galaxy Holiday Special and Secret Invasion, and the company is currently working on Season 2 of Loki and What If…?
What does your setup look like toolswise?
We use Blackmagic DaVinci Resolve for our central hub of finishing tools as well as Boris FX’s Sapphire and Neat Video. We have also developed our own proprietary tool called Jarvis that can take control of Resolve to perform many editorial tasks, CDL management and media management and can pull on the vast amount of metadata that exists within Marvel to streamline our workflow.
Do you work in the cloud? If so, can you describe that workflow and the benefits?
Marvel maintains its own private cloud infrastructure and leverages public cloud as needed. Specifically for Marvel Finishing, we don’t rely on cloud for most of our workflows, but it’s something we’re keeping an eye on for the future.
Do you often create LUTs for a project? How does that help?
Yes, we create show LUTs for every project. To be specific, we design our looks within the ACES framework as LMTs, which represent the visual characteristics that the filmmakers want — contrast, tonality, gamut mapping, etc. — while CDLs are used and tracked throughout the project for per-shot color choices.
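For reference, the per-shot CDL values Watson mentions come down to a small, standardized bit of math. Below is a minimal Python sketch of the ASC CDL transform (slope, offset, power, then saturation weighted by Rec. 709 luma); the example values are hypothetical, and this illustrates the published standard, not Marvel’s pipeline code.

```python
import numpy as np

REC709_LUMA = np.array([0.2126, 0.7152, 0.0722])

def apply_cdl(rgb, slope, offset, power, saturation=1.0):
    """Apply an ASC CDL to an RGB array (values nominally 0-1)."""
    out = rgb * np.asarray(slope) + np.asarray(offset)
    out = np.clip(out, 0.0, None) ** np.asarray(power)        # clamp negatives before power
    luma = (out * REC709_LUMA).sum(axis=-1, keepdims=True)    # Rec. 709-weighted luma
    return luma + saturation * (out - luma)                   # saturation around that luma

# Hypothetical per-shot values, as they might be tracked alongside an EDL
mid_gray = np.array([0.18, 0.18, 0.18])
print(apply_cdl(mid_gray, slope=(1.05, 1.0, 0.97),
                offset=(-0.01, 0.0, 0.01), power=(1.0, 1.0, 1.1),
                saturation=0.9))
```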
We then use and modify that same ACES framework to manage our HDR and SDR transforms. All of Marvel’s streaming productions have had both HDR and SDR monitoring on-set. We send out multiple HDR LUTs that restrict or allow more information so that cinematographers can make informed HDR lighting decisions on-set. We also see large volumes of VFX in our shows, so our LUTs ensure a predictability to the image rendering so that VFX artists are not fighting a heavy or destructive color swing in the LUT.
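The idea of HDR monitoring LUTs that “restrict or allow more information” can be pictured as limiting scene luminance to a chosen peak before PQ (SMPTE ST 2084) encoding. The sketch below uses the published PQ constants; the 1,000-nit clamp is an arbitrary example, not one of Marvel’s actual LUTs.

```python
import numpy as np

# SMPTE ST 2084 (PQ) inverse-EOTF constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def nits_to_pq(nits):
    """Encode absolute luminance (cd/m2) to a 0-1 PQ code value."""
    y = np.clip(nits, 0.0, 10000.0) / 10000.0
    return ((C1 + C2 * y**M1) / (1.0 + C3 * y**M1)) ** M2

def restrictive_monitoring(nits, peak_nits=1000.0):
    """Hard-limit scene luminance at the display peak before PQ encoding --
    the 'restrict information' end of an HDR monitoring LUT, illustratively."""
    return nits_to_pq(np.minimum(nits, peak_nits))

print(restrictive_monitoring(np.array([100.0, 1000.0, 4000.0])))
```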
Do you use AI as part of your daily job? In what way?
While there’s some interesting work happening with AI-assisted color grading, it wouldn’t be a good fit in our ecosystem. Resolve has some interesting tools that have a machine learning component to them, but Marvel Finishing is not using AI in the way people are talking about it today.
Our software development team has done an awesome job with our proprietary Jarvis tool, with the mission of making our workflow as efficient as possible. We’re really working to empower our colorists by putting relevant information at their fingertips. Our focus is on quality, and I have not yet seen any AI tools that have helped improve the quality of work. Speed maybe, but not quality.
If you are given an example from another show or piece of art, what is the best way to emulate that?
First is to understand exactly what individual elements they want to recreate. Is it just the tonality, color rendering or texture? Or is it a faithful emulation of a particular show? We can only emulate the rendering of the captured image, so any true emulation requires both authentic lighting and set design in conjunction with our emulations.
Discussing all this prior to shooting is probably the most important factor. Both WandaVision and Werewolf by Night were great examples where production used authentic lighting styles and production colors in conjunction with our looks to create authentic emulations of historical genres. For Werewolf, we filmed out various tests to print stock, which were scanned and digitally emulated. The final show is a mix of film-out and digital emulation, although it’s almost impossible to spot which is which. Similarly, on She-Hulk, we had a dream sequence based on recreating the Hulk television show opening from the ‘70s. That was another case where our target was really specific and followed the same principles.
Do your workflows include remote monitoring?
We often have large audiences for our color sessions, so it’s inevitable that some people will be remote. We use ClearView Glow, which allows us to send an HDR feed to filmmakers all over the world. We recommend people watch using the latest iPad Pro, which is a very good match for our reference displays. However, I will always prefer to have people in the room. So much of our job is reading the room; it’s much harder when that room is a small box on a screen.
What film or show or spot resonates with you from a color perspective?
I recently rewatched one of my all-time favorite films, Lawrence of Arabia, on a 70mm film print. The print has such incredibly vivid colors, and it just looks beautiful but is never overpowering. It’s a reminder that color should be harmonious to the underlying images and story but should never draw attention to itself.
Colorfront’s Brandon Heaslip
Colorfront is a software development company in the field of high performance digital processing technologies for the motion picture industry.
What tools do you offer for the color grading pipeline?
While Colorfront doesn’t currently offer a stand-alone color-grading product, we do offer a suite of tools to augment the broader color-grading process from capture to delivery. For example, universal show “looks” can be designed to apply across different cameras and delivered to multiple display specifications from SDR to HDR with perceptually matching results.
Our color tools are built on the foundation of the Colorfront Engine’s Perceptual Processing Technology, which includes robust color management applications and features. The Colorfront Engine (CFE) supports Colorfront’s efficient single-master workflow concept — one grade to multiple deliverables. In default mode, the system wakes up with a beautiful image, allowing colorists to spend their valuable time focusing on the image aesthetic.
On-Set Dailies and Transkoder offer specific grading tools, such as basic printer light adjustments, advanced curve correction, look blending and skin tone correction, as well as SDR-to-HDR, HDR-to-SDR and industry color space conversions. Colorfront also provides full support for Dolby Vision analysis, metadata editing, validation and XML creation.
How does your solution embrace ACES workflows?
Colorfront’s products are designed to seamlessly integrate with and enhance ACES workflows, offering robust color management tools, accurate color transformations and comprehensive support for ACES color space throughout the production and post processes.
The full resources of the Colorfront CFE are available as an ACES Look Modification Transform (LMT), allowing full integration of Colorfront’s color management tools. We continue to work closely with the Academy of Motion Picture Arts and Sciences to ensure our products support the latest ACES developments, such as ACES Metadata File (AMF) and the Common LUT Format.
What trends are you seeing, and what does the future hold in relation to color?
Continued advancements in AI will likely augment the color-grading process into the future, saving colorists valuable time. However, it’s important to note that AI is not a replacement for skilled human colorists, whose expertise and creative decision-making are essential in establishing the artistic vision.
How does your solution use AI? Should people be worried about AI or embrace it to help with repetitive tasks?
Our current focus with AI is to explore how it might be integrated in our products to create more efficient workflows. We’re always looking for ways we can help our customers save time and streamline their processes. So if AI can help our products and our customers, we will look to embrace it.
Are your users working remotely?
In recent years, our customers have increasingly adopted a hybrid work mode, whereby they transition between remote locations such as home offices, local or distant offices, private data centers and cloud infrastructures. This shift has not only allowed them to access the finest available tools but has also expanded their reach to talent worldwide. As a result, streaming technology has become an integral part of their workflows.
At Colorfront, we recognize the growing importance of streaming in our customers’ operations. To address this need, we have built native streaming capabilities into all of our products. This integration empowers our users to deliver reference-quality images to anyone anywhere in the world, ensuring the highest level of quality and fidelity. Whether it’s reviewing dailies, conducting quality control assessments or enabling remote collaboration, our native streaming solution ensures that the visual content is transmitted seamlessly, maintaining its pristine quality throughout the production pipeline.
How are you addressing workflows moving into the cloud?
For nearly a decade, Colorfront has been at the forefront of cloud adoption, and we have successfully developed our entire product suite, including Transkoder, On-Set Dailies, Express Dailies, QC Player and Streaming Server, to operate natively within AWS. Our commitment to using the power of the cloud has brought numerous advantages to our customers.
One notable benefit of using our products in the cloud is our ability to seamlessly read all major camera formats, image sequences and IMF/DCP files directly from S3. This capability empowers our customers to work swiftly and cost-effectively, streamlining their workflows. By eliminating the need for time-consuming data transfers and complex file conversions, we enable our users to focus on their core tasks and meet their deadlines efficiently.
With our cloud-native approach, users can leverage the scalability, reliability and robust infrastructure provided by AWS. This ensures that their projects can handle any scale or complexity with ease. Whether it’s managing large volumes of data or accommodating sudden spikes in demand, our solutions are designed to adapt and deliver optimal performance.
Warner Bros. Post Creative Services Colorist John Daro
Warner Bros. Post Production Creative Services is a post house on the Warner Bros. lot in Burbank. “We specialize in feature films and high-end episodic projects, with picture and sound finishing under one roof. We also have editorial space and visual effects offices just one building over, so we truly are a one-stop shop for post.”
What does your setup look like toolswise?
I have been a devotee of FilmLight’s Baselight for the past five years. It is the beating heart of my DI theater, where I project images via a 4K Christie projector and monitor them on two Sony X300s. For that “at-home” consumer experience, I also have a Sony A95K.
Although I spend 90% of my time on Baselight, there are a few other post software necessities for my craft. I call my machine the “Swiss army box,” a Supermicro chassis with four Nvidia A6000s. I use this machine to run Resolve, Mistika, Photoshop and Nuke. It also makes a fine dev box for my custom Python tools.
I always say, “It’s not the sword, it’s the samurai.” Use the right tool for the right job, but if you don’t have the right tool, then use what you’ve got.
Do you work in the cloud? If so, can you describe that workflow and the benefits?
Not really. For security reasons, our machines are all air-gapped and not connected to the outside world. One cloud tool I do use quite a bit is Frame.io, especially for the exchange of notes back and forth. I really like how everything is integrated into the timeline. It’s a super-efficient way to collaborate. We also archive finished projects and raw scans to the cloud, but that’s a different story.
I do think cloud workflows are gaining steam though, and I definitely have my eye on the space. I can envision a future where we send a calibrated Sony X3110 to a client and then use Baselight in the cloud to send JPEG XS straight to the display for remote approvals. It’s a pretty slick workflow, and it also gets us away from needing the big iron to live on-prem.
Working this way takes geography out of the equation too. I would love to work from anywhere on the planet. Bring on the Tiki drinks with the little umbrellas somewhere in the tropics with a laptop and a Mini Panel. All joking aside, it does open the talent pool to the entire world. You will be able to get the best artists regardless of their location. That’s an exciting prospect, and I can’t wait to see what the future holds for this new way of looking at post.
Do you often create LUTs for a project? How does that help?
I mostly work with curves and functions to do my transforms, but when on-set or editorial needs a preview of what the look will be in the room, I do bake LUTs out. They are especially critical for visual effects reviews and dailies creation.
There’s a film project that I’m working on right now. We’re doing a scan-once workflow on that show to avoid overly handling the negative. Once scanned, there is light CDL grading, and a show LUT is applied to the raw scans to make editorial media. The best looks are the ones that have been developed early and help to maintain consistency throughout the entire workflow. That way, you don’t get any surprises when you get into the final grade. Temp love is a thing… LUTs help you avoid loving the wrong thing.
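“Baking out” a look generally means sampling the grade over an identity lattice and writing the result as a 3D LUT. Here is a rough sketch that writes a Resolve-style .cube file; the grade function is a stand-in for illustration, not the show look Daro describes.

```python
import numpy as np

def bake_cube(grade_fn, path, size=33):
    """Sample grade_fn over an identity RGB lattice and write a .cube 3D LUT.
    grade_fn maps an (N, 3) array of 0-1 RGB values to graded RGB."""
    steps = np.linspace(0.0, 1.0, size)
    # .cube files expect red to vary fastest, then green, then blue
    b, g, r = np.meshgrid(steps, steps, steps, indexing="ij")
    lattice = np.stack([r, g, b], axis=-1).reshape(-1, 3)
    graded = np.clip(grade_fn(lattice), 0.0, 1.0)
    with open(path, "w") as f:
        f.write(f"LUT_3D_SIZE {size}\n")
        for rr, gg, bb in graded:
            f.write(f"{rr:.6f} {gg:.6f} {bb:.6f}\n")

# Stand-in "show look": a mild gamma/contrast tweak
bake_cube(lambda rgb: np.clip(rgb, 0, 1) ** 1.1 * 1.05, "show_look.cube")
```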
Do you use AI as part of your daily job? In what way?
I do use a bit of AI in my daily tasks, but it’s the AI that I’ve written myself. Originally, I started trying to make an automated dust-buster for film restoration. I failed miserably at that, but I did learn how to train a neural net, and that led to my first helpful tool. I used an open-source image library to train an AI up-rezzer. Although this is commonplace now, back then, it was scratching an itch that hadn’t been scratched yet. To this day, I do think my up-rezzer is truer to the image and less “AI”-feeling than what’s available off the shelf.
After the up-rezzer, I wrote Match Grader in 2020, which essentially takes the look and vibe from one shot and applies it to another. I don’t use it for final grading, but it can be very useful in the look-dev process.
Building on what I had learned coding Match Grader, I subsequently developed a process to use machine vision to create a depth channel. This turns your Power Windows from circles and squares into spheres and cubes. It is a very powerful tool for adding atmosphere to images. When these channels are available to me, one of my favorite moves is to desaturate the background while increasing the contrast in the foreground. This adds dimension to your image and helps to draw your eye to the characters where it was intended.
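As a toy illustration of that move, desaturating the far field while adding contrast up close from a depth channel can be expressed in a few lines of NumPy. The pivot, desaturation and contrast amounts below are arbitrary; this is not Daro’s actual tool.

```python
import numpy as np

REC709_LUMA = np.array([0.2126, 0.7152, 0.0722])

def depth_grade(rgb, depth, pivot=0.5, desat=0.5, contrast=1.2):
    """rgb: (H, W, 3) floats 0-1; depth: (H, W) with 1.0 nearest the camera."""
    near = np.clip((depth - pivot) / (1.0 - pivot), 0.0, 1.0)[..., None]
    far = 1.0 - near
    luma = (rgb @ REC709_LUMA)[..., None]
    out = luma + (rgb - luma) * (1.0 - desat * far)                # drain color from the background
    out = (out - 0.18) * (1.0 + (contrast - 1.0) * near) + 0.18    # push contrast in the foreground
    return np.clip(out, 0.0, 1.0)

frame = np.random.rand(270, 480, 3).astype(np.float32)   # stand-in image and depth map
depth = np.random.rand(270, 480).astype(np.float32)
graded = depth_grade(frame, depth)
```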
These channels can also aid in stereo compositing, but it’s been a minute since I have had a 3D job cross my desk that wasn’t for VR.
Lately, I have been tinkering with an open-source library called YOLO (You Only Look Once). This software was originally developed for autonomous driving, but I found it useful for what we do in color. Basically, it’s a very fast image segmenter. It returns a track and a matte for what it identifies in the frame. It doesn’t get everything right all the time, but it is very good with people, thankfully. You wouldn’t use these mattes for compositing, but they are great for color, especially when used as a garbage matte to key into.
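For the curious, the off-the-shelf route to a person matte looks roughly like this, assuming the open-source ultralytics package and its pretrained YOLOv8 segmentation weights. Daro’s in-house integration isn’t public, so treat this as a generic sketch.

```python
import numpy as np
from ultralytics import YOLO   # assumes the open-source `ultralytics` package

model = YOLO("yolov8n-seg.pt")          # pretrained segmentation weights
result = model("frame_0001.png")[0]     # one frame in, one result out

person_matte = None
if result.masks is not None:
    for mask, cls in zip(result.masks.data.cpu().numpy(), result.boxes.cls):
        if model.names[int(cls)] != "person":
            continue
        # Union of every detected person -- rough, but plenty for a garbage matte
        person_matte = mask if person_matte is None else np.maximum(person_matte, mask)
```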
I have also recently refreshed my AI up-rezzer. I built in some logic that is somewhat “intelligent” about the source coming in. It can auto-detect interlace and cadence now and can perform a general analysis of the quality of the picture. This allowed me to throttle the strength and end up with the perfect amount of enhancement on a case-by-case basis. The new tool is named SamurAI.
If given an example from another show or piece of art, what is the best way to emulate that?
It’s good to be inspired, but you never want to be derivative. Often, we take many examples that all have a common theme or feeling and amalgamate them into something new.
That said, sometimes there are projects that do need a literal match. Think film emulation for a period effect. People can approach it in two ways. The first — the most direct route, though also the more involved — is to get hold of some of the stock that you are emulating. Next, you expose it with color and density patches and then develop and measure the strip. If you read enough points, then you can start to interpolate curves from the data (there’s a sketch of that interpolation step at the end of this answer). FilmLight can help with this, and back in my lab days, that is exactly the software we used. Truelight was essential back in the early days of DI, when the “I” was truly the intermediate digital step between two analog worlds.
The second way I approach this task would be to use my Match Grader software. I can push the look of our references to some of the production footage. Match Grader is a bit of a black box in that it returns a completed graded image but not the recipe for getting there. This means the next step would be to bring it into the color corrector and match it using curves, keys and scopes. The advantage of doing it this way instead of just matching it to the references is that you are working with the same picture, which makes it easier to align all the values perfectly.
Oh, you can also just use your eyeballs.
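A hedged sketch of the curve-interpolation step from the first approach: given measured densities from the exposed test strip (the numbers below are invented for illustration), a monotone interpolator fills in a smooth characteristic curve to build the emulation from.

```python
import numpy as np
from scipy.interpolate import PchipInterpolator

# Hypothetical patch readings from a developed test strip (one dye layer):
# log exposure of each patch vs. measured density.
log_exposure = np.array([-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0])
density      = np.array([0.25, 0.30, 0.55, 1.05, 1.60, 2.05, 2.25])

# Monotone cubic interpolation avoids overshoot between patches;
# the more points you read, the closer the curve gets to the stock.
curve = PchipInterpolator(log_exposure, density)

fine_exposure = np.linspace(-2.0, 1.0, 256)
emulated_densities = curve(fine_exposure)   # sampled curve to feed a film-emulation LUT
```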
Do your workflows include remote monitoring?
Not only do they include it, but there was a time in the not-too-distant past when that was the only option. We use all the top solutions for remote sessions, including Streambox, Sohonet ClearView, Colorfront and T-VIPS. The choice really comes down to what the facility on the catching side has and the location of the client. At the moment, my preference is Streambox. It checks all the boxes, from 4K to HDR. For quick approvals, ClearView is great because all we need on the client side is a calibrated iPad Pro.
What film or show or spot resonates with you from a color perspective?
Going back to my formative years, I have always been drawn to the austere beauty of Gattaca. The film’s use of color is simply flawless. Cinematographer Sławomir Idziak is one of my favs, and he has had a profound influence on my work. I love the early flashback, in particular. I have been gravitating in that direction ever since I saw the picture.
You can see a bit of Gattaca‘s influence in my own work on Steven Soderbergh’s Magic Mike and even a little bit on the animated film The Sea Beast, directed by Chris Williams.
I am always looking for new ways to push the boundaries of visual storytelling, and there are a ton of other films that have inspired me, but perhaps that’s a conversation for another time. I am grateful for the opportunity to have worked on projects that have allowed me to explore my own artistic vision, and I hope that my work will continue to evolve, inspire and be inspired in the years to come.
FilmLight’s Peter Postma
FilmLight develops color grading systems, image processing applications and workflow tools for film and video post.
What tools do you offer for the color-grading pipeline?
FilmLight is best known for its Baselight grading system, which offers a comprehensive set of tools for color grading, finishing and conforming. But FilmLight offers tools for every stage of a color-grading pipeline: previsualization and look development with Baselight Look, live on-set grading with BLG Tools, dailies processing with Daylight and editorial and VFX integration with Baselight for Avid, Nuke and Flame.
How does your solution use AI? Should people be worried about AI or embrace it to help with repetitive tasks?
Baselight 6.0 provides a face-tracking tool and an optical flow retimer as the first of many tools to use machine learning/AI. We aim to integrate such tools into Baselight in a way that keeps colorists and filmmakers in control of the image while enabling them to deliver looks that would otherwise be impossible or impractical given time and budget constraints.
What is beyond HDR?
There are lots of ways to provide higher-fidelity images – super-high resolutions greater than 8K, high frame rates, HDR with wider color gamuts and high-contrast range. Not all of these make sense everywhere. For example, there is probably no need for an 8K phone screen, but it is needed for special large venues. Likewise, not every project needs to take advantage of these different formats, but we support filmmakers who push the limits to widen the palette of options they have for telling their stories.
Do you offer color in the cloud?
Yes, we’ve worked hard to ensure Baselight can be used in the cloud, with an experience that is as seamless and interactive to the colorist and the creative team as working with local hardware.
Are your users working remotely?
Yes, most often colorists are working locally and streaming to a remote client, but we also have colorists who work remotely from the hardware running Baselight (either via cloud service providers or a private post facility). We’ve integrated image streaming directly into Baselight so our customers are not reliant on third-party tools to enable remote workflows.
How does your solution embrace ACES workflows?
Support for ACES is fully integrated into all our software. FilmLight was active in the early development and testing of ACES more than 15 years ago and continues to support updates and participate in ongoing development with the ACES committee.
What trends are you seeing, and what does the future hold in relation to color?
As the traditional lines between what you can do in a color-grading suite versus a VFX house have blurred over the years, the lines between preproduction, production and post also continue to blur. Color work can begin as soon as assets are produced for use in virtual production, or even before. We at FilmLight provide the tools to help all the different departments involved in creating the image collaborate effectively and work toward the common goal of bringing the filmmakers’ visions to screens.
Goldcrest Post Colorist Alex Berman
Goldcrest Post in NYC provides post for episodic television, narrative feature films, indies and documentaries. It serves major studios and streamers, such as HBO, Netflix, Apple TV+, Amazon Prime and Hulu, as well as the traditional broadcast networks.
“It’s a special place. It’s an exquisite boutique yet at the same time a powerhouse post facility. It starts from the top. Our owner, Nick Quested, is a filmmaker himself.”
What does your setup look like toolswise?
I use a Linux operating system, and I grade on DaVinci Resolve with the Advanced Panel.
Do you work in the cloud? If so, can you describe that workflow and the benefits?
I am not currently working in the cloud…my head is clear. But seriously, we are looking into it, and it will likely be incorporated into our workflows soon.
Do you often create LUTs for a project? How does that help?
I do create LUTs. At Goldcrest, we have our own collection of proprietary LUTs: input, output and look modifiers. I love my look modifiers, and so do my clients. LUTs are designed to help the colorist, but there are certain instances when you can achieve what you are going for without the use of LUTs. I’ve seen other excellent colorists work with and without LUTs. The final image is what matters.
For example, I recall a moment in my career when a (very high-end) vendor was testing multiple top colorists in New York to award a big account. They gave us each the same reel of a narrative film to grade however we thought best. They already had a relationship with one of the companies, so to avoid giving them any further advantage when they presented the work to the executives, the vendor kept each submission anonymous. Simply 1, 2, 3 and 4.
I won, and the company I worked for was awarded a great deal of work after that, not just for me but for other departments as well. And guess what – I did not use a LUT for that one.
Do you use AI as part of your daily job? In what way? Helping with repetitive tasks?
I do not use AI. I am reading and exploring different services, so perhaps in the future I will.
If given an example from another show or piece of art, what is the best way to emulate that?
An example from another show or a piece of art is a good visual representation that can help a conversation and help explore where to go with the imagery I have in front of me. Many of the images I make are illusory; others are based on real life. There are so many ideas, so a visual reference can serve as a guide to what a client had in mind.
Do your workflows include remote monitoring?
Yes, my workflows sometimes involve remote monitoring. Just this past week, we sent an iPad to a client in Los Angeles and had multiple live sessions using Streambox. On another recent occasion, I was doing a P3 theatrical grade in New York and had my client simultaneously set up in a theater in Prague.
What film or show or spot resonates with you from a color perspective?
I enjoyed the color in Tron: Legacy. Tons of others, of course… the technical precision of The Irishman comes to mind. Hugo was beautiful. There are so many I could mention for various reasons. Recently Top Gun: Maverick was a winner too.
Blackmagic Design’s Bob Caniglia
Blackmagic Design’s DaVinci Resolve offers editing, color, visual effects, motion graphics and audio post in one software application.
What tools do you offer for the color grading pipeline?
DaVinci Resolve combines editing, color correction, visual effects, motion graphics and audio post production all in one software tool.
It has been used to color and finish more high-end feature films and television shows than any other system and is being used by everyone from students and online creators to professional Hollywood colorists.
DaVinci Resolve is now in version 18.5 and is available as a free version, or you can get DaVinci Resolve Studio for $295. We also offer a number of hardware control products including:
- DaVinci Resolve Micro Panel, a low-profile panel that features three high-resolution trackballs and 12 precision-machined knobs to access the primary color-correction tools. It also includes buttons for common features and workspace navigation.
- DaVinci Resolve Mini Panel which includes the features of the Micro Panel model plus two LCDs with menus and buttons for switching tools, adding color corrector nodes, applying secondary grades and using Power Windows.
- DaVinci Resolve Advanced Panel, which features a massive number of controls for direct access to every DaVinci color-correction feature.
How does your solution use AI? Should people be worried or embrace it to help with repetitive tasks?
First, let me say that I welcome our AI overlords. Sorry, I kid. You should not be worried. You don’t need to fear AI in the post production world. I think a lot of that comes from people worried about AI in general. But that is a different topic than people worrying that AI in post production will take their jobs.
Using AI in Resolve is not about how the software can create things for you. It is about how AI can support creatives. Resolve has had AI for years now, ever since we announced the DaVinci Neural Engine in 2019.
At its core, the DaVinci Neural Engine provides simple tools to solve complex, repetitive and time-consuming problems. For example, it can do things like reframe shots, and it enables facial recognition to sort and organize clips into bins based on the people in each shot.
Resolve uses AI to recognize facial features and isolate objects in images. It can look at a picture and distinguish between eyes and cheeks or recognize a dog versus the kid walking the dog. It can detect all these different things using AI to work out all the pieces and then create Power Windows for them. You can then color-correct each of those elements within its own window. This can save a huge amount of time and let the colorist focus on creating.
Some of the AI tools that colorists can use with DaVinci Neural Engine include our new Relight FX tool, which lets you add virtual lighting to a scene to creatively adjust environmental lighting, fill dark shadows or change the mood. There is also the new SuperScale upscaling algorithm that creates new pixels when increasing the resolution of an image. And there are a number of other AI features around facial recognition, object detection, smart reframing, speed warp retiming, auto color and color matching.
What is beyond HDR?
The XDR technology that Apple is driving with its Pro Display XDR is very interesting. Seeing an image’s true color and working in a format that is similar to how human eyes see an object is very exciting. But HDR and SDR workflows are still going to be around and needed for a long time.
Do you offer color in the cloud?
DaVinci Resolve has been able to offer color correction in the cloud since 2022, when we announced version 18 and its support of Blackmagic Cloud. This allows teams to collaborate remotely through cloud-based workflows. You can host project libraries using Blackmagic Cloud and collaborate on the same timeline, in real time, with multiple users globally.
With Blackmagic Cloud, you simply create a Blackmagic Cloud ID to log into the DaVinci Resolve Project Server and set up a project library for your project. Any number of collaborators can then be assigned to the project and share it through Blackmagic Cloud. Multiple people can work on the same timeline, and when changes are made, you can see and accept them in the viewer. Changes are only applied when you accept updates. Also, built-in timeline-compare tools let you merge changes into a master timeline so others can continue with edits.
You can collaborate with editors, colorists, visual effects artists and sound engineers all working together at the same time. And you no longer have to import and export files, translate projects, lose work or conform and manage changes.
Are your customers working remotely? Remote monitoring and color?
Absolutely. DaVinci Resolve has been used both on-set and remotely for years. More recently, in 2022, we announced DaVinci Resolve for iPad, so creators can extend video workflows in new ways and new places. This new version is optimized for multi-touch technology and Apple Pencil. It features support for cut and color pages and can take advantage of Blackmagic Cloud.
For remote monitoring, one of our newest features in 18.5 is the ability to initiate remote monitoring using just a Blackmagic ID and a session code. You can enable remote monitoring in DaVinci Resolve and share the code without having to deal with IP addresses and port forwarding. Users can stream to multiple computers, iPads or even iPhones all at the same time. You can also export a timeline to the Blackmagic Cloud using our new Presentations feature. With Presentations, multiple people can review their timeline, leave comments and live-chat. Comments will appear as markers on their DaVinci Resolve timeline, allowing users to act on feedback quickly.
How does your solution embrace ACES workflows?
DaVinci Resolve supports our own color management and ACES. To set up an ACES workflow in Resolve, open project settings from the file menu and click on color management. You get options for setting up color science along with input, timeline and output color transformations, tone mapping and LUT application.
What trends are you seeing, and what does the future hold in relation to color?
I think most people in the industry will say AI is one of the hot trends, and there is truth in that. But AI has been part of color correction for years, so that is no great surprise. Greater collaboration and use of color in the cloud is another trend.
The future will be color everywhere, with everyone in the post pipeline and DPs, DITs and others on-set working with or at least understanding color correction. The days of invisible walls separating edit, color, audio and VFX are over. Ever since we began offering all of those in DaVinci Resolve, and for free, post pros have been able to learn all of them easily. And even if they never consider themselves expert colorists, the basic use and understanding is there.
Combine that with the increasing use of cloud-based workflows like Blackmagic Cloud and being able to work in real time with things like DaVinci Resolve for iPad, and we will keep seeing color correction everywhere.
Nice Dissolve Colorist Joseph Mastantuono
Joseph Mastantuono is an independent producer and colorist who works with Nice Dissolve in Brooklyn. “We specialize in independent cinema, both narrative and documentary. Two recent films that I have worked on are The Feeling That the Time for Doing Something has Passed, which premiered at Directors’ Fortnight at the Cannes Film Festival, and Our Father, The Devil, which premiered at the Venice Film Festival, won the Audience Award at Tribeca and was nominated for best feature at this year’s Indie Spirit Awards.”
What does your setup look like toolswise?
We have a 4K color-grading theater in Brooklyn, and I have a home studio with a reference display as well. I grade with DaVinci Resolve, and I love Blackmagic’s Mini Panel.
Do you work in the cloud? If so, can you describe that workflow and the benefits?
We use the Blackmagic Cloud for project syncing. That, along with virtual drive-mapping, means I can easily work from home, and projects stay completely in sync.
Do you often create LUTs for a project? How does that help?
I like creating look LUTs for dailies so the DP and editor get an approximation of what the final project will look like, but we usually ditch them for the final grade.
Outside of the color space transform LUTs that you use for technical reasons, I think of most LUTs as store-bought tomato sauce. Sure, it will make a decent meal in a pinch, but if you want something really special, you want to spend the time making it from scratch.
Do you use AI as part of your daily job? In what way?
I have yet to dabble too much with AI outside of some experiments. I think there are going to be useful tools, but we often treat productivity as everything. The reality is that productivity adds speed, and sometimes when we go too fast, we get into trouble.
If given an example from another show or piece of art, what is the best way to emulate that?
I often don’t think about how to match it exactly, but I try to drill down with the director and cinematographer into what exactly about that piece of art or film speaks to them. A big part of my job is to develop an aesthetic language with them.
It’s productive to have an emotional conversation with my clients. I find it far more helpful for my process if I hear, “This scene doesn’t feel tense enough” rather than “We need it bluer.”
Do your workflows include remote monitoring?
We have used remote monitoring and remote grading often. While being in the same theater at the same time is great, schedules sometimes make that impossible. If we have a hard deadline but a DP is out of town or, as in a recent case, a client has COVID, we can still do live sessions, and that’s been invaluable. It’s nearly impossible to do a look-setting session or a final polish pass without that live feedback from the client.
What film or show or spot resonates with you from a color perspective?
Recently, I’ve been impressed with the color on the show Poker Face. The look of that show is really special and implies the texture of film without being overly nostalgic. I think texture is something often missing when we think about color. I think there’s an assumption that the texture of a film is decided solely by the camera and lens, but there’s a lot we can do in post. I think that Steve Yedlin and his team on that show have really been pushing the envelope.
However, I like to try to get a lot of influence for color outside of film. I think in the film world, we often just stay in our medium, and I think that this just leads to an endless reference cycle and no new ideas. Lately I’ve been really thinking a lot about the painter Noah Davis. His use of light and color are incredible.
Assimilate’s Jeff Edson and Mazze Aderhold
Assimilate makes color grading and finishing software, as well as compositing tools in HD, 4K, 8K, stereo, HDR, VR and beyond.
What tools do you offer for the color grading pipeline? What about color for virtual production?
Jeff Edson: Assimilate offers a whole range of tools to handle color across the entire production process.
The pipeline starts with Live Looks, our live-grading solution, which can be used to live-grade cameras via SDI, NDI and common LUT boxes, but also virtual production volumes by using the LED processor as a LUT box — thereby changing the colors on the volume in real time.
Live Assist, which is our VTR solution for record and playback as well as quick preview comps on-set, is a superset of Live Looks and can convert any incoming image signal in terms of color space, transfer curve and creative look.
With Scratch we provide extensive image transforms and color management for dailies processing, finishing and mastering.
By supporting a wide range of color spaces and transfer functions, you can really go from anywhere to anywhere.
Mazze Aderhold: We’ve taken color management to the next level with our Live FX. It can handle the color management for the entire image pipeline on-set. Whether the content is coming from Unreal or a 2D, 2.5D or 180/360 equirectangular clip, the input can be handled accordingly to be displayed correctly on the LED wall — also in HDR. For that, VP operators can not only use the built-in color management but also any kind of LUT, CDL or any of the grading tools inside Live FX, which all operate in full floating point. Live FX can also control DMX-based stage lighting via ArtNet or sACN. With that, even stage lighting can become fully color space-aware, with the correct tonal responses, according to what is shown on the LED wall.
This is a first in our industry and greatly enhances the illusion created by modern virtual production studios.
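To make the stage-lighting link concrete: driving a DMX fixture over Art-Net ultimately comes down to sending small UDP packets like the one below. This is the published ArtDmx packet layout with a hypothetical RGB fixture on channels 1–3 and an illustrative node address; it is not Assimilate’s code.

```python
import socket
import struct

def artdmx_packet(universe, dmx_data, sequence=0):
    """Build an Art-Net ArtDmx packet (protocol version 14)."""
    pkt = b"Art-Net\x00"
    pkt += struct.pack("<H", 0x5000)           # OpCode ArtDmx, little-endian
    pkt += struct.pack(">H", 14)               # protocol version, hi/lo
    pkt += bytes([sequence, 0])                # sequence, physical port
    pkt += struct.pack("<H", universe)         # SubUni + Net (15-bit port address)
    pkt += struct.pack(">H", len(dmx_data))    # DMX data length, hi/lo
    return pkt + bytes(dmx_data)

# Hypothetical: one RGB fixture on channels 1-3, driven to match the wall color
rgb = [200, 90, 40]
packet = artdmx_packet(universe=0, dmx_data=rgb + [0] * (512 - len(rgb)))
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(packet, ("192.168.1.50", 6454))    # node IP is illustrative; 6454 is the Art-Net port
```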
How does your solution use AI? Should people be worried about AI or embrace it to help with repetitive tasks?
Edson: Right now, we are not offering any AI-enhanced tools for virtual production inside Live FX. The reason is that in many cases, you want 100% reproducible scenarios, which is often not possible with AI — especially on the creative side of things.
In other cases, like color management, AI is not required. Everything is based on a set of 100% defined functions and matrices, which cannot be enhanced any further by throwing some AI onto them.
More generally speaking, though, AI is a double-edged sword. Yes, it can greatly enhance our creativity by taking care of mundane, repetitive tasks, thus leaving more space to be creative. At the same time, AI develops further and further with massive leaps, and that can be worrisome. Current training models are improving at an incredible speed.
We really see AI as being a technology that does not replace storytelling. It advances capabilities.
What is beyond HDR?
Aderhold: That’s a question for TV manufacturers, I believe, as they are typically the driving force behind new display technologies. At the end of the day, however, the goal is to create more realism — copy the real world on a display to the point where you can’t distinguish between the two anymore.
HDR has taken this to a new level in terms of color and dynamic range. Still, looking at the XYZ horseshoe diagram, the P3 color space, which is widely used with HDR, is not even close to covering the entire gamut the human eye can see.
There are technologies being developed that we are also involved in, like 6P Color. It’s a new display technology developed by 6P Color, with research conducted at Baylor University. It effectively introduces three more primaries to the existing three and adapts modern LED display technology with additional LEDs to cover more gamut than current RGB-only displays. Since 6P Color treats color as a colorimetric data point, its Full Color Range system could process any number of additional primaries beyond RGB based on the display technology’s capabilities.
So I guess the answer is… yet better pixels!
Do you offer color in the cloud?
Edson: We are working closely with Amazon to have our finishing and dailies transcoding hosted on AWS. We can also host Live FX in the cloud, but getting live signals there with zero latency is still challenging. Once technology gets there, however, Live FX will be ready.
Are your users working remotely? Remote monitoring and color?
Aderhold: By now, most of our users are back working on the box directly. However, we do offer remote grading for Scratch and Live FX as well as NDI and direct RTMP output to stream across the web.
With remote grading, artists can connect their grading panel of choice locally, and the commands get sent over to the host via network protocol. The connection itself is easily established via a secret token the user must enter on the remote system to connect to the host.
The remote session is not a one-way street, and it also is not limited to two endpoints. In fact, multiple people around the globe can log onto the remote session simultaneously and work together on a project. For communication’s sake, every participant gets a so-called “digital laser pointer” to point out things in the image. A session has one moderator who can apply changes to the image and an unlimited number of participants.
The role of the moderator can be passed on to another participant at any point. Grading changes are applied instantly.
If the footage resides on any endpoint, the system will use it. Otherwise the participants will get a high-quality JPEG 2000 stream to look at.
For a more passive remote session, the colorist or VP operator on-set can simply stream out RTMP or an NDI stream of the live signal via Zoom, OBS or any other tool that can take in NDI or a webcam-like signal.
How does your solution embrace ACES workflows?
Edson: Every one of our products fully supports ACES at any point in the pipeline. They can process media on disk as well as incoming live signals in the ACES color space, with the appropriate input and output transforms applied. With that we can, for instance, take in a live SDI signal in Alexa Wide Gamut/ARRI LogC4, convert it into ACEScct and apply a grade in that space before converting it to, say, Rec. 709/gamma 2.4 on the monitor output.
Same for virtual production workflows, where we deal with any kind of footage in any color space imaginable. Whether it’s baked ProRes or NotchLC content or even camera RAW footage, we can move everything into ACES space. From there we can convert to HDR for LED wall output or linear spaces for accurate stage lighting or to SDR for live NDI output — or all at the same time.
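For readers who want to see what grading “in ACEScct” means numerically, the encoding from linear ACES values into the ACEScct working space is defined in the ACES spec (S-2016-001) as a pure log curve with a short linear toe. A small sketch of just that math — not Assimilate’s implementation:

```python
import numpy as np

A, B = 10.5402377416545, 0.0729055341958355    # linear-segment slope and offset
X_BRK = 0.0078125                              # linear/log breakpoint

def lin_to_acescct(x):
    x = np.asarray(x, dtype=np.float64)
    return np.where(x <= X_BRK,
                    A * x + B,
                    (np.log2(np.maximum(x, X_BRK)) + 9.72) / 17.52)

def acescct_to_lin(y):
    y = np.asarray(y, dtype=np.float64)
    return np.where(y <= A * X_BRK + B,
                    (y - B) / A,
                    np.exp2(y * 17.52 - 9.72))

print(lin_to_acescct(0.18))   # 18% gray sits around 0.4135 in ACEScct
```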
What trends are you seeing, and what does the future hold in relation to color and color in virtual production?
Aderhold: Color is becoming a much broader aspect in virtual production. It is not only about the color of the digital content provided anymore. It is also about the color capabilities of the LED wall, the LED processor, the monitoring on-set and, of course, accurate color representation of the stage lighting.
Where it becomes challenging is when you have to connect all the different hardware and software parties in a virtual production workflow into one coherent color pipeline.
This is where we’ve put Live FX at the heart of everything to guarantee a solid workflow that can be handed off properly into post.
The biggest trend right now is improving stage lighting. That is, making it accurately reflect what’s being shown on the volume — and do that in real time. For this, the upcoming version 9.7 of Live FX will have a completely redesigned DMX interface and deeper integration with common lighting consoles.
AJA Video’s Tim Walker
AJA’s ColorBox LUT conversion device is used for broadcast, live events and on-set applications.
What tools do you offer for the color grading pipeline?
AJA offers a range of tools to support color-grading pipelines. Our HDR Image Analyzer 12G is a solution for analyzing color that is powered by the Colorfront Engine. You can use HDR Image Analyzer 12G during the grading process to analyze SDR or HDR content and ensure that you’re meeting certain production specs, or you can even load your own LUT to evaluate a particular look. It supports a host of camera log formats and provides visual cues through its analysis tools that help professionals through the final color-grading process.
It also supports the output of its test and measurement features up to 4K resolution and HDR color and offers Waveform Lumi Color. This feature applies the color being measured to the actual waveform, so instead of seeing gray lines on a waveform, you’re seeing the actual color being represented as the waveform.
We also offer AJA ColorBox, which I like to call a “LUT box on steroids.” It has incredibly high image-processing quality and seven nodes of processing, including its 3D LUT processor. Colorists and DPs can load their own looks into ColorBox and quickly play video through it. This allows them to evaluate many looks quickly without having to completely render out content, which can expedite look generation.
Within ColorBox, which can process up to 4K 30p 12-bit RGB over a single wire, there are several features that can be used to help generate looks. A frame store inside the unit supports 16-bit TIFF files and allows users to load in their own reference stills and run them through the AJA color pipeline. Those looks can then be applied downstream, evaluated against reference images and potentially put into a device like HDR Image Analyzer to further examine the look. People could also use this feature on-set to capture reference stills. You can also capture a preprocessed camera log output to be used later in color. Additionally, you could opt to capture the post-processed image in ColorBox and share that as well, streamlining communication between the on-set team and post.
On-set, a tool like ColorBox can receive the log output of a camera on a DIT cart, where a 3D LUT can be applied. It can be converted from the camera log color space to a display format with that look applied. From there, the DP and DIT can gain control and manipulate the look through third-party apps like Pomfort Livegrade Pro and Assimilate Live Looks and adjust the color decision list to modify the look to fit the scene they’re working on. That information can then be shared with the dailies team. This approach ultimately benefits the post process, where on-set looks can be provided to the post team to refer to as a starting point for the grade established on-set.
AJA also develops mobile and desktop video I/O solutions via our Kona and Io product lines that can be used in the color-grading pipeline with third-party products that need to get SDI in and out of popular color-grading applications.
What is beyond HDR?
It’s challenging to look at or define what is beyond HDR because HDR itself is still so new to the color-processing pipeline. There is room for further education, refinement and growth, especially as more productions embrace it and HDR moves closer to the set. The industry as a whole needs to get more comfortable with HDR from on-set through editorial and final finishing before we contemplate what is “beyond HDR.” Hopefully, tools like ColorBox make HDR more accessible to more pros in an economical and meaningful way to help move the industry forward.
Are your users working remotely?
We’ve seen our products used in many ways for remote workflows. For instance, outputs from editing or color suites might be connected to an AJA Helo Plus H.264 streamer/recorder and live-streamed to another creative decision-maker across the world. For facilities looking to stream multiple HD workstation outputs or 4K/Ultra HD outputs with remote creatives and clients, they could also employ a solution like Bridge Live.
Also, for post teams that require critical monitoring insight, HDR Image Analyzer 12G offers multiple options for remote use and configuration: a built-in web-based interface gives users access to every configurable parameter of the device, and Remote Desktop Protocol makes the entire test and measurement toolset available to remote users.
How does your solution embrace ACES workflows?
AJA ColorBox supports ACES workflows via third-party technology integrations. It can be used with tools like Pomfort Livegrade Pro and Assimilate Live Looks, which manage conversions into and out of ACES. ColorBox assists with look management, with the processing taking place in the ACES color space. FS-HDR is another AJA product that’s being used on-set. Through the Colorfront Engine and Film Mode, users can put the product into an ACES grading space and complete grading in ACES using the Colorfront Engine controlled within FS-HDR or via Livegrade. Supporting ACES is a priority, and you can expect more from us on this front in the future.
What trends are you seeing, and what does the future hold in relation to color?
We’re going to continue hearing more about Dolby Vision. The HDR format has quickly become available across consumer displays, providing compelling viewing experiences that audiences have come to expect. As more distributors have begun to embrace the format, there are conversations within the creative community about the benefits of having earlier access to Dolby Vision in the color-grading pipeline, like on-set. Because it impacts the final look of content and the viewer experience, there’s a need to be able to manage and preserve the creative intent earlier in the color-grading pipeline and ensure everything looks as it should.
Alongside the continued rise of HDR, color conversion and management are rapidly evolving. There are so many different approaches to and products for managing color and color conversion, depending upon the type of production, the director’s vision, the DIT and more. We’ve seen several products roll out with color management built in during the last few years, but there’s still demand within the industry to ensure more consistency in how HDR looks are managed and converted across the chain.
Then there is virtual production, which is a great advancement that simplifies many aspects of a production. You can shoot multiple locations all day versus obtaining multiple permits to shoot elsewhere, and you’re not relegated to shooting within a set window due to natural elements like the sunset. It also introduces new opportunities with elements like previsualization and color management. There is now a need to match the LED volume graphics in the virtual production to design elements in the foreground.
Currently, LED processors have some color-correction capabilities, but putting that burden on the LED processor may not be the most efficient path. Re-rendering graphics is another option, which might be an acceptable approach. This is where technology like AJA ColorBox can be helpful, as DITs and DPs could use it on-set to color-correct or better color-match the LED volume colors to on-set items, ensuring more seamless blending of the foreground and LED volume.
We’re also going to see more HDR on-set and, with it, increased demand for color management to move on-set. While on-set look management with DITs is more common in US productions, it is still catching on in other parts of the world and presents a growing opportunity internationally. Recognizing the importance of the DIT, major content providers are training people in the application and role. As this trend continues, it’s going to become even more common, and the technologies and workflows will become more accessible.
All these trends will ultimately make for more dynamic content that aligns with the creator’s vision and intent. Whichever way you slice it, the future’s looking bright.
MTI’s Tanner Buschman
MTI Film works on television, features and feature restorations. Recent Buschman projects include Sugar, American Horror Story and Walker: Independence.
What does your setup look like toolswise?
We use Nucoda for color and Sony X300 as our reference monitor.
Do you work in the cloud? If so, can you describe that workflow and the benefits?
Pertaining to color, everything is local, although our clients use cloud services for remote review. We use the cloud more on the edit side.
Do you often create LUTs for a project? How does that help?
I often create LUTs or modify existing LUTs. When I start a project, it’s important to get the look locked in from day one. That’s what makes camera tests so important from a color perspective. The best-case scenario is when we can dial in the looks of the show and create a show LUT, or series of LUTs, that reflects the final intent of the DP. The DP can then shoot through that LUT and keep cohesiveness on-set. The LUT can also be applied to dailies so that the creative editors and others can see the intent and not be confused when, for example, a warm scene becomes a cold scene. When you have a dialed-in show LUT, you can focus more on the creative and final color adjustments.
Do you use AI as part of your daily job?
Not right now.
If given an example from another show or piece of art, what is the best way to emulate that?
If a client says, “I want to recreate a film stock,” or “I want to recreate the look of a certain film,” I’ll bring it into my box, look at it in the scopes and see what’s going on. I can then emulate it by lining up the reference media to the scopes. Scopes turn imagery into mathematical expressions. They read information truthfully, unlike the human eye. Our eyes balance color naturally, especially when you are comparing images. Scopes are absolute.
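As a toy illustration of “turning imagery into mathematical expressions,” a luma waveform is essentially a per-column histogram of Rec. 709-weighted luminance. The NumPy sketch below is a generic illustration, not Nucoda’s scopes.

```python
import numpy as np

REC709_LUMA = np.array([0.2126, 0.7152, 0.0722])

def luma_waveform(rgb, height=256):
    """rgb: (H, W, 3) floats 0-1. Returns a (height, W) plot: for each image
    column, a histogram of its luma values, black at the bottom and white at the top."""
    luma = rgb @ REC709_LUMA
    rows = np.clip((luma * (height - 1)).astype(int), 0, height - 1)
    wave = np.zeros((height, rgb.shape[1]), dtype=np.int32)
    for col in range(rgb.shape[1]):
        np.add.at(wave[:, col], rows[:, col], 1)
    return wave[::-1]   # flip so bright values plot at the top

frame = np.random.rand(270, 480, 3)   # stand-in frame
scope = luma_waveform(frame)
```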
Do your workflows include remote monitoring?
I do client sessions with people around the world — Uruguay, France and Italy all just this year. One of my clients travels frequently, and I’ll do color sessions with him wherever he happens to be. We use Teradek Core Cloud and Sohonet ClearView. I can feed them high-quality imagery and stream it to the client. We’ll then carry on a conversation by phone and have a color review no different than if he was sitting beside me in the bay.
What film or show or spot resonates with you from a color perspective?
I liked Joker. It was colored by Jill Bogdanowicz. She’s done quite a few good things. The color grade was incredibly stylized. Often when people talk about grades, they’re referring to art direction or cinematography. Joker was graded. It was not a generic look. The other show I’d mention is The Boys, Season 3 — also very impressive.
OkayStudio Colorist Alex O’Brien
OkayStudio is a post house based in Dalston, London. It offers color, offline and visual effects and works on a variety of projects, predominantly commercial work but also short-form content, documentaries, feature films and music videos. O’Brien works mainly from the London studio, but the company also has a color suite in Berlin.
What does your setup look like toolswise?
I use Resolve, an Eizo CG3146 monitor and the Mac Studio M2.
Do you work in the cloud? If so, can you describe that workflow and the benefits?
Working in the cloud is something we considered, but we’re very happy with our current workflow… although the cloud is something we may revisit in future.
Do you often create LUTs for a project? How does that help?
LUTs can be incredibly useful when working with VFX in tandem. We can do an initial look session with clients and then export the LUTs for the VFX artists to apply. This has creative benefits for the clients and can outline any potential issues with the grade before the main session after VFX.
Do you use AI as part of your daily job? In what way? Helping with repetitive tasks?
Not currently, but I am always looking for ways to expand workflows.
If given an example from another show or piece of art, what is the best way to emulate that?
My starting point would be to sit with the director/DP and go through each reference and find out what they’re drawn to. Texture? Tone? Color? We can then build a look and work with the footage and references we have to create a world that is unique but draws on the creative inspiration.
Do your workflows include remote monitoring?
At OkayStudio we offer ClearView for remote attendance, but our preference is always to have clients in the room. I’ve enjoyed the flexibility that remote working offers, as it has given me the opportunity to work with directors and DPs globally.
What film or show or spot resonates with you from a color perspective?
A recent favorite film would have to be Aftersun, graded by Kath Raisch. I love the naturalistic tone of the grade and the nostalgia it brings to the film. It resonates with me, as I really believe a subtle and dialed-back grade can sometimes add more. Another current favorite is the new Yorkshire Tea spot, graded by Hannibal Lang. Colorwise, I love the tone and palette — it’s really fun. It’s also just a really amazing bit of work. Go watch it!