Avatar: The Way of Water Colorist Tashi Trieu on Making the Grade

By Randi Altman

Working in post finishing for 10 years, colorist Tashi Trieu also has an extensive background in compositing as well as digital and film photography. He uses all of these talents while working on feature films (Bombshell), spots (Coke Zero) and episodics (Titans). One of his most recent jobs was as colorist on the long-awaited Avatar follow-up, Avatar: The Way of Water, which has been nominated for a Best Picture Oscar.

We reached out to Trieu, who has a long relationship with director James Cameron’s production company Lightstorm Entertainment, to learn more about how he got involved in the production and his workflow.

We know James Cameron has been working on this for years, but how early did you get involved on the film, and how did that help?
I was loosely involved in preproduction after we finished Alita [produced by Cameron and Jon Landau] in early 2019. I was the DI editor on that film. I looked at early stereo tests with the DP Russell Carpenter [ASC], and I was blown away by the level of precision and specificity of those tests.

Polarized reflections are a real challenge in stereo as they result in different brightnesses and textures between the eyes that degrade the stereo effect. I remember them testing multiple swatches of black paint to find the one that retained the least polarization. I had never been a part of such detailed camera tests before.

What were some initial directions that you got from DP Russell Carpenter and director Cameron? What did they say about how they wanted the look to feel?
Jim impressed on me the importance of everything feeling “real.” The first film was photographic and evoked reality, but this had to truly embody it photorealistically.

Was there a look book? How do you prefer a director or DP to share their looks for films?
They didn’t share a look book with me on this one. By the time I came onboard (October 2021), WetaFX was deep into their work. For any given scene, there is usually a key shot that really shines and perfectly embodies the look and intention Jim’s going for, and that often served as my reference. I needed to give everything else that extra little push to elevate to that level.

Did they want to replicate the original or make it slightly different? The first one takes place mostly in the rain forest, but this one is mostly in water. Any particular challenges that went along with this?
Now that the technology has progressed to a point where physically based lighting, subsurface scattering and realistic hair and water simulations are possible on a scale as big as this movie, the attention to photorealism is even more precise. We worked a lot on selling the underwater scenes in color grading. It’s important that the water feel like a realistic volume.

People on Earth haven’t been to Pandora, but a lot of people have put their head underwater here at home. Even in the clearest Caribbean water, there is diffusion, scattering and spectral filtering that occur. We specifically graded deeper water bluer and milked out murkier surface conditions when it felt right to sell that this is a real, living place.

This was done using just basic grading tools, like lift and gamma, to give the water a bit of a murky wash.
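
For readers less familiar with grading math, here is a minimal sketch of the kind of slope/offset/power (CDL-style) adjustment that a lift-and-gamma wash like this implies. The function and the specific channel values are illustrative assumptions, not the grade used on the film.

```python
import numpy as np

def cdl_style_grade(rgb, slope=1.0, offset=0.0, power=1.0):
    """ASC CDL-style adjustment on scene-linear RGB:
    out = (in * slope + offset) ** power.
    Offset behaves roughly like lift, power like gamma."""
    graded = np.clip(rgb * slope + offset, 0.0, None)
    return graded ** power

# Hypothetical values: lift the blues slightly and soften contrast to
# suggest a murky, light-filtering volume of water.
pixel = np.array([0.18, 0.18, 0.18])  # scene-linear mid-gray
murky = cdl_style_grade(
    pixel,
    slope=np.array([0.92, 0.98, 1.05]),    # pull red down, push blue up
    offset=np.array([0.00, 0.005, 0.02]),  # small blue lift in the shadows
    power=np.array([1.05, 1.00, 0.95]),    # gentle per-channel gamma
)
print(murky)
```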

The film was also almost entirely visual effects. How did you work with these shots?
We had a really organized and predictable pipeline for receiving, finalizing and grading every shot in the DI. For as complex and daunting as a film like this can be, it was very homogeneous in process. It had to be, otherwise it could quickly devolve into chaos.

Every VFX shot came with embedded mattes, which was an incredible luxury that allowed me to produce lightning-fast results. I’d often combine character mattes with simple geometric windows and keys to rapidly get to a place that in pure live-action photography would have required much more detailed rotoscoping and tracking, which is only made more difficult in stereo 3D.
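
As an illustration of what a matte-limited correction looks like under the hood, here is a minimal sketch, assuming a single-channel character matte delivered with the shot. The helper name and the half-stop brighten are hypothetical stand-ins, not Trieu's actual node tree.

```python
import numpy as np

def matte_limited_grade(image, matte, grade_fn):
    """Blend a graded version of `image` back through a matte.

    `matte` is a single-channel alpha in [0, 1] (e.g., an embedded character
    matte from a VFX delivery), broadcast across RGB. Where the matte is 1
    the graded pixel wins; where it is 0 the original is untouched.
    """
    graded = grade_fn(image)
    alpha = matte[..., None]                 # (H, W) -> (H, W, 1)
    return graded * alpha + image * (1.0 - alpha)

# Hypothetical usage: brighten only the matted character by half a stop.
image = np.random.rand(1080, 1920, 3).astype(np.float32)
matte = np.zeros((1080, 1920), dtype=np.float32)
matte[400:700, 800:1200] = 1.0               # stand-in for a real character matte
result = matte_limited_grade(image, matte, lambda img: img * 2 ** 0.5)
```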

Did you create “on-set” LUTs? If so, how similar were those to the final look?
I took WetaFX’s lead on this one. They were much closer to the film early on than I was and spent years developing the pipeline for it. Their LUT was pretty simple: a matrix from S-Gamut3.Cine to something a little wider than P3 to avoid oversaturation, plus a simple S-curve.

Usually that’s all you need, and any scene-specific characteristics can be dialed in through production design, CGI lighting and shaders or grading. I prefer a simpler approach like this for most films — particularly on VFX films, rather than an involved film-emulation process that can work 90% of the time but might feel too restrictive at times.

WetaFX built the base LUT, and from there I made several trims and modifications for the various 3D light levels and Dolby Cinema grades.
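
The actual matrix and curve WetaFX built aren't published, but the general shape of a "matrix plus S-curve" display transform can be sketched as below. The coefficients and the toy tone curve are placeholders, not the production transform.

```python
import numpy as np

# Placeholder 3x3 matrix standing in for an S-Gamut3.Cine -> wide-display-gamut
# conversion; the coefficients used on the film are not the ones shown here.
GAMUT_MATRIX = np.array([
    [ 1.08, -0.05, -0.03],
    [-0.02,  1.04, -0.02],
    [ 0.00, -0.06,  1.06],
])

def s_curve(x, contrast=1.6, pivot=0.18):
    """Toy filmic-style tone curve: contrast around a mid-gray pivot,
    rolled off toward 0 and 1 (illustrative only)."""
    x = np.maximum(x, 1e-6)
    y = (x / pivot) ** contrast      # expand contrast around the pivot
    return y / (y + 1.0)             # Reinhard-style shoulder toward 1.0

def simple_display_transform(linear_rgb):
    """Matrix first, then tone curve: the overall shape of a matrix + S-curve LUT."""
    wide = linear_rgb @ GAMUT_MATRIX.T
    return s_curve(np.clip(wide, 0.0, None))

print(simple_display_transform(np.array([0.18, 0.18, 0.18])))  # mid-gray -> ~0.5
```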

Where were you based while working on the film, and what system did you use? Any tools in that system come in particularly handy on this one?
I’m normally in Los Angeles, but for this project I moved to Wellington, New Zealand for six months. Park Road Post was our home base and they were amazing hosts.

I used Blackmagic DaVinci Resolve 18 for the film. No third-party plugins, just out-of-the-box stuff. Resolve’s built-in ResolveFX tools keep getting more and more powerful, and I used them a lot on this film. Resolve’s Python API was also a big part of our pipeline; it streamlined shot ingest and added a lot of little quality-of-life improvements specific to our workflow.
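
For context, Resolve exposes its scripting API to external Python, and a shot-ingest helper tends to look something like the sketch below. The folder path, timeline name and overall logic are hypothetical stand-ins, not Park Road's actual pipeline code.

```python
import DaVinciResolveScript as dvr  # module shipped with DaVinci Resolve Studio

# Connect to the running Resolve instance and grab the current project.
resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
media_pool = project.GetMediaPool()

# Import a new batch of VFX deliveries and cut them into a fresh timeline.
# The folder path and timeline name below are illustrative only.
clips = media_pool.ImportMedia(["/mnt/vfx_deliveries/SEQ010_v0123"])
timeline = media_pool.CreateTimelineFromClips("SEQ010_ingest_v0123", clips)
print(f"Ingested {len(clips)} clips into timeline '{timeline.GetName()}'")
```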

How did your workflow differ, if at all, from a traditionally shot film?
Most 3D movies are conversions from 2D sources. In that workflow, you’re spending the majority of your time on the 2D version and then maybe a week at the end doing a trim grade for 3D.

On a natively 3D movie that is 3D in both live-action and visual effects production, the 3D is given the proper level of attention that really makes it shine. When people come out of the theater saying they loved the 3D, or that they “don’t” have a headache from the 3D and they’re surprised by that, it’s because it’s been meticulously designed for years to be that good.

In grading the film, we do it the opposite way the conversion films do. We start in 3D and are in 3D most of the time. Our primary version was Dolby Cinema 3D 14fL in 1.85:1 aspect ratio. That way we’re seeing the biggest image on the brightest screen. Our grading decisions are influenced by the 3D and done completely in that context. Then later, we’d derive 2D versions and make any trims we felt necessary.

This film can be viewed in a few different ways. How did your process work in terms of the variety of deliverables?
We started with a primary grading version, Dolby Cinema 3D 14fL. Once that was dialed in and the bulk of the creative grading work was done, I’d produce a 3.5fL version for general exhibition. That version is challenging, but incredibly important. A lot of theaters out there aren’t that bright, and we still owe those audiences an incredible experience.

As a colorist, it’s always a wonderful luxury to have brilliant dynamic range at your fingertips, but the creative constraint of 3.5fL can be pretty rewarding. It’s tough, but when you make it work it’s a bit of an accomplishment. Once I have those anchors on either end of the spectrum, I can quickly derive intermediate light levels for other formats.
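
For a sense of scale, those two anchor light levels convert to nits (cd/m²) as shown below, and they sit exactly two stops apart. The quick arithmetic just uses the standard foot-lambert conversion factor.

```python
import math

FL_TO_NITS = 3.4263  # 1 foot-lambert is approximately 3.4263 cd/m²

def fl_to_nits(fl):
    return fl * FL_TO_NITS

for label, fl in [("Dolby Cinema 3D primary", 14.0),
                  ("General-exhibition 3D", 3.5)]:
    print(f"{label}: {fl} fL ≈ {fl_to_nits(fl):.0f} nits")

# The two grading anchors are exactly two stops apart:
print(math.log2(14.0 / 3.5))  # -> 2.0
```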

The film was released in both 1.85:1 and 2.39:1, depending on each individual theater’s screen size and shape to give the most impact. On natively cinema-scope screens, we’d give them the 2.39:1 version so they would have the biggest and best image that can be projected in that theater. This meant that from acquisition through VFX and into the DI, multiple aspect ratios had to be kept in mind.
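
For reference, the standard DCI image containers for those two release aspect ratios work out as below. The interview doesn't state the film's own master resolutions, so this is just the general container math.

```python
# Standard DCI image containers for the two theatrical aspect ratios.
DCI_CONTAINERS = {
    "flat_2k":  (1998, 1080),   # 1.85:1
    "scope_2k": (2048,  858),   # ~2.39:1
    "flat_4k":  (3996, 2160),   # 1.85:1
    "scope_4k": (4096, 1716),   # ~2.39:1
}

for name, (w, h) in DCI_CONTAINERS.items():
    print(f"{name}: {w}x{h} -> {w / h:.2f}:1")
```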

Jim composed for both aspect ratios simultaneously while shooting, in the virtual cameras as well as in live action.

But there’s no one-size-fits-all way to do that, so Jim did a lot of reframing in the DI to optimize each of the two formats for both story and aesthetic composition. Once I had those two key light levels and the framing for the two aspect ratios, I built out the various permutations, ultimately resulting in 11 simultaneous theatrical picture masters that we delivered to distribution to become DCPs.
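
To make "permutations" concrete, a deliverable matrix can be enumerated as a simple product of aspect ratio and version trims, as in the toy sketch below. The trim names are hypothetical, and the real list of 11 masters was driven by distribution requirements not detailed in the interview.

```python
from itertools import product

# Hypothetical trim points and formats; illustrative only.
aspect_ratios = ["1.85", "2.39"]
versions = ["3D_14fL", "3D_3.5fL", "2D_48nit"]

masters = [f"{ar}_{v}" for ar, v in product(aspect_ratios, versions)]
print(len(masters), masters)  # 6 combinations in this toy example
```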

Finally, what was the most memorable part of working on Avatar: The Way of Water from a work perspective?
Grading the teaser trailer back in April and seeing that go live was really incredible. It was like a sleeping giant awoke and announced, “I’m back,” and everybody around the world and on the internet went nuts for it.

It was incredibly rewarding to return to LA and take friends and family to see the movie in packed theaters with excited audiences. It was an amazing way for me to celebrate after a long stint of challenging work and a return to movie theaters post-pandemic.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 25 years.