By Barry Goch
The Academy of Motion Picture Arts and Sciences has awarded Curtis Clark, ASC, the John A. Bonner Medallion “in appreciation for outstanding service and dedication in upholding the high standards of the Academy.” The presentation took place in early February, and just prior to the event I spoke with Clark, asking him to reflect on the transition from film to digital cinema and on his contributions to the industry.
Clark’s career as a cinematographer spans features, TV and commercials. He is also the chair of the ASC Motion Imaging Technology Council, which developed the ASC CDL.
Can you reflect on the changes you’ve seen over your career and how you see things moving ahead?
Once upon a time, life was an awful lot simpler. I look back nostalgically on the days when it was all film-based, when the cinematographer’s purview included following up on the look of dailies and following through with any photographic testing that helped home in on the desired look. It had its photochemical limitations; its analog image structure was not as malleable or tonally expansive as the digital canvas we have now.
Do you agree that Kodak’s Cineon helped usher in this digital revolution: the hybrid film/digital imaging system where you would shoot on film, scan it, manipulate it digitally and then go back out to film via a film recorder?
That’s where the term digital intermediate came into being, and it was an eye-opener. I think at the time not everyone fully understood the ramifications of the impact it was making. Kodak created something very potent and led the way in terms of methodologies: how to integrate digital into what was then called a hybrid imaging system, combining digital and film.
DCI (Digital Cinema Initiatives) was created to establish digital projection standards. Without a standard, we’d potentially be creating chaos in terms of how to move forward; for the studios, distributors and exhibitors, it would be a nightmare. Can you talk about that?
In 2002, I was asked to form a technology committee at the ASC to explore these issues: how emerging digital technologies were impacting the creative art form of cinematography and of filmmaking, and how we could help influence the development of these technologies so they best serve the creative intent of the filmmaker.
DCI proposed that for digital projection to be considered ready for primetime, its image quality needed to be at least as good as, if not better than, a print from the original negative. I thought this was a great commitment the studios were making. For them to say digital projection would be judged against a film print from the original camera negative of the exact same content was a fantastic decision. It promised to give digital cinema projection an advantage, since most audiences saw release prints struck from a dupe negative.
Digital cinema had just reached the threshold of being able to do 2K digital projection. At that time, 4K digital projection was emerging, but it was a bit premature to settle on that as a standard. So you had digital cinema projection and the emergence of a sophisticated digital intermediate process that could create the image quality you wanted from the original negative, now shown via digital projection.
In 2004, the Michael Mann film Collateral was shot with the Grass Valley Viper FilmStream and the Sony F900 and F950, the latest generation of digital motion picture cameras: essentially video cameras that were becoming increasingly sophisticated, with better dynamic range and tonal contrast and support for 24fps among multiple frame rates. That 24p capability was the key.
These cameras were used in a most innovative and interesting manner, because Mann combined film with digital: digital for the low-light night scenes, and film for the brighter day exteriors and day interiors where exposure was not a problem.
The night scenes posed a challenge: they wanted to shoot at such low light levels that film would potentially be degraded in grain and fog. Overrating the negative meant underexposing and overdeveloping it, which was not desirable, whereas the digital cameras thrived at lower light levels. You could also shoot at a stop that gave you better depth of field. At the time, it was a very bold decision, but looking back, I think it was the inflection point that brought the digital motion picture camera into the limelight as a possible alternative to shooting on film.
That’s when they decided to do the Camera Assessment Series tests, which evaluated all the different digital cinema cameras available at the time?
Yeah, the idea being that we’d never compare two digital cameras with each other; we’d always compare a digital camera against a film reference. That first Camera Assessment Series was the first step toward validating the digital motion picture camera as viable for shooting motion pictures compared with film, and we got part of the way there. A few of the cameras (the Sony F35, the Panavision Genesis, the Arri D21 and the Grass Valley Viper) performed quite well, but this was all still mainly within a 2K (1920×1080) realm. We had not yet broached 4K.
A couple of years later, we decided to do this again with the Image Control Assessment Series, ICAS. It was shot at Warner Bros.: scenes in a café, a daylight interior and then a nighttime exterior, both with a dramatically large range of contrast and different colors in the image. It was the big milestone. The new Arri Alexa was used, along with the Sony F65 and the then-latest versions of the Red cameras.
So we had 4K projection and 4K cameras, and we introduced the use of ACES (Academy Color Encoding System) color management. We were really at the point where all the key components we needed were beginning to come together; this was the first instance where these digital workflow components were all used in a single significant test project. We used film as our common benchmark reference: how did these cameras compare with film? That was the key thing. In other words, could we consider them ready for prime time? The answer was yes. We did that project in conjunction with the PGA and Revelations Entertainment, Morgan Freeman’s company. Lori McCreary, his partner, was one of the producers who worked with us on it.
So filmmakers started using digital motion picture cameras instead of film. With digital cinema having replaced film print as a distribution medium, these new-generation digital cameras began to replace film as an image capture medium. The question then was whether we would have an end-to-end digital system that was viable as an alternative to shooting on film.
Part of the reason you are receiving this acknowledgement from the Academy is your dedication to the highest image quality and your respect for the artistry, from capture through delivery. Can you talk about your role in look management from on-set through delivery?
I think we all need to be on the same page; it’s one production team whose objective is maintaining the original creative intent of the filmmakers: the director and cinematographer, working with an editor and a production designer. Making a film is a collective team effort, but the overall vision is typically established by the director in collaboration with the cinematographer and production designer. The cinematographer is tasked with capturing that vision through lighting, camera composition, movement and lens choices, all the elements that are part of creative filmmaking. Today you shoot with extremely sophisticated cameras, like the Sony F65 or Venice, the Panavision Millennium DXL, an Arri or the latest versions of the Red camera, all of which can reproduce high dynamic range, wide color gamut and high resolution. All that raw image data is inherently there, and the creative canvas has certainly been expanded.
So if you’re using these creative tools to tell your story, to advance your narrative, then you’re doing it with imagery defined by the potential of what these technologies can do. In the modern era, people aren’t seeing dailies at the same time or together under controlled circumstances; the viewing process has become fragmented. When everyone had to come together to view projected dailies, there was a camaraderie, a spirit of constructive contribution, that made the filmmaking process more effective. If something wasn’t what it should be, everyone could see exactly what it was and correct it if needed.
Now we have a more dispersed production team at every stage of the process, from initial image capture through dailies, editorial, visual effects and final color grading. We have so many people in disparate locations working on a production who don’t always seem as unified as we were when it was all film-based analog shooting. On the other hand, it’s now far easier to integrate visual effects into your workflow. As Cineon showed when it first emerged, you could do digital effects as opposed to optical effects, and that was a big deal.
Coming back to the current situation, particularly with the most advanced forms of imaging, which include high dynamic range and color gamuts wider than even P3 or Rec. 2020, it’s essential to have a color management system like ACES, which actually has enough color gamut to contain any color space that you capture and want to manipulate.
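To make that idea of “containing” a color space concrete, here is a minimal sketch, assuming Python with numpy, that builds the Rec. 709 and ACES AP0 matrices from their published chromaticities and maps one gamut into the other through CIE XYZ. It deliberately skips the chromatic adaptation between D65 and the ACES white point that a production pipeline would include, so the resulting matrix differs slightly from the official transforms:

```python
# Sketch: converting Rec. 709 RGB into ACES AP0 via CIE XYZ.
# Chromaticities are the published Rec. 709 and ACES AP0 values;
# chromatic adaptation between white points is omitted for brevity.
import numpy as np

def rgb_to_xyz_matrix(prims, white):
    """Build an RGB->XYZ matrix from xy chromaticities of primaries and white."""
    def xyz(x, y):
        return np.array([x / y, 1.0, (1.0 - x - y) / y])
    m = np.stack([xyz(*p) for p in prims], axis=-1)  # columns = R, G, B
    s = np.linalg.solve(m, xyz(*white))              # scale so white maps to white
    return m * s

REC709 = rgb_to_xyz_matrix([(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
                           (0.3127, 0.3290))    # D65 white point
AP0 = rgb_to_xyz_matrix([(0.7347, 0.2653), (0.0000, 1.0000), (0.0001, -0.0770)],
                        (0.32168, 0.33767))     # ACES white point

rec709_to_aces = np.linalg.inv(AP0) @ REC709

# A pure Rec. 709 red lands at all-positive AP0 values, i.e., comfortably
# inside the ACES gamut. That is what "containing" Rec. 709 means.
print(rec709_to_aces @ np.array([1.0, 0.0, 0.0]))
```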
Can you talk about the challenges you overcame, and how that fits into the history of cinema as it relates to the Academy recognition you received?
As a cinematographer working on feature films or commercials, I kept thinking: if I’m fortunate enough to manage the dailies, and certainly the final color grading, there are these tools called lift, gain and gamma that are common to all the different color correctors. But they’re all implemented differently. They’re not cross-platform compatible, so the numbers from a lift/gain/gamma grade (the primary RGB grading) on one color corrector will not translate automatically to another. So I thought, we should have a cross-platform version of that, because it’s usually the first step in grading.
That’s about as basic as you can get, and it was designed as a cross-platform implementation: whoever installs and applies the ASC CDL in a compatible color grading system, whether on DaVinci, Baselight, Lustre or whatever else, gets results that are the same and transferable.
You could transport those numbers from an on-set setup using a dailies creation tool, like ColorFront for example. During the shoot, not while you’re actually rolling but working with the DIT, you could use the ASC CDL to establish a chosen look that could then be applied to dailies and used for VFX.
Then, when you make your way into the final color grading session with the final cut, or whenever you start master color grading from the original camera source, you have those initial grading corrections as starting-point references. From there you can continue the color grading process using all the sophistication of a full color corrector, whether that’s power windows or secondary color correction, whatever you feel you need to finalize the look.
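For the technically inclined, the math behind the ASC CDL is compact enough to show in full. This is a minimal Python sketch, assuming numpy, of the published slope/offset/power formula plus the saturation control; the grade values in the example are purely illustrative:

```python
# A minimal sketch of the ASC CDL per-channel transfer function.
import numpy as np

def apply_asc_cdl(rgb, slope, offset, power, saturation=1.0):
    """out = clamp(in * slope + offset) ** power, then a global
    saturation adjustment using Rec. 709 luma weights."""
    rgb = np.asarray(rgb, dtype=np.float64)
    out = rgb * slope + offset
    out = np.clip(out, 0.0, 1.0) ** power  # clamp before the power function
    # Saturation: blend each channel toward its Rec. 709 weighted luma
    luma = out @ np.array([0.2126, 0.7152, 0.0722])
    out = luma[..., None] + saturation * (out - luma[..., None])
    return np.clip(out, 0.0, 1.0)

# Example: a mild warm grade with slightly lowered saturation
pixel = np.array([[0.18, 0.18, 0.18]])  # 18% gray
graded = apply_asc_cdl(pixel,
                       slope=np.array([1.05, 1.00, 0.95]),
                       offset=np.array([0.01, 0.00, -0.01]),
                       power=np.array([1.00, 1.00, 1.02]),
                       saturation=0.9)
print(graded)
```

Because the transform is just ten numbers (three each for slope, offset and power, plus one for saturation), it travels easily as metadata between on-set tools, dailies systems and finishing color correctors, which is exactly the portability Clark describes.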
I was advocating this in the ASC Technology Committee, as it was then called (it has since been renamed the Motion Imaging Technology Council, or MITC). We needed a solution like this, and a group of us got together and decided we would do it. There were plenty of skeptics: “Why would you do something like that when we already have lift, gain and gamma? Why would any of the manufacturers of the different color grading systems integrate this into their systems? Wouldn’t it impinge on their competitive advantage? If their own lift/gain/gamma worked perfectly well and people were used to it, why would they want the ASC CDL?”
We live in a much more fragmented post world, and I saw that becoming even more so with the advances of digital. The ASC CDL would be a “look unifier” that would establish initial look parameters. You would be able to have control over the look every step of the way.
I’m assuming the cinematographer would work with the director and editor to assess changes that probably should be made, because we’re now looking at cut sequences; what we thought was most appropriate while shooting may, in the context of an edit, need some changes and adjustments.
Were you involved in ACES? Was the impetus behind it similar? Or was it spawned because visual effects movies became so big and important with the advent of digital filmmaking?
It was a bit of both, and it included productions without VFX. Initially it was driven by the fact that there really should be a standardized color management system. Let me give you an example of what I’m talking about: when we were all photochemical and basically shooting with Kodak stock, we were working with film-based Kodak color science.
It’s a color science that everybody knew and understood; even if they didn’t understand it from an engineering, photochemical point of view, they understood its effects. It’s what helped enable the look and the images we wanted to create.
That was a color management system built into film. That color science could have been adapted to the digital world, but Kodak resisted because of the threat to its negative business: if you apply film color science to digital cameras, you make digital cameras look more like film, and that could undercut sales of color negative stock.
That’s really where ACES was born: to create a universal, unified color management system that would be appropriate anywhere you shot, with the widest possible color gamut. It supports any camera or display technology because it always has expanded (future-proofing) headroom within which digital camera and display technologies can work effectively and efficiently, but also accurately, reliably and predictably.
Very early on, my ASC Technology Committee (now the Motion Imaging Technology Council) got involved with ACES development and became very excited about it. It was the missing ingredient needed to make the end-to-end digital workflow the success we thought it could become. Because we could no longer rely on film-based color science, we had to replicate or emulate it with a color management system that could accommodate everything we wanted to do creatively. ACES became that color management system.
So, in addition to being the first cross-platform primary color grading tool, the ASC CDL became the first official ACES look modification transform. ACES is not a color grading tool; it’s a color management system, so you still need color grading tools within that managed pipeline. You have the color management with ACES and the color grading with the ASC CDL, and the combination of the two is the look management system; it takes both to make it work. It’s not that the ASC CDL is the only tool you use for color grading, but it has the portable, cross-platform ability to control the color grading from dailies through visual effects up to the final grade, when you’re working with a sophisticated color corrector.
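As a rough illustration of how those pieces fit together, the sketch below shows the order of operations when the ASC CDL acts as the look step inside an ACES-managed pipeline. The function bodies are hypothetical placeholders, not real ACES transforms, and real pipelines typically apply the CDL in a log working space such as ACEScc rather than directly on scene-linear values:

```python
# Conceptual sketch (not a real ACES implementation): the order of
# operations with the ASC CDL as the look step inside ACES.
import numpy as np

def input_device_transform(camera_rgb):
    # Stand-in for the camera vendor's IDT: camera data -> ACES2065-1.
    # A real IDT is a camera-specific linearization plus matrix.
    return camera_rgb  # identity placeholder

def asc_cdl_look(aces_rgb, slope, offset, power):
    # The ASC CDL math, applied here as the look modification step.
    return np.clip(aces_rgb * slope + offset, 0.0, None) ** power

def output_transform(aces_rgb):
    # Stand-in for the ACES output rendering (RRT + ODT) that maps
    # scene-linear ACES values to a specific display, e.g. Rec. 709.
    return np.clip(aces_rgb, 0.0, 1.0)  # placeholder tone mapping

camera_pixels = np.array([[0.18, 0.18, 0.18]])
aces = input_device_transform(camera_pixels)      # into the managed space
looked = asc_cdl_look(aces, slope=1.1, offset=0.0, power=1.0)  # the "look"
display = output_transform(looked)                # out to the display
print(display)
```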
What do you see for the future of cinematography and the merging of the worlds of post and on-set work? And what do you see as the future challenges of integrating metadata while maintaining creative intent?
We’re very involved in metadata at the moment. Metadata is a crucial part of making all this work, as you well know. In fact, we worked with the Academy on the common 3D LUT format, something that would again have cross-platform consistency and predictability. Its functionality and scope of use would be better understood if everyone were using it. It’s a work in progress. Metadata is critical.
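For context on what such a format standardizes, here is a minimal Python sketch, assuming numpy, of the core operation every 3D LUT format ultimately describes: trilinear interpolation of the eight lattice points surrounding an input color. A real interchange format layers metadata, shaper curves and range handling on top of this lookup:

```python
# A minimal sketch of applying a 3D LUT via trilinear interpolation.
import numpy as np

def apply_3d_lut(rgb, lut):
    """rgb: (..., 3) values in [0, 1]; lut: (N, N, N, 3) lattice."""
    n = lut.shape[0]
    scaled = np.clip(rgb, 0.0, 1.0) * (n - 1)
    lo = np.minimum(scaled.astype(int), n - 2)  # lower lattice corner
    frac = scaled - lo                          # position within the cell
    out = np.zeros_like(scaled)
    # Blend the 8 surrounding lattice entries by their trilinear weights
    for corner in range(8):
        offs = np.array([(corner >> 2) & 1, (corner >> 1) & 1, corner & 1])
        idx = lo + offs
        w = np.prod(np.where(offs == 1, frac, 1.0 - frac), axis=-1)
        out += w[..., None] * lut[idx[..., 0], idx[..., 1], idx[..., 2]]
    return out

# Sanity check: an identity LUT returns the input unchanged
N = 17
grid = np.linspace(0.0, 1.0, N)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
identity = np.stack([r, g, b], axis=-1)
print(apply_3d_lut(np.array([0.25, 0.5, 0.75]), identity))  # ~[0.25 0.5 0.75]
```

The reason a common format matters is visible in the code: the lookup itself is simple, but lattice size, input scaling and interpolation conventions all have to match on both ends, or the same LUT file produces different images on different systems.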
I think as we expand the canvas and the palette of image making, you have to understand what these technologies are capable of doing so that you can incorporate them into your vision. If your creative vision includes doing certain things, then you have to understand their potential to support that vision. A very good example in the current climate is HDR.
That’s very controversial in a lot of ways, because the TV set manufacturers would really love to have everything just jump off the screen to make it vibrant and exciting. However, from a storytelling point of view, it may not be appropriate to push HDR imagery to where it distracts from the story.
Well, it depends on how it’s done and how you are able to use that extended dynamic range when you have bright highlights. You can use foreground/background relationships with greater depth of field to tremendous effect. Such images have a visceral presence, a dimensionality, when, for example, you see bright detail outside a window.
When you have an extended range of scene tones that adds dimensional depth to the image, you can choreograph and stage the blocking of your narrative storytelling with images that take advantage of those possibilities.
So HDR needs to be thought of as something integral to your storytelling, not just something that’s there because you can do it; that’s when it becomes a distraction. When you’re on set, you need a reference monitor able to show and convey all the different tonal and color elements you’re working with to create your look, from HDR to wider color gamut, whatever that may be, so that you feel confident you’ve made the correct creative decision.
With virtual production techniques, you can incorporate some of that kind of compositing into your live-action shooting on set, much as James Cameron did with Avatar. If you want to do it with HDR, you can. The sky is the limit in terms of what you can do with today’s technology.
So these things are there, but you need to pull them together into your production workflow so you can integrate them comfortably, in the appropriate way, at the appropriate time, and in conformance with the creative vision for the final result. Then remarkable things can happen. The aesthetic poetry of the image can visually drive the narrative; you can say things with these images without having to be expositional in your dialogue, making the story a more experientially immersive involvement. I think that’s where we’re headed, and it’s going to make narrative storytelling very interesting and much more dynamic.
Certainly, especially with the advancements in consumer technology: better panels, high dynamic range, and Dolby Vision and Atmos audio coming into the home. It’s really an amazing time to be involved in the industry; it’s so fun and challenging.
It’s a very interesting time, and a learning curve needs to happen. That’s what has driven me from the very beginning, and it’s why I think our ASC Motion Imaging Technology Council has been so successful in its 16 years of continuous operation, influencing the development of some of these technologies in very meaningful ways, but always with the intent that these new imaging technologies better serve the creative intent of the filmmaker. The technology serves the art. It’s not about the technology per se; it’s about the technology as the enabling component of the art. It enables the art to happen, and it expands its scope and possibility to broader canvases, with wider color gamuts, in ways that have never been experienced or possible before.
Barry Goch is a Finishing Artist at The Foundation and a Post Production Instructor at UCLA Extension. You can follow him on Twitter at @gochya.