
Category Archives: DIT


FutureWorks Adds Look Studio Color-Testing Hub

Indian post production and rental studio FutureWorks Media has launched Look Studio. Housed within FutureWorks’ color facility, Look Studio is a creative hub where solutions are tested, integrated and perfected before being deployed on location.

With over a decade of development in imaging, FutureWorks supports cinematographers and directors in realizing their visions through the company’s color and post services. The company united its color services and camera rental division to streamline the delivery of color-balanced dailies to editorial through its DIT services, including simultaneous on-set monitoring of HDR and SDR.

Look Studio builds on this effort, further evolving FutureWorks into a full-service imaging studio, where color, cameras, lenses and VFX play mutually pivotal roles.

The studio features a 35-foot by 20-foot overall production space and a 14-foot-high, full-ceiling grid. Using live grading carts and data management setups connected to the facility’s main color-grading suites, cinematographers can test cameras and lenses and build looks and LUTs alongside FutureWorks colorists. They can also review their captured tests on DCI projection and HDR and SDR broadcast monitors right inside the Baselight color grading suites, which are also equipped with Colorfront Transkoder. This facility ensures thorough testing of the entire color workflow, covering capture, data, conform, grade, mastering and QC.

Filmmakers have access to Foundry’s Nuke and Katana; Autodesk’s Maya and Arnold; and SideFX Houdini. They can also test just about all the cameras in FutureWorks’ rental catalog, including ARRI Alexa 35, ARRI Alexa SXT-W, ARRI Alexa Mini, ARRI Alexa LF, ARRI Alexa Mini LF, Red Helium 8K, Red V-Raptor 8K VV, Red V-Raptor XL 8K VV, Red Monstro 8K VV, Sony Venice 6K (with high-speed option) and the Sony Venice Rialto extension system.

“Our commitment to investing in the training and deployment of colorists alongside cinematographers, along with trained imaging technicians on location, has yielded positive results for FutureWorks,” says co-founder Gaurav Gupta. “We’re continually pushing the boundaries of imaging, providing comprehensive solutions that seamlessly blend technical excellence with creative freedom.”

Nobe OmniScope Updated: Colorist, DIT and VP Workflows

Software company Time in Pixels has released a new version of its Nobe OmniScope, a video scope app targeting colorists and digital imaging technicians. This newly updated app and plugin is available with support for Blackmagic DaVinci Resolve, Adobe Premiere Pro, Avid Media Composer, Final Cut Pro and other video editing and color grading tools.

New features include a reworked color processing pipeline and an updated OpenColorIO (OCIO) integration. A new QC timeline detects signal issues, the HDR statistics MaxFALL and MaxCLL, blanking errors and HDR gamut status. New multi-input support saves time when switching between video signals and devices.
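For readers unfamiliar with those two HDR statistics, MaxCLL (maximum content light level) is the brightest single pixel across the program and MaxFALL (maximum frame-average light level) is the highest per-frame average, both expressed in nits. Below is a minimal sketch of how such values could be derived from decoded frame data, assuming the frames are already converted to absolute luminance in nits; it simplifies the formal CTA-861 definition and is not Time in Pixels’ actual implementation.

```python
import numpy as np

def hdr_light_levels(frames):
    """Compute simplified MaxCLL and MaxFALL from an iterable of frames.

    Each frame is assumed to be a NumPy array of per-pixel luminance
    values already expressed in absolute nits (cd/m^2).
    """
    max_cll = 0.0   # brightest single pixel seen so far
    max_fall = 0.0  # highest frame-average light level seen so far
    for frame in frames:
        max_cll = max(max_cll, float(frame.max()))
        max_fall = max(max_fall, float(frame.mean()))
    return max_cll, max_fall

# Example: three synthetic 1080p frames with luminance up to 1,000 nits
frames = (np.random.uniform(0, 1000, (1080, 1920)) for _ in range(3))
print(hdr_light_levels(frames))
```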

Additionally, OmniScope can now be integrated with Unreal Engine and Unity for virtual production monitoring via GPU memory sharing (with Syphon and Spout support).

Nobe OmniScope is available now on macOS and Windows.

Snapshot of updates to October 2023 release: 

  • Enhanced OpenColorIO color management
  • Advanced HDR quality control and new QC timeline
  • Streamlined multi-input support
  • New virtual production workflows for Unreal Engine and Unity
  • Performance boosts and macOS optimization
  • 14 highly customizable scope types and custom workspaces, including HSL vectorscope, false color, 3D color cube, histograms and more
  • HDR support and 12-bit signal processing
  • Quality control with blanking detection and error logging
  • Multi-source signal inputs including NDI, Stream Deck and DeckLink
  • Stand-alone application includes optional plugin for Resolve, Premiere and After Effects

The new update is free for OmniScope owners on an active upgrade and support plan. The Photo version costs $99, the Video version costs $355 and the Pro version costs $399.

Time in Pixels, which was founded by lead developer Tom Huczek, is offering a free two-week trial.

OmniScope updates are based on user requests. For example, DI colorist Tashi Trieu used OmniScope on Avatar: The Way of Water. “Tom is so responsive to my requests that none of them linger very long. Stuff that I’d normally expect to wait a lengthy development cycle for sometimes comes within days of my request.”

 

 


FutureWorks Uses Next-Gen Color Workflow on Netflix’s Scoop

Mumbai, India’s FutureWorks created a new pipeline for Netflix India courtroom drama Scoop, taking imaging data all the way from the set to the edit suite. In addition to creating a workflow that boosts efficiency while ensuring quality visuals, FutureWorks covered the entire picture post process and rental services on Scoop, including dailies, online, grade, VFX and finish.

Produced by Matchbox Shots, the Hindi-language series was directed by Hansal Mehta, with Pratham Mehta serving as director of photography and Michele Ricossa as lead colorist. The show follows a prominent crime journalist implicated in the murder of a rival reporter.

FutureWorks began to develop this workflow following its live color grading work on the 2022 film Jersey. Based on the challenges experienced on Jersey and taking advantage of the lull in productions during the pandemic, the studio started to evolve its DIT process to enable the team to work more efficiently. This is particularly important on-set. The aim was to empower the colorist to work with the DP and director while they’re still on set, so that any issues could be flagged before reaching the edit suite.

“We needed a process that would support everybody,” says Rahul Purav, head of color at FutureWorks. “So, we started to think about extending the imaging process beyond just color. We focused on creating an on-set monitoring process, as well as QC.”

“With Rahul, we managed the workflow of the DIT setup prior to the shooting,” explains Ricossa. “I was on set the first two or three days of the shoot to check with the DP and the DIT team on how to work on the dailies and if the look was working as intended. After that, I had a short session at FutureWorks to review some of the footage on the HDR setup. A few weeks later I went on-set one more time to a different location to check if everything was holding up properly look-wise. Then, I reviewed the dailies on the private cloud streaming service, giving minor notes to the DP and to the DIT team. Everything went smoothly.”

Ricossa graded Scoop using FilmLight’s T-Log/E-Gamut color space. Only the VFX shots were converted to ACES to facilitate the VFX workflow. “The look of the show, and the grade of individual clips later on, was shared with the VFX team thanks to Baselight’s BLG system. When possible, we had a few back-and-forths between the DI and the VFX to fix issues and get the best out of it,” explains Ricossa. “The most challenging part of the show was to match some of the stock footage. Baselight’s tools and color management system helped a lot to achieve the grade I had in mind.”

In FutureWorks’ workflow, everything is done remotely. During shooting, the systems record and monitor video signals wirelessly. This means that there’s no distraction or interruption for the cinematographer, but when required, they can talk to FutureWorks’ on-set DIT who has a studio-grade monitor under their control. This helps the directorial team to verify that everything they’re shooting is correct, while they’re still on-set. This includes color, but also extends to other areas like lenses, focus, and exposure levels.

Transcoding is monitored throughout the process to highlight any areas of concern. If there are issues, such as reflections or unwanted props in the shot, these can be dealt with at the time of shooting or flagged for fixing by the VFX team. Everything is captured as a movie file with embedded metadata so that all of the data from the shoot ends up with editorial. “It’s like there’s a third eye watching you and helping you while you’re shooting and editing,” says Purav. “As a colorist, I think it’s imperative that everybody in the chain is aware of what’s happening on the shoot, right from the beginning to the very end. This makes communication much more efficient, as notes from the cinematographer can be embedded into the metadata of the particular shot, which is very helpful later on in the process.”

This new process was absolutely key for the shoot on Scoop, which lasted for 100 days. A team of four people tested the system first, with FutureWorks having since streamlined the crew to three — one experienced DIT technician for on-set QC and another two for data management. All team members are very experienced and have trained for a long time so that they can integrate with each other on the shoot, ensuring that all the necessary data is captured and transcoded.

“When you take that experience on location, it’s an asset to the cinematographer, the director, and the production as a whole,” explains Purav. “Throughout the shoot on Scoop, the director and cinematographer continually came over to verify shots on the imaging cart, demonstrating that our new pipeline is already proving to be useful for the directorial team.”

The revamped pipeline — which had to meet the specifications required by Netflix productions — includes Livegrade Studio, Codex and Silverstack for transcoding, and FilmLight Daylight for rendering dailies. One of the key challenges in implementing the new workflow was understanding the protocols of each camera. If certain protocols didn’t work with the new system, the team had to find different ways to sync the data. FutureWorks also collaborated closely with manufacturers and vendors, including Sony and Codex, to troubleshoot any problems.

“While we had a few teething problems initially, we were able to work them out within a couple of days, and it was smooth sailing from that point on,” says Purav.

Since its successful debut on Scoop, FutureWorks has rolled out the new imaging process on several other projects.

 


DP Chat: Firefly Lane’s Vincent De Paula on Tackling Multiple Decades

By Randi Altman

Based on the best-selling novel by Kristin Hannah, Netflix’s Firefly Lane follows two best friends, Kate and Tully, over the course of three decades. The series is now streaming its second season.


DP Vincent De Paula, CSC — who has extensive feature and television credits — has been on the show since its inception, working with showrunner Maggie Friedman to get the right look for the many time frames the show depicts.

You were the sole DP on Season 1 and Season 2. How early did you get involved, and how did that help?
I met with showrunner Maggie Friedman early on, when there was just one pilot script. We had a great meeting. We clicked right away while talking about the look and style I had in mind for the show.

I thought it was a fascinating story about friendship with American culture and history as our canvas. We could cover many topics as the background of our story and emphasize how things have changed for women regarding equality and rights from the ‘70s to today.

I also connected with this story a lot; I need to have a connection with the stories I am working on. I remember growing up in Spain with my best friend and how everything back then was about creating adventures, exploring life, dreaming about the future. All these memories and experiences were a key factor in how I saw this story from a teenager’s perspective.

When I was hired for the job, some of the real locations had already been chosen, so sadly, I didn’t have much input on those, and some have proven to be quite challenging logistically. But I had enough time to develop the look and style I had in mind.

What were the challenges (or benefits) of being the only DP?
Because I was the only cinematographer on the show, I didn’t really have time to prepare episodes with the upcoming directors or scout locations properly, but we tackled this show as a long feature film, with a specific look that would change between all the different timelines. And having just the one voice behind the camera allows for a very unified and consistent flow throughout the episodes.

Did anything change significantly from S1 to S2?
I decided to change lenses for Season 2. We had Cooke S4s in Season 1, and we moved to Panavision Panaspeeds this season. These weren’t available for us when we started filming last season. I used Panaspeeds while on the TV series Maid, and they have become one of my favorite lenses. I have used Panavision Primos extensively in my career when shooting with spherical lenses, but they are very popular and weren’t available for us last season.


The Panaspeed spherical primes are a high-speed, large-format companion to 35mm-format spherical Primo optics.

We had created a style and look in Season 1 through lighting, framing and camera movement that we carried on this season. Season 2 has some very strong dramatic moments, and we introduced new plots that required their own style of shooting. We also briefly introduced the 1990s as another timeline on the show that had its own style.

One of the main differences is that we built more sets this season instead of relying so much on location shooting, which we did in Season 1. Our 1970s interiors, the 1980s apartment in Seattle, and the 1980s news station were built on stages in Vancouver, BC.

When it comes to period stories, smoke/haze also plays a part, especially in the 1970s and 1980s. Unfortunately, COVID and other factors prohibited us from using as much haze or smoke as we wanted.

Can you give more detail about the looks you established for each time period?
I wanted the different decades to have distinctive looks, although we did not want the different periods to be too radically different. Of course, when filming a period drama, everyone interprets how these different decades should look based on history, culture, films, photographs and experiences. But I wanted to approach these different looks from an emotional and character perspective rather than just a period-accurate perspective. Transitions also play a huge part in our visual vocabulary, especially when transitioning between different periods, so we are always trying to find interesting ways to create these.

The core of our main story lives in the 1970s, 1980s and early 2000s.

The 1970s has the warmest look in the whole series. It is our happy and warm period. This is a time when our girls get to know each other and explore youth together. In the ‘70s, yellows and greens are very prominent, with milky blacks suggesting a pastel feel.

For the characters, it should be about exploration, hope, adventure, youth, friendship and learning, creating an environment that should generally feel safe and warm. It should be the time that the girls would always look back to, their special moment, dreaming about an amazing life ahead of them, before they would grow to experience the reality of life.

To help achieve this overall tone for the period, I had stockings in the lenses and an 81EF filter at all times. There was almost always a hard and warm light coming in through the windows. As both characters have very different personalities, I also wanted a different approach for our camera movement and framing for this period. I introduced a more dynamic feeling to young Tully’s character (played by Ali Skovbye), contrasting with a more still and isolated feeling to that of young Kate (played by Roan Curtis). It was more obvious earlier in Season 1, and as her relationship with Tully matures, they will share the frame more.

What about the ‘80s?
The 1980s have a deeper contrast with a more saturated palette since the ‘80s had more vivid colors and a particular look when it comes to clothing and hairstyle. Therefore, I introduced a different filtration for the 1980s using Schneider Classic Soft filters of different strengths.

At this point, our characters are experiencing the real world, first jobs, relationships, etc. Everyone at this age has a higher energy that should also be part of this style, so the camera movement can get even more dynamic. Here we are not so much observers of two girls growing up together as we are participants, so I feel we have now moved in closer to our characters. The use of wider focal lengths closer to our subjects helped achieve that feeling. We want to feel like we are there with them, helping them transition into adulthood and the real world.

Instead of casting different actors for this period, like in the 1970s, Katherine Heigl (Tully) and Sarah Chalke (Kate) played themselves in the 1980s too, so we were doing de-aging in post production to help sell their younger selves.

What about the 2000s?
We treated the 2000s as our “present” period. In Season 1, we showed how Tully had had a successful career, contrasting with Kate, who is struggling career-wise but who managed to start a family. Framing for this period is more dramatic, and some scenes feel like the framing is calling for a more short-sighted composition. Until now, we have seen our girls growing and becoming women, and we have witnessed the development of their strong relationships. But now, in this period, we see more of the ups and downs of two mature women dealing with the routines of everyday life.

Overall, it feels more current, and the camera movement is looser for this period. I had a subtler filtration for this period, with the use of light Black Satin filters (or none at all, at times) and softer lighting coming through windows. The images have a more desaturated palette overall.

What direction did the showrunner give you about the look she wanted this season (and last)?
Our showrunner, Maggie Friedman, is not only a great leader in our show, but the writing she brought to all the scripts was just so good that it was amazing to be able to translate those words into visuals. We had a great collaboration together that I hope will carry on in the future. When I am presented with such quality scripts, it makes my job so much easier, and it allows me to dream bigger when prepping the episodes.

Did you work with a look book?
I always work with a look book. Last season I was gathering references from photography and other shows as a way to communicate our visual language to the directors and crew. I look at photography a lot for references and inspiration. Saul Leiter, Stephen Shore, André Kertész, Alex Webb and Todd Hido, among many others, are always present in my visual language as inspiration.

This season I used images from Season 1 to create a visual lookbook for Season 2 and a bunch of references for some new periods we were about to cover.

How did you work with the directors and colorist to achieve the intended look?
All the directors that came in this season were also fans of the show, and they knew it really well. We established a specific look in Season 1 that we continued this season, so everyone coming in was familiar with it and knew the look we were trying to achieve. I also shared my look book with everyone, and it was a pretty flawless process overall.

Company 3 has been taking care of our dailies and color timing for the whole show since last season. Claudio Sepulveda was our colorist, and Chad Band was our dailies colorist.

Prior to Season 1 of Firefly Lane, I shot the feature film 2 Hearts, where I also had the same team doing the color correction for me. So I knew the team very well, and it was a great collaboration again.

Were there on-set LUTs? DIT?
I always use just one LUT on every project, and I light for that LUT. Every now and then, we would make some subtle CDL adjustments that would go straight to our dailies colorist at Company 3. But I always try to get the look in-camera as close to delivery as I can.
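For context, the CDL adjustments he mentions are standard ASC CDL primary corrections: per-channel slope, offset and power, plus an overall saturation. A minimal sketch of that math follows; it is a generic illustration of the ASC CDL formula, not the show’s actual pipeline or tooling.

```python
import numpy as np

# Rec. 709 luma weights used by the ASC CDL saturation step
LUMA = np.array([0.2126, 0.7152, 0.0722])

def apply_cdl(rgb, slope, offset, power, saturation=1.0):
    """Apply an ASC CDL correction to an RGB image with values in [0, 1]."""
    rgb = np.asarray(rgb, dtype=np.float64)
    out = np.clip(rgb * slope + offset, 0.0, None) ** power  # slope/offset/power per channel
    luma = (out * LUMA).sum(axis=-1, keepdims=True)          # then the saturation step
    return luma + saturation * (out - luma)

# Example: a subtle warm-up and saturation bump on a single gray pixel
print(apply_cdl([[0.5, 0.5, 0.5]],
                slope=[1.05, 1.0, 0.95],
                offset=[0.0, 0.0, 0.0],
                power=[1.0, 1.0, 1.0],
                saturation=1.1))
```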

Brian Scholz was our DIT on Season 1, and he came back for the remainder of the series. He knew the show so well, and it was an amazing collaboration once again.

Where was it shot, and how long was the shoot?
The series was shot in Vancouver, BC. We only filmed 10 episodes in Season 1. For this season we had 16 episodes to film over nine months, plus about four to five weeks of preproduction. Netflix is splitting Season 2 into two parts. Part 1 released on December 2, and part 2 will be released sometime in 2023.

How did you go about choosing the right camera?
Last season, Netflix’s mandate to originate in 4K ruled out my camera of choice, which is an ARRI Alexa, so I tested the Panavision DXL2 and the Sony Venice. I already knew what I could get out of the Venice, but I was pleasantly surprised by the images coming out of the DXL2. I also love its ergonomics and especially the viewfinder.

I also introduced the idea of filming the series for a 2×1 aspect ratio in Season 1, as it would fit these two characters’ stories, allowing us to frame them together and have them share the screen more often than not.

Can you talk about using lighting and framing to emphasize the emotional weight of the scenes?
As I described earlier, every period has its own approach toward lighting and framing, though I like to play this in a pretty subtle way between all the timelines. But the camera work is definitely more dynamic in the 1970s and especially in the 1980s. We used the Steadicam for those eras more often than any other periods.

The 2000s timeline feels a bit more static, though the camera itself is somewhat looser. The framing is also less “centered” than in other periods.


Lighting-wise, I used harder and warmer lighting for the 1970s to evoke emotions from that time, when our girls are still teenagers. I gradually change to a softer approach for the ‘80s and a cooler, more neutral tone for the 2000s.

There are times in the 2000s when we wanted to isolate some characters due to the emotional scenes they were playing. I tend to short-sight the compositions and use wider lenses that allow us to identify with the environment that surrounds the characters.

Any happy accidents to talk about?
There are always happy accidents on a film set, and I am the first one who will embrace them.

I remember there was a scene we were filming in the 1970s timeline, when young Tully is visiting her mom “Cloud” in jail, and eventually they would be sitting together at a table in the middle of the room. I wasn’t planning on having a two-shot with the window in the background, but as the camera was rolling into the set, it was pointing at this window and table with the stand-ins sitting there. I noticed how powerful it could be to actually let them be in silhouette against that window, so I decided to light them that way instead. It happened only because I looked at the monitor as the camera and dolly were getting to set and were “accidentally” pointed at this table.

Any challenging scenes that you are particularly proud of or found most challenging?
Filming in Vancouver in the fall and winter has its challenges. In addition to the seasonal rain, it gets dark pretty quickly. Many times, I had to film night for day, and some of the locations were quite challenging in order to pull this off.

Earlier on in Season 2, Tully is filming a documentary in which she is trying to trace her father’s past and whereabouts. There was a scene where they all visit a restaurant with the camera crew, where they believed Tully’s father had worked in the past.

Due to scheduling reasons, we had to shoot at this location in the evening when it was already dark. There were windows all along one side of the restaurant. We had shot another scene there for the 1980s that plays in the same episode, and in that scene, I was able to feature those windows fully. But for this scene, sadly, there was no room to place any lights outside those windows, as the restaurant was over the ocean. So any lighting had to come from inside the room.

My approach was to deny seeing that part of the restaurant and place the fixtures inside the room as close as possible to those windows. In the background there was a door leading to a patio area where there would be more tables for customers, so I had a bigger light over there to recreate where the sun would be coming from. Overall, it looked really good, and to this day, no one can tell that it was actually night when we shot this.

Also, not being able to scout this real location beforehand brought more challenges because I had to come up with a very quick plan to light the space with its limitations, and I was only able to see this location on the actual shooting day.

Now more general questions….

How did you become interested in cinematography?
I was born in Galicia, in northern Spain, where the film industry is almost nonexistent. There is no film background in my family, so it wasn’t the path my parents probably expected for me. So when I mentioned my desire to be involved in the “movies,” it was pretty clear that I would have to move elsewhere.

After I moved to London, I got involved in documentaries, music videos and many commercials early in my career, and then I slowly got into more narrative work.

The rest, as they say, is history.

I was always watching films as a kid, and I remember thinking that I would always get something out of any film I would watch. Even if it wasn’t a great film, there would always be a message or a great adventure to witness. That sparked my attention, and like everyone else, I wanted to be a director, but I quickly discovered the importance of an image and all the things I could say with the use of light and composition, so I decided I wanted to be a cinematographer.

When I moved to the UK, I started working mainly on documentaries, and this taught me so much about using natural light and how to use what was available to tell a story. It allowed me to develop a naturalistic approach that I still always prioritize today.

When I started doing more narrative, commercials and music videos, I was able to apply that naturalistic approach.

I tried to enhance it to help the story in a more dramatic way, which I have since been calling a “poetic realism” approach. I knew I wanted to do this for the rest of my life, being able to paint and write with light and composition to tell a story.

Short films were my introduction to narrative. I also learned how wonderful the collaboration with the director, the production designer, the gaffer and all crew members could be.


It’s always important to be bold and push your creativity in every project you do, and I have been learning new things all the time. I was at a point where I was filming mainly on 35mm and S16mm, even though digital already had a presence. But learning to expose and work in a film environment is the best school. All the projects I did early on in my career were telling me that I had found my path.

What inspires you artistically?
I am constantly looking at photography and painting as main sources of inspiration. I think I have more than a couple of hundred books on photographers and painters. Saul Leiter, Stephen Shore, Gordon Parks, André Kertész, William Eggleston, Alex Webb, Roy DeCarava, Todd Hido and Fan Ho are some of the photographers I always reference.

I also love the masterly treatment of light by painters like Vermeer, the use of color and perspectives of de Hooch, the chiaroscuro of Caravaggio or Monet and the Impressionist style.

I always learn so much and find so much inspiration from the work of cinematographers like Conrad Hall, Gordon Willis, Sven Nykvist, Nestor Almendros, Ed Lachman, Robby Müller, Chris Doyle, Robert Richardson, Janusz Kaminski, Roger Deakins, Emmanuel Lubezki, Rodrigo Prieto, Linus Sandgren, Greig Fraser, Bradford Young and Natasha Braier, to name just a few.

And away from any visual references, I am always listening to music. I think if I wasn’t a cinematographer, I would have tried to become a musician.

Literature is also a huge influence for me, and I am pretty obsessed with the Beat Generation.

What are some of your best practices or rules you try to follow on each job?
I like “fixing things in preproduction,” and I always do a lot of research on the subject or themes I am filming. I think that one always has to have a plan. Even if it is a very small scene with very little time to prepare, you always want to have a plan to execute, or at least have an idea that usually develops into something bigger when on-set on the day.

I always have so much fun on the job, and I think the cast and crew feeds from it. I am very passionate about my job. I believe I have the best job in the world, I love what I do, and I am not shy to show that on-set.

Explain your ideal collaboration with the director or showrunner when starting a new project.
When I first read a script, I don’t want to immediately have an idea of what I want the film or series to look like. Naturally, as I read it and react emotionally to the story, I start to develop ideas in my head, but I like to come to my first meetings with showrunners and directors with a blank page that I will gradually fill with references and ideas to a look that I present to everyone involved. But I do want to hear their initial thoughts too.

Communication is key, and looking at references — discussing films, photography, painting, etc. — is part of that initial process. Even if one wants to have a very distinctive look, there is always room to look at other forms of art for inspiration.

It’s also very important to connect personally with the director I’m working with. I don’t mean we need to become best friends, but I have learned to read people quite well, and I like to know what goes on inside everyone’s head when working together on a project.

What’s your go-to gear (camera, lens, mount/accessories) – things you can’t live without?
I became a cinematographer in England, and at that time, digital was starting to be very present, but I was lucky to shoot on film early in my career. Learning to expose for film has taught me so much and has given me great confidence in my work as a cinematographer. I still love to shoot on film, and I think of it as another pencil with which to write a book.

Lately, I have been mainly shooting on ARRI Alexas, and it is my favorite sensor to shoot on. I think it is still the closest look to film to date. I love Panavision glass. I have been working with Panavision on 90% of all the projects in my career, and it is such a wonderful collaboration with them. They have always had my back, from my time in London to the US and Canada and beyond.

When it comes to anamorphic, which is really my preferred format, I love the Panavision C series and T series. I have shot with both on my last two feature films. One of them luckily had a large theatrical release worldwide where you can really appreciate the larger aspect ratio.

I genuinely think the wider screen from an anamorphic image can also be a really intimate format. You can frame two actors in a medium close-up in the same frame and let things play, and it allows the camera to move in a way that doesn’t force you into as much cutting.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 25 years. 


DP Mandy Walker on Baz Luhrmann’s Elvis: Shooting, LUTs, More

By Iain Blair

Aussie cinematographer Mandy Walker, ACS, ASC, who collaborated with Baz Luhrmann on his sprawling epic Australia, teamed up with the director once more on Elvis. An epic in its own right, Elvis conjures up the life and times — and rise and fall — of this rock ‘n’ roll icon. Starring Austin Butler as the poor white kid from Tupelo, the film is told from the point of view of Elvis’ manager, Colonel Tom Parker (Tom Hanks). And as Oscar season starts up, it’s been getting a lot of buzz. And for her part on the film, Walker has become the first woman to take home the AACTA’s Best Cinematography award for Feature Film in Australia, and she has been nominated for an Oscar in the same category.

Mandy Walker

I spoke to Walker, whose credits also include Mulan and Hidden Figures, about the challenges of shooting Elvis, the cinematography and working with the DIT, DI and VFX.

This is an epic story. How did you approach the look of the movie with all the different eras stretching from the ‘50s to the ‘70s?
We basically divided the story into two parts and used different lenses to tell the story. For the first part, when Elvis is growing up in Tupelo, I shot spherical in what we called “black-and-white” color, a desaturated look with pushed blacks. Then, once he got to Las Vegas, we used anamorphic lenses — old glass from that period, with more aberrations. We also had different LUTs for each period.

When Elvis is 10 and running to the Pentecostal tent, we shot it with the black-and-white color look. It was a very considered color palette that we’d researched from the period. Then, by the time Elvis got to Hollywood, it was more Kodachrome-looking, and I had more depth of field, more color in the lighting and more contrast. Then in Vegas, there were bright, garish colors, very ‘70s, with lots of flares.

This is your fourth collaboration with Baz. How did it work on this?
Baz is very good at explaining the story he’s making and the whole emotional journey. Then it’s a matter of me interpreting all that visually. And as he’d been working on this for 10 years, he’d done so much research, and the visuals are so important in this.

Fair to say that initially the camera seems to be constantly moving – right from the carnival Ferris wheel scene at the start?
Yes, we wanted it to fly. But later, when it all settles down and the drama gets heavy, the camera moves far more slowly so you focus on the situation. When Elvis is with his mother, it’s slower. Then later, in his Vegas hotel room when he can’t sleep, the mood is darker, and the camera reflects that.

How long was the prep?
We had a lot of prep on this movie — 16 weeks — and we went through everything meticulously. We were just about to start shooting when we had to shut down for four months when Tom Hanks got COVID, so we had even more time to do tons of testing on cameras, lenses and so on. Baz loves to test and experiment, and we also worked closely with all the other departments – not just costume and art direction, but all the VFX. Really, post is part of prep now on a film like this.

Did you do lots of shot lists and storyboards?
Yes, but not for everything. It was more about making the connections between scenes and sequences. For instance, for the bit when young Elvis runs from the gas station to the juke joint to the tent — that was all storyboarded, as it was all a build.

We also built the Beale Street set and Graceland exterior and interior, all on stages and backlots. That way, we could design all the camera moves and transitions and rehearse stuff physically on the sets before we even shot. Pretty much everything was shot on the biggest stages they had at Village Roadshow in Australia, and we also shot on three backlots for the carnival and Beale Street stuff.

Was there any talk about shooting in some of the real locations in the US?
Yes, early on, but we all soon realized we couldn’t, as it’s all changed so much now. Memphis doesn’t look anything like it used to when Elvis was there, and the same with Vegas. That’s why we had to recreate it all from scratch. There is a bit of archival footage of ‘70s Vegas in there, but that was it.

Mandy Walker on-set

How did you make all your camera and lens choices?
We decided to shoot on the ARRI Alexa 65, and Baz and I decided to go that way very early on. It’s an epic story, so why not shoot on an epic format? Then, when Baz was in LA around August 2019, we met up with [optical engineer] Dan Sasaki at Panavision and went through all these different lens iterations — some on 35mm and some on a 65mm camera — until we got to the right ones that were specially built for us.

I heard you also used a special Petzval lens?
Yes, mainly for all the flashback sequences and drug episodes. It’s based on an old projector lens from the 1800s and has a focal length of up to 160mm. Dan made anamorphic and spherical versions of it for us. It was perfect for helping to create that feeling of disorientation we wanted in those scenes because the focus is on the center of the frame and the edges are softer and fuzzier. It gives you this great vortex effect.

Did you work with a colorist in prep on any LUTs?
I did all of that with my DIT, Sam Winzar, and we began very early on in prep and testing. Baz and I would look at colors and lighting, and then we’d refine them when we got to our location or set. We put together a lot of references for the LUTs so all the transitions would be very smooth from one period to another, and we always knew where we were in time. Those LUTs translated into dailies. Sam and I would go into dailies every night, and if we had four or five cameras running, we’d tweak them a bit to make sure they were all balanced. Then Kim Bjørge, our dailies colorist, also ended up becoming our DI colorist.

Isn’t that very unusual?
Very. It was a big step up for him, but he’d been working on the film the whole time and knew it inside out. It worked out really well.

I assume there was a lot of bluescreen and set extension work, especially for the big concert scenes?
There was a lot, as we built all the stages and auditoriums for the concerts and shows. We didn’t use any real theaters, and the film’s full of big sequences, like the famous ’68 Comeback Special set piece. That was huge, as it was the high stage and backstage area and about a third of the audience. All of that was built, along with the whole studio and control room. So we used bluescreen for the rest of the audience and extending the auditorium.

It was the same for the hayride, the early concert sequence. We had about a third of the audience and built the whole stage and backstage again. We used a lot of set extensions for stuff like Beale Street. We built four blocks, but just one level. So the second story and the rest of the street were all added in post. Everything was very carefully planned out, and we did a lot of tests in prep so we all knew exactly what was in frame and what would be added later in post.

The Russwood Park concert is another good example. We shot all of that on a black stage. I put up stadium lights, and that sequence was all extended as well. All the split-screen stuff was planned too. The VFX team worked closely with us and did a great job of integrating with our in-camera work. I was quite involved in integrating all the VFX and post work with them, and we had a lot of VFX companies, like MPC and Luma, working on it. (Other VFX companies included Method, Slate, Mr. X, Rising Sun Pictures and Cumulus VFX).

We did the DI at The Post Lounge in Brisbane, and they also handled all our dailies and processing. I did all the sessions remotely since I was in LA on the Warner lot in the DI suite — I could see all the images from The Post Lounge in real time, and that’s how we did it.

We did quite a lot of work, especially adding some LiveGrain to match the older film stocks and for when we intercut with archival footage and for stuff like all the 8mm home footage sequences. VFX also added a lot of artifacts to those scenes. But I do have to say, the finished film you see is very close to how our dailies looked. It really did turn out the way we first pictured it, and I’m very proud of the way it looks.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.


Digital Orchard Promotes Jo Barker to Colorist

UK-based Digital Orchard, a capture-to-post digital company offering end-to-end workflow and color management, has grown its grading team with the addition of Jo Barker.

Having previously worked with Digital Orchard as a DIT and dailies colorist on numerous films and TV series — including Bridgerton, Suspicion, Belfast and A Discovery of Witches — Barker will develop the skills she has perfected on-set and in dailies color and transfer them to the role of full-time colorist. She will be based on-site at the company’s post facility, which is located in Chalfont St. Giles, near both Leavesden and Pinewood studios.

Having studied cinematography at the Czech film school FAMU, Barker started her career in post production scanning film for features and then creating dailies for major films and TV series. While doing this, she learned grading skills from top colorists and began grading herself. She next found her place as a DIT, combining her lab, color and camera knowledge and supporting DPs.

Through her time on-set and in post, Barker has demonstrated her adaptability and collaborative work ethic, making use of all the skills, technology and creativity of digital and film-based workflows. The transition to the role of colorist is a natural next step in her career. Barker will be working on Blackmagic Resolve.

“I’m excited to be making the move from set to being in-house at Digital Orchard. Since I joined them five years ago, they’ve been incredibly supportive in my progression to live-grade DIT, working with some brilliant DPs to help them achieve the early look of their projects,” explains Barker. “I found myself constantly wanting to take the look further than is possible with the limitations on-set, so I hope now to continue this collaboration and make new relationships in the grading suite.”

“We are delighted to welcome Jo to our post team,” says Sam Margaritis, managing director of Digital Orchard. “She has been an integral part of Digital Orchard for a long time and we are very proud of her progression from on-set DIT to dailies colorist and now to final colorist. We are excited to see what the future holds for her.”

 

 

 

 


Colorfront Express Dailies 2022: Set-to-Post With ACES AMF

Colorfront — a developer of high-performance dailies/transcoding systems for motion pictures, broadcast and commercials — will be at Cine Gear Expo 2022 showing the soon-to-be-released Colorfront Express Dailies 2022 and the Colorfront Streaming Server Mini.

Shipping this summer, Express Dailies 2022 has additional features and is supported by fast processing and collaborative technology to offer advanced adaptability for emerging set-to-post workflows. Colorfront will show Express Dailies 2022 running on the new Mac Studio M1 Ultra workstation, processing SDR/HDR footage simultaneously to Apple Pro Display XDR monitors. Thanks to engineering collaborations with Apple, Express Dailies 2022 gets extreme performance from the latest Mac Studio for a broad array of accelerated acquisition, dailies and finishing tasks.

Express Dailies 2022 also features support for the latest digital cinematography camera formats, such as ARRI Alexa 35 4.6K, with its ARRI Reveal color science, ARRI Textures and 17 stops of dynamic range, plus Red V-Raptor 8K VV, Sony Venice 2 8.6K and Blackmagic 12K RAW.

Colorfront has also upgraded the color management capabilities of Express Dailies 2022 to embrace this new era of workflow operations and provide a versatile option to help users streamline end-to-end, color-accurate workflows.

Express Dailies 2022 offers full support for ACES 1.3, the latest Academy Color Encoding System, including ACES Metadata File (AMF), which is designed for greater flexibility when implementing ACES viewing pipelines.

AMF — a sidecar XML file carrying the metadata recipe — enables the exchange of accurate, clip-level metadata between on-set, dailies and finishing processes. It contains a detailed pipeline description with all of the various input (IDT), output (ODT) and look modification (LMT) transforms that need to be applied to a clip to recreate the “creative intent,” plus the working color space and the version of ACES being used. The transport of AMFs via Express Dailies 2022 maintains consistent end-to-end color appearance and secures an unambiguous archive of the creative intent.
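As a rough illustration of what a downstream application has to recover from such a sidecar, here is a minimal Python sketch that pulls the transform references and ACES version out of an AMF-style XML file. The element names used here are simplified placeholders rather than the exact ACES AMF schema, and this is not Colorfront’s implementation.

```python
import xml.etree.ElementTree as ET

def read_amf_pipeline(path):
    """Collect transform IDs and the ACES version from an AMF-style sidecar.

    Element names (acesVersion, inputTransform, lookTransform, outputTransform)
    are illustrative placeholders, not the official AMF schema.
    """
    root = ET.parse(path).getroot()
    return {
        "acesVersion": root.findtext(".//acesVersion"),
        "input":  [e.text for e in root.iter("inputTransform")],   # IDT
        "looks":  [e.text for e in root.iter("lookTransform")],    # LMT(s)
        "output": [e.text for e in root.iter("outputTransform")],  # ODT
    }

# A clip's viewing pipeline can then be rebuilt by applying the transforms in
# order (input -> looks -> output) in the stated working color space.
```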

“ACES AMF takes set-to-post workflow to another level by removing the guesswork of applying color pipeline and color decisions across multiple applications on multiple shots from different cameras,” says Colorfront’s Bruno Munger. “Having ACES AMF within Express Dailies 2022 gives users a way to manage precise color communication between on-set and dailies through to editorial, VFX and post production mastering, and it extends the capabilities of Express Dailies 2022 to enable new and highly efficient workflows.”

Collaborative technology innovations with Pomfort and Blackmagic also further streamline end-to-end color operations. In an ongoing collaboration between Colorfront and Pomfort, the companies have implemented a unique round-trip workflow supporting Colorfront Engine in Pomfort’s Livegrade Studio on-set color grading system. The collaboration makes it possible to exchange clip-level ACES AMF metadata and color descriptions in ACES viewing pipelines.

Additionally, after several months of beta-testing, Colorfront is showcasing a new Colorfront Engine plugin for Blackmagic’s DaVinci Resolve color grading/nonlinear editing system. The new plugin supports ACES AMF workflow, as well as the application of Colorfront Engine parametric looks and transforms during post operations, ensuring perceptually matching simultaneous SDR/HDR deliverables. Lua scripting in DaVinci Resolve can be used to apply looks to timelines containing multiple shots, while “blue fringe” problems from extremely bright/saturated objects, such as LED lights, are now properly handled with Colorfront Engine’s Perceptual Processing along with the included Academy Reference Gamut Compression algorithm.
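The same idea can be sketched from Resolve’s Python scripting interface, which mirrors the Lua API mentioned above. The snippet below applies a single look LUT to every clip on the first video track of the current timeline; the LUT path is hypothetical, and this is an illustration of Resolve’s scripting capability rather than Colorfront’s plugin code.

```python
# Uses DaVinci Resolve's Python scripting API (the Lua API mirrors these calls).
import DaVinciResolveScript as dvr  # module shipped with Resolve; path setup may be required

LUT_PATH = "/path/to/show_look.cube"  # hypothetical show-look LUT

resolve = dvr.scriptapp("Resolve")
timeline = resolve.GetProjectManager().GetCurrentProject().GetCurrentTimeline()

# Apply the LUT to node 1 of every clip on video track 1.
for item in timeline.GetItemListInTrack("video", 1):
    if not item.SetLUT(1, LUT_PATH):
        print("Could not set LUT on", item.GetName())
```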

Also on show at Cine Gear Expo will be Colorfront Streaming Server Mini. This new, software-only product for Mac/PC workstations allows digital artists to securely encode and live-stream frame-/color-accurate, reference-quality, work in progress over the public internet to multiple production stakeholders, wherever they are located.

DP Eric Koretz on Shooting Ozark’s Series Finale

By Randi Altman

There are just a few shows that are instantly recognizable, and Netflix’s Ozark is one of them, with that familiar blue hue highlighting the darkness of the storyline.

Eric Koretz

Well, sadly, it was time for the story of the money-laundering power couple, Marty and Wendy Byrde, to come to an end. And to help tell that story, DP Eric Koretz was brought on to shoot Episodes 8, 9, 10 and 14 of this last season of the show. Koretz joined the crew after the first seven episodes had already been shot by DP Shawn Kim, and his job was to continue to capture the distinctive look and aesthetic of the series while also tailoring it to his own style of filmmaking.

Let’s find out more from Koretz, who submitted the series finale episode, “A Hard Way to Go,” for Emmy consideration.

What direction were you given when you joined, and how did you also make it your own?
Shawn Kim had already masterfully shot the first part of Season 4, so I had a wealth of images to study before I shot a frame. Of course, being Ozark, it’s a filmmakers’ show, so producers are rarely looking over your shoulder, telling you what to do from an image-making perspective.

They trust the directors and DPs to be creative, take risks and tell the story in the Ozark way… meaning we don’t just do traditional coverage. We know the audience is sophisticated in their film language, so shots play out through space, movement and lighting without having to be didactic. Also as a producer, Jason Bateman is very involved in the cinematography of Ozark on a day-to-day basis. Since he’s in nearly every scene, Jason was always there to help guide the story.

What was it like joining an established show in its final stages?
Ozark was already my favorite show before I started on it, so I was familiar with and loved the look and design of the show. I felt right at home the minute I started shooting. It also helps to have an incredible crew in mid-season form. They’re all incredible craftsmen, and whenever something felt off, we could always have a discussion about whether this was the Ozark language or not.

That being said, as a DP you have to take ownership of the imagery. If you worry too much about “is this Ozark enough,” you’re not taking risks. I particularly love lighting, so I tried to bring elements of my style into the Ozark world while still making it feel singular to the series: how the characters are etched out of the darkness with lighting, occasional pops of color when the story called for it, framing with negative space. As a DP, when you’re in tune with the story and lensing from the subconscious, then it will be unique and still feel like the show.

Ozark has such a distinctive look — a sort of blue. It’s rarely ever truly sunny. Can you talk about how you handled this signature look?
Shawn started the season with a DIT — through camera tests and the first couple weeks — to develop a LUT. After that there was no DIT. I wasn’t used to working that way because traditionally I’ve relied on a DIT for my other set of eyes and to massage the color. But Shawn created a great show LUT and after a week or two I got used to that workflow.

We had a joke that there is no sun in Ozark. We would always start the outdoor scenes by taking out the sun. Once you do that you can bring in big lights to control the look. The key is in the control.

What about the lighting? We interviewed DP Ben Hutchins early on in the series, and he said this: “My hope is that it never feels like there’s any kind of artificial lighting on the actors or lighting the space. It’s something that feels more organic, like sunlight or a lamp that’s on in the room, but still offers a level of being stylized and really leans into the darkness… mining the shadows for the terror that goes along with Ozark.” Does that still ring true?
Absolutely still true. Ben did an incredible job of setting up future seasons for success on Ozark. He convinced the producers to buy into the look. That meant bringing in the cranes with 20×20 or 30×30 flags every day to control the light. That’s the basis for everything. You can create your negative fill and shadows with that and decide how much of the character you want to reveal with light according to what the story calls for.

We always try to have it feel as though the source of light is organic and natural. Subconsciously you want the audience to feel the moments are authentic and dark, and if they feel as though some movie light is hitting the characters, it can take you out of the moment. The characters are going to these dark places, and you want the lighting to enhance and tell that story.

You shot scenes with heavy negative contrast, shadow depth of field and evocative framing. Can you talk about that?
For one, we use shallow depth of field to tell story more than any other show I’ve seen. It helps isolate the characters and gives the audience a window into their soul. The same goes for the framing; subconsciously it helps unsettle and guide the audience into the characters’ state of mind.

How does the cinematography help tell what is essentially a very dark story? It’s almost a character in itself.
For the heavy contrast and darkness, I think it shows the weight of the world these characters live in. Every move, every decision could be their last. So even in their lighter moments the darkness is still with them.

What camera was used for the final season? What about lenses?
We used the Sony Venice with Leica R lenses and a B-set of Leica Summicrons. We shot at 5.7K so both lens sets would cover the sensor. The Leica Rs, our primary lenses, were the 35mm 1.4, 50mm 1.4, 80mm 1.4 and the 50mm Noctilux, which was a 0.95. Good thing our AC Liam Sinnot is the best there is. Pulling focus for a 0.95 on a push-in is nearly impossible, but he made it look easy.

Can you talk more about those LUTs you mentioned earlier? 
The LUTs that Shawn developed gave us a “fat negative” in that we could underexpose but still have room when we went to the final color without worrying about creating too much noise. We had a LUT for the Ozarks and a LUT for Mexico.

DP Eric Koretz on set (second from left, black clothing)

Did you send notes to the colorist? Possible to provide an example of a note? 
Occasionally, “Did we go too far?” (laughs) I underexposed a lot but was always pleasantly surprised with the results when we finally went to color. [Company 3’s Tim Stipan was the colorist for the show.]

What was the most challenging scene you worked on and why?
On the finale (Episode 14, “A Hard Way to Go”), Jason Bateman (who directed) designed a shot where we start inside the house as the Byrdes enter, then pull back through a broken window into the backyard, where Mel is waiting for them. For one, Jason wanted to pull through actual glass and not use VFX, so choosing the right techno head, with all the complications around pulling that off, was challenging. Lighting from inside to outside for that was also a challenge in itself. Luckily, we have an incredible team of technicians — the 1st AC Liam Sinnot, operator Dave Chameides, gaffer Edison Jackson and key grip Landen Rudden and their teams — to pull it all off. It took a few tries and some quick thinking to change things up on the fly, but eventually the shot worked beautifully.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for 25 years. 

 

CineDev’s Clip-Based Metadata Management App for On-Set, Post

CineDev, a software development company specializing in production workflows, has released Altera, a clip-based metadata management application for camera footage used on-set and in post production. Created with data wranglers and digital imaging technicians (DITs) in mind, Altera helps streamline the editing and creation of metadata archive files such as ALE and EDL with tools that accelerate the process and minimize mistakes.

As production times shrink and the demand for media content increases, on-set production teams are increasingly looking for ways to streamline processes for quicker turnaround times. Editing metadata on-set and in film labs typically involves a manual process of editing spreadsheets, which can lead to human errors that trickle through the production pipeline. Altera has been production-tested over the past year during its beta phase on productions such as Wrath of Man, House of Gucci, Cyrano, Morbius, Doctor Strange in the Multiverse of Madness, Batgirl and The Marvels.
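To make the problem concrete: an ALE is essentially a tab-delimited table with a short header, a Column row and a Data section, and the error-prone part is keeping its clip records consistent with other sources such as lab CSV databases. Below is a minimal, illustrative Python sketch of reading an ALE’s Data section and flagging clips missing from a reference CSV, the kind of check-and-compare operation described here; the file names are hypothetical and this is not CineDev’s code.

```python
import csv

def read_ale(path):
    """Parse a simple ALE file into a list of dicts (deliberately simplified reader)."""
    with open(path, newline="") as f:
        lines = [line.rstrip("\n") for line in f]
    columns = lines[lines.index("Column") + 1].split("\t")   # column names follow "Column"
    data_start = lines.index("Data") + 1                     # clip rows follow "Data"
    return [dict(zip(columns, row.split("\t"))) for row in lines[data_start:] if row.strip()]

def missing_clips(ale_clips, reference_csv, key="Name"):
    """Return clip names present in the ALE but absent from a reference CSV database."""
    with open(reference_csv, newline="") as f:
        known = {row[key] for row in csv.DictReader(f)}
    return [clip[key] for clip in ale_clips if clip.get(key) not in known]

# Example (hypothetical files):
# print(missing_clips(read_ale("A001_day12.ale"), "lab_database.csv"))
```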

“We rely on Altera daily for every production,” says Gastone Ferrante, digital workflow supervisor on Cyrano (2021). “Our data managers have replaced a clunky workflow that involved three different tools to edit CSVs, ALEs and CDLs and greatly simplified the process with Altera. Working with Altera has saved us time and given us peace of mind by eradicating user errors for a truly trouble-free experience; we couldn’t have done it without Altera.”

“Our close collaboration with the industry during the development of Altera has helped us solve the metadata challenges production teams face when working on-set,” says Andrea Michelon, co-founder at CineDev. “The release of Altera gives data managers and DITs the ability to dramatically reduce the time it takes to accurately edit clip-based metadata. The ease and flexibility of Altera’s licensing makes working with metadata on-set more efficient and accessible to all.”

Features available in Altera include:

  • Import/Export: Import metadata from a wide variety of formats to transform and adapt the metadata for any need. Merge and export to multiple different formats in one go.
  • Smart Bins: Filter and group metadata attributes dynamically based on user-defined rules and keywords with Altera’s powerful smart bins. Work with an up-to-date list of clips that match the specified metadata attributes.
  • Database: Import reference lookups to take advantage of built-in tools in Altera such as Check and Compare. Supports a variety of file formats from a simple CSV to more elaborate formats such as MHL, CCC or AMF. Filter and clean up file data to optimize databases for efficiency.
  • Check and Compare: Verify data against any imported metadata or database with Check and Compare tools. Easily pinpoint mismatches in data to fix anomalies.
  • Merge and Append: Merge data from columns within multiple files or combine multiple files together. Altera uses smart checks to avoid possible conflicts when merging data.
  • Reports: Create and export reports in PDF or HTML formats. Easily customize layouts to suit your production needs.

 

 

 

LaCie 1Big Dock SSD

Review: LaCie 1Big Dock SSD Pro

By Brady Betzel

While the cloud is where a lot of creators store their images and videos, others still want to keep their files local. And for those who are offloading camera footage while on set, this usually requires fast and accurate copying and deletion. So what type of backup plan do you use? Personally, I am a fan of multiple backups.

LaCie 1Big Dock SSD

If you are on set, you will probably be hiring a DIT — or a production assistant if you have a tight budget — to handle media copy and verification. Usually that involves copying any media to two or more drives, running a checksum verification and possibly uploading to the cloud for immediate remote editing. The bottleneck usually comes in the first link of that chain: the local drive's input/output connection. These days, Thunderbolt 3 is the connection everyone wants. Not only can it supply power to a laptop, but it can also drive an external display, connect multiple memory card readers and offer expansion for other drives or peripherals.
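
To make the checksum step concrete, verification usually boils down to hashing the source file and each copy and confirming the digests match. Here is a minimal sketch using MD5 — the paths are hypothetical, and dedicated DIT tools add logging and reporting on top of this:

    import hashlib
    from pathlib import Path

    def file_hash(path, algo="md5", chunk=8 * 1024 * 1024):
        # Hash in chunks so large camera files never have to fit in memory.
        h = hashlib.new(algo)
        with open(path, "rb") as f:
            for block in iter(lambda: f.read(chunk), b""):
                h.update(block)
        return h.hexdigest()

    source = Path("/Volumes/CARD_A001/clip0001.mov")        # hypothetical paths
    copies = [Path("/Volumes/RAID_1/A001/clip0001.mov"),
              Path("/Volumes/SHUTTLE_1/A001/clip0001.mov")]

    reference = file_hash(source)
    for copy in copies:
        status = "OK" if file_hash(copy) == reference else "MISMATCH"
        print(copy, status)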

The LaCie 1big Dock SSD Pro is a perfect drive for this situation. It runs via Thunderbolt 3 and essentially acts as a docking station with a built-in Seagate NVMe SSD, providing up to 2800MB/s read and 2100MB/s write speeds. It's fast! And if you've never used an NVMe SSD to render or transfer files, you're in for a treat — it will change your outlook on multimedia creation. You should immediately think about swapping out any internal spinning disk drives for NVMe drives. But I digress.

The 1big Dock SSD Pro comes in two storage sizes: 2TB for $1,349 and 4TB for $2,599. Both versions are identical other than the storage size. The speeds are blazingly fast. LaCie touts 2800MB/s read speeds and 2100MB/s write speeds, but I will go over real-world speeds later in this review.

Externally, it connects through an included Thunderbolt 3 cable, but it also sports a separate, up-to-five-device-daisy-chainable (is that a word?) Thunderbolt 3 port on the backside; a DisplayPort 1.4 connection; USB 3.0; and SD, CFast 2.0 and CFexpress card slots on the front. The SSD itself is mounted in an interchangeable sled, so if something happens to the drive, it is easily replaceable. And if you do get to the point of replacing the sled/SSD, LaCie offers a five-year warranty that covers “…any defects in material and workmanship…” if the product was purchased from an authorized retailer or reseller.

Essentially, if LaCie deems the product failed because of its manufacturing, you’re covered. If you drop it and it breaks, or you fry it by dropping it in a pool, you will not be covered. You can read the details of the warranty here. It’s pretty standard, but one perk that is interesting is Rescue Data Recovery Services. With your purchase, you will receive a one-time data recovery attempt. If successful — they boast a 95% success rate — they will send you back an encrypted storage device with your data. If it works, that sounds awesome. Fortunately, I have never needed to use this service, but I would be interested in hearing anyone else’s experience with it.

The Nitty Gritty
Physically, the drive weighs about 2.8lbs by itself and about 4lbs with the included power brick. It is designed by LaCie’s resident modern designer, Neil Poulton, who is adept at designing “deceptively simple-looking mass-produced objects.” If you’re looking for a little creative inspiration, check out his site. He’s got some straight-lined objects on there.

LaCie 1Big Dock SSD

While the 1big Dock SSD Pro is very light, the power brick is large and heavy. I wish LaCie had spent as much time integrating the power brick as it did designing the hard drive case, but that's a small concern considering how fast the drive is.

I love that the 1big Dock SSD has so many input/output ports allowing me to connect multiple devices, which makes it not only a hard drive but a dock. The dock can power Thunderbolt 3 devices like an external monitor or even charge your laptop.

Over the past few years, we’ve lost a lot of the built-in memory card readers in computers and drives. Even docking stations have eliminated a lot of readers in exchange for Thunderbolt and USB-C ports. So it’s a relief to see that LaCie added them back into the 1big Dock.

LaCie 1Big Dock SSD

The Thunderbolt 3 charging does come with some caveats: When your laptop is charging via its own power supply, the Thunderbolt 3 daisy-chain port will deliver 70W of power. But when you are charging your laptop through the 1big Dock SSD Pro, the Thunderbolt 3 daisy-chain port will deliver 45W of power. And if you have an Android phone that supports fast charging over USB-C, you can plug it in! I would have loved for it to include a wireless Qi-based charger, but that might be asking too much. Either way, there are plenty of ports, including the DisplayPort 1.4 to dock your laptop with.

Testing
When using the LaCie 1big Dock SSD Pro, I ran it through a few tests using an HP ZBook with Thunderbolt 3 ports. First up, I ran some disk speed tests in AJA’s System Test using a 16GB, ProRes HQ, UHD testing base. The read speed was 1268MB/s and write speed was 647MB/s. In the Blackmagic Disk Speed test using a 5GB testing base, the read speed was 2902.4MB/s and write speed was 2327MB/s.

For a simple Finder-level copy test, I copied a 44GB folder full of Red R3D files, DSLR .mp4 files, Blackmagic Raw files and more. It took 1 minute and 57 seconds. Obviously, keep in mind that this will be highly dependent on the source drive, OS, etc.
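
For a rough sense of what that works out to, the copy averages a little under 400MB/s sustained (treating 44GB as 44,000MB; your numbers will vary with how the folder size is reported):

    size_mb = 44 * 1000        # 44GB expressed in megabytes
    seconds = 1 * 60 + 57      # 1 minute, 57 seconds
    print(round(size_mb / seconds))   # roughly 376 MB/s sustained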

Final Thoughts
In the end, the LaCie 1big Dock SSD Pro is a great-looking external drive that comes at a cost. It is definitely not cheap, but if you are even looking at this drive, you are probably willing to pay a premium to move gigabytes' worth of files per minute faster than other drives can.

Besides the large power brick (which almost every drive has), the form factor and color of the LaCie 1big Dock SSD Pro are great. It feels like it is made to match Apple's Space Gray. The memory card slots, Thunderbolt 3, DisplayPort 1.4 and USB 3.0 ports really help eliminate an extra peripheral on my desk and can even help charge a laptop that is powered over Thunderbolt 3.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and Uninterrupted: The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

Shooting Jungle Cruise Underwater

DP Chat: Shooting Underwater for Disney’s Jungle Cruise

By Randi Altman

Disney’s feature film Jungle Cruise, starring Dwayne Johnson and Emily Blunt and directed by Jaume Collet-Serra, is based on the theme park ride of the same name. The story follows the captain of a small riverboat who takes a scientist and her brother through a jungle in search of the Tree of Life. As you can imagine, some scenes involve water, so the production called on the aptly named DP Ian Seabrook to shoot underwater footage for the film.

 Shooting Jungle Cruise Underwater

Ian Seabrook

An experienced cinematographer, Seabrook has credits that include Batman v Superman: Dawn of Justice, Deadpool 2, It Chapter Two and the upcoming untitled Thai cave rescue documentary. We reached out to him to find out more about the shoot, the challenges and what inspires him in his work.

How early did you get involved on Jungle Cruise?
I was finishing up underwater work on the film It Chapter Two for DP Checco Varese when the line producer for Jungle Cruise contacted me to see if I would be available for a sequence for his film. I then met the director, Jaume, producer Doug Merrifield, DP Flavio Labiano, special effects coordinator J.D. Schwalm and first AD David Venghaus Jr. in Atlanta to discuss the underwater sequence. The production was around a third into the schedule at that point.

What direction did Jaume Collet-Serra provide?
Jaume explained in prep what the sequence entailed and how he saw it playing out in rough form. On the shooting days, he was more concrete on how he wanted the cast to play within the frame. There were some storyboards as a reference, but much of the composition was left to me.

How did you work with the film’s DP, Flavio Martínez Labiano?
Flavio and I discussed the lighting for the sequences I was involved with, and I kept open communication with him during the production regarding any changes I was making by visiting him in his DIT trailer on the main-unit set. I find that maintaining open communication yields the best results.

What sort of planning do you have to do for underwater sequences?
The first is figuring out what the sequence entails: How many cast members will be involved? What environment will they be in? What are the potentially dangerous elements? For Jungle Cruise it was the La Quila and puzzle sets with two actors — Dwayne Johnson and Emily Blunt — and their associated stunt personnel.

The next step is figuring out if the cast has prior underwater experience, which can make or break the success of filming the sequence. If someone is not comfortable being in or underwater, then the scene could be a challenge to get around. Dwayne had prior underwater experience on Baywatch, but to my knowledge, Emily’s underwater experience was less involved. That said, she did an absolutely amazing job in the water and was key to the sequence’s success.

In addition, it was necessary to have several meetings with the art and construction departments regarding the build of the puzzle set, as we had to go over what materials to use and not to use with regards to submerged set pieces and the associated hazards. Those hazards include the disintegration of paint and construction materials in the water, the primary concerns about which are running afoul of water clarity standards and the potential for ear or eye infections (which happened to both me and Amy Adams on Batman v Superman).

Where were the underwater sequences shot, and how long was the shoot?
The underwater sequences for Jungle Cruise were photographed in two tanks at Blackhall Studios in Atlanta. One tank was an exterior set, which was built in a parking lot at the second lot at Blackhall and used for shots involving La Quila and the cast transitioning into the water. The interior tank, which contained the puzzle set, was built inside one of the construction stages at Blackhall 2.

Shooting in a water tank

Can you talk about the puzzle sequence?
The sequence involved the cast swimming down from La Quila to the puzzle and holding their breath. In reality, it was not entirely different. Only the sets were separate, with the exterior tank being used for the La Quila set. The interior tank, which housed the puzzle set, required working within tight confines and limited mobility.

To achieve the shots required, I used my customized underwater housing, which has a small footprint and enables me to fit within the set and have enough room for the cast and stunt personnel to perform. Emily’s character gets trapped inside the set, and Dwayne’s character tries to rescue her, but due to his sizable frame, he cannot fit. Instead of resorting to passing breaths to Emily via mouth, we constructed the set piece outside of the tank then lowered it in once all materials had been dried and sealed. It then needed a few days for the water to settle, and I did daily checks with marine coordinator Neil Andrea.

What about other challenges?
The epilogue of the puzzle scene involved raising the set out of the water, so the discussion point became how to achieve this practically. As the shots required the camera and set to travel out of the water simultaneously via a construction crane (which was barely able to fit within the stage doors), the thought process was for the camera housing to be attached to the set via pipe rigging. This idea was short-lived because when I saw what the desired shots were and where the camera needed to be, I realized there would be no space or bracing point where I could attach any rigging. I suggested that I could hand-hold the housing for the shots, which was met with “Do you think you could do that?” It was a challenge to go from hand-holding an 80-pound camera housing in water, where it has slightly negative weight, to having the full weight of water pulling down as the set was raised, but the test worked. Of course, after that, we did it eight more times!

Shooting Jungle Cruise Underwater

How do you go about choosing the right camera and lenses for projects like Jungle Cruise?
I make every attempt to use the same camera and lens package as the main unit uses on the production, which in the case of Jungle Cruise was the ARRI Alexa SXT Plus with Panavision anamorphic glass. The 30mm C series was our hero lens due to its smaller size and weight, but we used a few other focal lengths as well.

What about the underwater enclosure?
The underwater housing is my own custom housing, which gives me access to all the exterior buttons for the Alexa: ISO, white balance, shutter or camera speed, all of which can be changed underwater. The housing also contains a TV Logic on-board monitor for viewing. I have several housings for different cameras. It makes it easier to have a housing for the camera that is already in use on the show.

Any “happy accidents” along the way?
Though the lighting was designed to illuminate the inside of the set with subtlety, there were moments when Emily Blunt would swim inside the set and the backlight, and small kisses of refracted light would hit her perfectly. I saw these on the monitor as we were filming, and they made me smile.

Ian Seabrook

Any scenes that you are particularly proud of?
Both Dwayne and Emily were wonderful to work with in the water, which made the sequence a success. The shots of Emily figuring out how to manipulate the puzzle were structured around a sequence of manipulations of the set pieces. We discussed what action she would be doing, but on that day, I went with how I felt the scene should be photographed and followed her action, which was somewhat balletic. With both of us in sync, the sequence came together nicely.

Now more general questions …

How did you become interested in cinematography?
From a young age, I had a desire to figure out how things like radios and televisions worked. That interest in the practical morphed into cinema as I watched films like 2001: A Space Odyssey, Lawrence of Arabia and Giant and began to wonder how they were made. Around the same time, I was watching a lot of Disney and wildlife documentaries on television, in addition to James Bond films like Thunderball, which had fantastic underwater sequences. I became obsessed with being underwater and how cameramen were able to be in the water with marine life like whales and sharks. Many years later I found myself in the waters around Cocos Island, Costa Rica, surrounded by schools of sharks with a camera in my hand. My dreams became a reality.

Shooting Jungle Cruise Underwater

Ian Seabrook

What inspires you artistically?
Robby Müller, who shot films for Wim Wenders, Alex Cox and Jim Jarmusch, is my favorite cinematographer. His ability to use available light on the films he photographed was unprecedented and is still a major influence to this day. I take inspiration from many forms: cinema, natural history films, music, art and photography.

Lamar Boren, who was the underwater cinematographer on Thunderball, and David Doubilet, who worked on The Deep, Splash and The Cove, are top of the list for me.

What new technology has changed the way you work (looking back over the past few years)? 
Taking LED fixtures underwater has changed what was a constant for underwater illumination. Smaller, lighter and, at times, more compact fixtures have transformed the lighting market. Where the dialogue used to involve lighting with attached cables and the associated boats with generators required to power them, LED housed fixtures without tethers have reduced the time and power requirements for underwater illumination. When I need a lot of punch for composite screen work, the industry-standard underwater lights still very much work, but the smaller and lighter fixtures have become indispensable, especially for travel.

What are some of your best practices or rules you try to follow on each job?
Arrive early, pay attention and remember why you are there. I bring enthusiasm to each project. I always remember my beginnings and strive to exceed expectations on each assignment. I work in many locales worldwide and try to involve as many local crew as I can. And whenever possible, I train those who are interested on the proper use of the equipment. I do a lot of my own prep and research for the assignments I do, in addition to the standard production prep. I also have backup plans.

Ian Seabrook

Explain your ideal collaboration with the director or showrunner when starting a new project.
I work best when there is a relationship built on mutual respect. There is always a reason that you want to collaborate with someone, and they with you. While I have been on my share of large, multi-personnel crews with a slew of trucks and trailers, it is the more intimate jobs involving travel and a reduced crew that have been the most memorable. I am quite capable of being autonomous and capturing sequences on my own while adding the right people to that mix, and nothing beats that. The same applies for the land-based second unit cinematography I have done — good people usually yield good results.

What’s your go-to gear? Things you can’t live without?
Much of the work I do is with the ARRI Alexa, which I have several housings for. I own my own Mini LF, but I rarely use it because it is usually working elsewhere on other jobs.

I travel a lot and always take my Leica M10 Monochrom with me — I have a housing for that too.

What is in my bag at all times? My iPad makes scheduling and workflow easier while on the go, and I have housings for all my light meters, which I still use to this day.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

 

DIT Chat: Candyman’s James Notari

By Randi Altman

Set to hit theaters at the end of August, the Jordan Peele-penned Candyman is a supernatural slasher film directed by Nia DaCosta and starring Watchmen‘s Yahya Abdul-Mateen II. It’s a sequel to the 1992 film of the same name and the fourth film in the Candyman film series (and no matter what you do, don’t say Candyman three times. Trust me.).

James Notari

This location-heavy film was shot in Chicago just as everything in the US was shutting down due to Covid, which meant a very shortened last few days of shooting. “Principal photography started in fall 2019, and then we returned to Chicago in early March 2020 for a week-plus of additional filming — mostly new scenes and new locations with the same cast. We wrapped on March 15, and I was on a plane back to LA two days later,” says DIT James Notari, adding that thanks to some brainstorming and problem-solving, it all worked out in the end.

We spoke to Notari, whose long list of credits includes the recently shot TV movie Highland and the feature Underworld 5: Blood Wars, to find out more about the shoot and what his role as a DIT entails.

You were DIT on Candyman. Can you describe your role on this film?
My DIT role on this one was color management and exposure control. We used one show LUT that DP John Guleserian created with his colorist, Company 3’s Natasha Leonnet.

Using Pomfort’s Livegrade Pro, I would create CDLs for each shot/scene to maintain the consistency of the show’s look and tone. There was a loader on set doing two media breaks daily. Breaks included ARRI picture, sound, camera reports and DIT reports/stills. Technicolor’s dailies colorist Cory Pennington worked off site creating dailies/post deliverables using Colorfront Express Dailies.
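
For anyone unfamiliar with CDLs, each one is just ten numbers — slope, offset and power per channel plus saturation — applied with the standard ASC formula. A rough Python sketch of that per-pixel math (the example values are invented, not the show's grade):

    def apply_cdl(rgb, slope, offset, power, sat):
        # Slope, offset and power are applied per channel: out = (in * slope + offset) ^ power
        graded = [max(0.0, c * s + o) ** p
                  for c, s, o, p in zip(rgb, slope, offset, power)]
        # Saturation pivots around Rec.709 luma, per the ASC CDL definition
        luma = 0.2126 * graded[0] + 0.7152 * graded[1] + 0.0722 * graded[2]
        return [luma + sat * (c - luma) for c in graded]

    # Example values only -- a slightly warm, lifted-shadow correction
    print(apply_cdl([0.18, 0.18, 0.18],
                    slope=(1.05, 1.00, 0.95),
                    offset=(0.01, 0.00, -0.01),
                    power=(1.0, 1.0, 1.0),
                    sat=0.9))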

Any specific challenges on this one? 
Candyman was a very heavy location show with lots of nights while in Chicago. I think there were only five soundstage days total. This made for some great visuals but added a lot of extra pushing and crew setup/breakdown time. The very last day of filming was probably the most challenging. Due to the sudden arrival of Covid in early March 2020, everything was quickly shutting down. Each day we would hear of another show in Chicago closing shop.

Candyman had three final days left on the schedule at that point. The solution was to compress the final three shooting days into one day, with two full units running at the same time. I was the sole DIT for these two units, both shooting on two separate stages — a thousand yards apart. It was a challenge to run so much cable, signal/power boosters, etc. Thank God for the amazing utilities we had on that show. I stayed with the director and DP on the main stage, and the signal from both sets was fed to my cart where I live-graded both simultaneously. We had five ARRI Alexa LF cameras that day. The graded images were hardwired out from my cart to two different VTRs, both on two different stages as well. It was messy, but it worked.

Candyman was handled with such care and attention to every single detail. The producers and director never budged on letting things just “slide.” We actually filmed in the real Cabrini Greens from the original Candyman movies. It was like a huge backlot. A ghost town as well, with mostly fenced-in, boarded-up lower-income housing.

In this Candyman, the lead actor plays a struggling artist and his partner a curator. So, all of the artwork had such purpose and meaning in the frame. Each painting/drawing was created by a local artist, and massive art installations were driven in from New York City. It was so great to be part of a project with so much passion for the process.

Can you talk about working with Candyman director Nia DaCosta and DP John Guleserian?
I went to AFI with John Guleserian, back before DIT was even a crew position, and we have similar circles of friends. It was nice to reconnect on Candyman. John is a super chill cinematographer and is all about capturing the director’s vision and creating a fun and peaceful environment on set.

This was Nia’s first big movie, and so she did rely on John to run the set for sure, and they both got along fantastically. A few days into the shoot, Nia started to sit at my cart with John, and that lasted the rest of the show.

How did you become a DIT? What kind of training did you have?
I started my film career in the VFX/post world as a VFX coordinator and digital artist (painting out cables, signage, roto work, cleanup). This was around 2008 when the Red camera was introduced to the world and digital filmmaking exploded. I was actually on the first Red feature, Gamer, as a VFX coordinator.

My familiarity with these Red cameras/R3D files would lead me into my next career move as a media manager on set. This was a brand-new world of the digital wild west. For the first time on set you could color correct, sync sound and transcode all post/client deliverables. It was an exciting time. In 2011, I worked on Underworld 4: Awakening in Vancouver. This was the first Red 3D feature, and it was a beast of a job. I was operating two carts side by side, transcoding all dailies as well as daily lunchtime-synced 3D screenings projected in a 52-foot trailer on set. I created a workflow chart/pipeline breakdown for that show and still follow it today, with adjustments per show, of course.

I continued my role as dailies supervisor until a DP asked if I could also do color grading on set in addition to my dailies supervisor role. I did these dual roles for a few years and then transitioned into the DIT position, collaborating with the DP and director and the camera team, in the moment — live. I haven’t looked back. I love it. Today, I rarely do backups and transcodes on set. I mostly focus on live grading, exposure and image control, as well as overseeing the loader and workflow process from set to post.

How would you describe your job to someone who might not know what a DIT does?
I tell people it’s like live Photoshop.

Do you prepare your cart differently depending on the project?
I really don’t prepare my cart differently for each show. I just keep adding gear to it as I go.

As a DIT, we live and die by the gear we use on set, and we are only as good as our carts. I designed my cart and have tweaked it over the years based on different experiences per show.  At this point, since I spend so much time with the DP and director on set, I try to keep a very streamlined and clean work area.

The frame of my cart is an Inovativ Echo 36 with two Sony PVM A250 monitors for viewing and a control monitor for grading. My entire cart is Mac-based, with a Mac Trash Can and an Ethernet hub that routes my hardware. I carry a 32TB Pegasus RAID from Promise for any external use, APC backup battery, Blackmagic Smart Scope Duo 4K waveform monitor, Blackmagic 12×12 video router with Master Smart Control and two Blackmagic Ultra Studio 4K capture devices. Pomfort’s Livegrade Pro is my main grading software. For more extensive color control, as well as stepping deeper into the RAW files, I have Colorfront Express Dailies. If I am involved with any media backup/MD5 checksums I use Pomfort’s Silverstack.

What are some best practices you employ on each job?
Fifty percent of the DIT’s role on set is to be the DP’s therapist. What the DP says in the tent stays with you. Being a good listener goes a long way, and a calm personality helps too. No DP needs to know how your cart works; they just want to have that safe place to go on set away from the madness — to sit and see the images, hold the single channel(s) and talk to a friend who they trust as a second pair of eyes.

How do you like working with the DP? Do you have any contact with the colorist at all? 
I love working with the DP. The collaboration and discussions we have in prep and then moving into production where it all comes to life — it can be very special. In prep, I am in contact with the DI colorist, where a LUT(s) is created with the DP, but once we hit the ground running, I’m then in constant communication with the dailies colorist — daily texts, emails and calls. The dailies colorist is a big part of the workflow.

Do you set looks on set? What tools do you use?
As mentioned earlier, I use Pomfort’s Livegrade Pro to live grade and create CDLs for each shot/scene to keep the consistency of the show’s look.

How do you like the DP to describe the look they want?
I ask the DP what movies I should reference, or if they have any stills I can view, but usually it’s a quick chat. Then we fine-tune those ideas with the show LUT in prep. Within the first week of production, we usually have a handle on where the look is going to live.

How is creating a look on set making it easier for the colorist?
The master plan is that the grading adjustments we do on set carry through to the dailies colorist and then to the final step with the DI colorist. It’s the vision the DP has on set that we try to protect. Throughout each production, I stay in constant communication with the dailies colorist each day to ensure dailies posted online will match the grading from set.

After dailies have been created at the lab, the colorist will then send high-resolution graded stills for me and the DP to view. This is a last step to make sure we are all on the same page and that everything is translating correctly down the pipeline.

In the end, we all hope the DI colorist will stay within that world that was created on set. I am not sure if what DITs do on set helps the DI colorist or not. I have heard both sides of that discussion. What I care about most is that the DP is happy when they are in the DI session. That what they see in that dark theater is what they remember seeing on set. No surprises.

How is what you do making it easier for post?
Since I have an extensive background in post, I go out of my way to make that conversation/pipeline as smooth as possible early on. Although DITs are in Local 600, constant communication with post is a key aspect of the job. We deal with all departments. We are the sole contact on set that post will reach out to regarding mag clearance, LTO, breaks, flicker, banding, lost pixels, compression, fps, framing charts, etc. In prep, I contact the editor, post supervisor, sound mixer, loader and the dailies colorist to make sure that we are all on the same page. This started years back when I was doing dailies on set and discovered how necessary this line of communication was. On each show there is an HOD workflow group call to discuss this as well in prep. However, after that call is over, as crew members on the ground, it’s up to the DIT to manage this from set. At least that is how I approach each show that I’m working on. One contact that people can call. One person who oversees all data from set and storage of mags, etc. To keep it simple is a good idea.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 20 years. 

DIT Chat: Tyler Isaacson on Workflow, Best Practices

Over the years, Tyler Isaacson worked his way up the production ladder, first from PA to camera PA. Then, to broaden his production skills, he was trained by DIT Sam Kretchmar, one of the first DITs in Local 600 when the classification was created. He’s now been working as a DIT for seven years, with a focus on quick-turnaround commercials for brands such as Tide, Snickers, Ford, Progressive, Nintendo and Starbucks.

Ford Mustang Mach-E

In a recent chat, Isaacson talked about the intensity and demands of a daily shoot.

What do you consider to be your most significant issue when preparing for a shoot?
Setting up the right workflow with hardware and software tools that will get the best results, quickly. Once I know the location and how the shoot is set up, I organize my cart for maximum efficiency. I use modular components that I can easily combine in different arrangements. I build out my cart with everything on a Yaeger Junior cart for stage jobs, or I split it up and pare down to a vertical Magliner cart for tight locations.

In both situations, I use two core components — a video distribution/live grading kit and my transcoding workstation. I then add monitors (typically 17-inch and 25-inch Sony OLEDs), scopes, battery back-up, wireless receivers and other components the shoot requires.

What software are you using?
Right now, I’m primarily using Assimilate’s DIT Pack that combines Live Looks for on-set live grading and Scratch Dailies for transcoding. The seamless software integration streamlines my workflow for a huge boost in time savings. Because the software runs on both Windows and macOS, it gives me flexibility in my work and in building my DIT cart for different projects. Working this way is miles beyond just applying a CDL as a starting point. Not only do I have more control over the live image with curves, especially hue-hue and hue-sat curves, but being able to apply those exact same curves in Scratch Dailies and then being able to edit them is another time savings.

I also use Hedge for archiving media with checksum. Lattice is a handy app for converting LUTs between applications and viewing the LUT curves, which can be useful when evaluating an imported LUT.
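
For context, the .cube files that tools like Lattice convert are plain text: a LUT_3D_SIZE header followed by rows of RGB output values. A deliberately minimal parser sketch (it skips the optional keywords a real converter would honor) looks roughly like this:

    def read_cube(path):
        # Minimal .cube reader: returns (size, list of [r, g, b] output rows).
        size, rows = None, []
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#") or line.startswith("TITLE"):
                    continue
                if line.startswith("LUT_3D_SIZE"):
                    size = int(line.split()[1])
                elif line[0].isdigit() or line[0] in "+-.":
                    rows.append([float(v) for v in line.split()[:3]])
        return size, rows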

How have you built your dailies cart?
I’m working on a custom Windows 10 PC that I have built into a Pelican 1510 rolling case. With a 14-core Xeon CPU, an Nvidia RTX 2080 Ti GPU and 8TB of SSD RAID, the speed for rendering dailies is incredibly fast. For my external RAIDs, I like SanDisk USB-C SSDs; they offer great value for the performance. I also use external Glyph 4TB USB-C RAID drives for high-performance shuttles.

What do you enjoy most about working as a DIT?
My job is the most rewarding when I’m able to collaborate with the DPs to achieve their vision. Having the powerful color tools and curves editor available live not only helps me and the DP set looks faster, but it gives us more creative latitude too. A DP may not get a chance to sit in on the final grade, so achieving a look in the dailies is often the only chance the DPs get to review their work. 

How do you like to work with the DP and other departments?
Since I’m working directly for the DP, this is where most of the collaboration happens. Sometimes the gaffer and I will sort out a flickering light without bothering the DP, but for creative decisions, it’s important to follow the chain of command.

The relationship with post and production is important too. In commercials, we are usually making transcoded dailies as we work, so coordinating with post before a shoot ensures we generate the correct files for them. A big part of the DIT’s job is making sure all the deliverables for a project — source camera files, sound, transcoded dailies, LUT files, reference stills — are well-organized and quickly completed.
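
There is no single standard for that organization, but a hypothetical per-day delivery folder — purely to illustrate the idea — might look like this:

    SHOOT_DAY_04_2023-10-12/
        CAMERA_ORIGINALS/     (A001, A002, B001 card folders)
        AUDIO/                (sound rolls and sound reports)
        DAILIES/              (transcoded editorial and review files)
        LOOKS/                (LUTs, CDLs, reference stills)
        REPORTS/              (media reports, camera reports, checksums)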

How are you handling the live grading?
While I have used a Mac for live grading, I also wanted reliable software that could run on other operating systems. Since I was familiar with the Scratch Dailies UI, Live Looks fit into my workflow. I can quickly pull up old grades and match grades to dailies as I work. Having a look “memory,” in addition to previously saved grades, makes it easy for me to bounce between grades while working on a particular shot. Also, exporting looks into Scratch saves me time on my dailies grade.

My hardware is built around Live Looks and FSI BoxIO units, from which Live Looks pulls the live images and all the embedded metadata. I built all the gear into a small-form-factor 8020 rack for portability. It contains a 16×16 AJA Kumo router and two FSI BoxIO units to live-grade up to four cameras.

What specific aspects do you like about your tools?
Bottom line, it’s the speed, reliability and flexibility. Being able to manage resolutions, frame rates and color spaces individually by timeline (or reel) is hugely helpful, especially when dealing with multiple cameras and formats. Instead of just setting scaling “entire image to fit,” I can actually see and adjust how Scratch is managing the scaling on a per-timeline basis. I’m able to easily handle footage for the same project from a wide range of cameras, even phones, as well as different formats and resolutions on the same camera. It’s also easy to generate multiple export formats at different resolutions from the same material.

What are some of your best practices that you can share?
Because the DIT is solely responsible for all of the footage from a shoot, I think one of the best practices is to approach a job with a calm and organized mindset. If I allow myself to get stressed out or overwhelmed on a shoot, that’s when I’m most likely to make a mistake.

When there’s a hiccup on set — corrupt media, accidental reformat, camera issues, etc. — I always take a step back, assess and move forward with a level head.

I also like to use manual systems for rechecking my work. I make manual media reports, not because there aren’t great software tools that can automate this for me, but because it forces me to recheck card transfers one by one.

I also like to line up all of the original camera clips and transcoded dailies from a day on overlapping timelines to ensure they are frame-accurate. Ideally, I will compare every transcode to the source clip before I reshoot a card. Whenever I do catch a mistake while doing one of these manual reviews, it reinforces my confidence in the system overall.
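
One lightweight way to script a sanity check like that is to compare clip names on both sides before anything is cleared. A hypothetical sketch (the folder paths and extensions are placeholders, and segmented formats like R3D would need smarter matching):

    from pathlib import Path

    def clip_names(folder, exts):
        # Collect the base names of every media file under a folder.
        return {p.stem for p in Path(folder).rglob("*") if p.suffix.lower() in exts}

    source = clip_names("/Volumes/RAID_1/A001", {".r3d", ".mov", ".braw"})
    dailies = clip_names("/Volumes/RAID_1/DAILIES/A001", {".mov", ".mxf"})

    print("missing transcodes:", sorted(source - dailies))
    print("orphan transcodes: ", sorted(dailies - source))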

What tips do you have for someone starting out?
Find mentors if at all possible. I was fortunate in my career to have people who were willing to take the time to teach me when I was just starting out. Not only Sam for the formal DIT training, but also countless assistants who answered my questions and showed me how a set is run.

Being tech-savvy is a must. I had a lot of experience with editing software and building computers, which helped me pick up DIT-specific tools faster. It’s important to understand how cameras work and how the files are encoded, as well as color theory and how people perceive images. I had formal photography training that was very helpful for this. There are a lot of resources online, but there’s also a lot of misunderstanding that has spawned misinformation, so it’s important to read lots of sources. Read about logarithmic encoding, color spaces, bit depth, Bayer patterns, 4:4:4/4:2:2/4:2:0 chroma subsampling, latitude, dynamic range, display gamma … and keep going from there. As for working on set and how digital cinema cameras work, nothing beats hands-on experience. Get on set any way you can, or get a job at a rental house, and be respectful of the work. You don’t want to get in anyone’s way, so wait for the right times to ask lots of questions and watch the ACs carefully. Being curious, helpful and kind will go very far.

Assimilate’s DIT Pack for Camera-to-Post Workflows

Assimilate has introduced its DIT Pack, a new product bundle that includes its advanced Scratch dailies software and Live Looks for live-grading single and multicam setups. The DIT Pack is designed for modern production workflows that require extensive previz on set to increase creative control and to streamline post workflows after the shoot.

The DIT Pack allows a seamless workflow that combines advanced live grading with dailies transcoding. Thanks to live streaming, it pushes everything out to remote and studio clients while capturing all camera metadata that will be used in VFX/post pipelines along the way. It’s available immediately on macOS and Windows.

The workflow starts with Live Looks for live-grading content from any number of cameras in real time. It’s also possible to add advanced effects like greenscreen background replacement and texture. When saving a grade, all the metadata — either input by the user or delivered via the live SDI signal from all cameras — will also be saved in the form of a readable text doc and an XML that is suited for further pipeline scripting in VFX/post. All grades and metadata are stored in an easy-to-approach folder structure that can simply be delivered to VFX/post.

In Scratch, all camera material is loaded, and the looks and metadata are matched and merged from the Live Looks folder structure, along with automated syncing of audio. All these tasks are automated and require almost no user interaction. Scratch will output in many formats as required by VFX/post:

– Offline DNX or Apple ProRes material, including all metadata for offline editing
– High-quality EXR VFX plates, including frame-based lens information and camera metadata
– Lightweight H.264/H.265 rushes for online review

The flexibility of Scratch allows the user to either consolidate all look and metadata files into a folder per day or copy the look and metadata next to each source media file, making it easy to link the material in any other DI software. Producers and post supervisors receive an extensive clip report listing all relevant production information, and assistant editors get an ALE that contains all clip metadata, including dynamic CDL information to use in the target NLE.
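
To give a sense of what that ALE carries, the format is a tab-delimited text file with Heading, Column and Data sections. A stripped-down sketch that writes one — the clip values and chosen columns are placeholders, not Assimilate's actual output — could look like this:

    def write_ale(path, clips):
        # clips: list of dicts with Name, Start, End, ASC_SOP and ASC_SAT keys
        columns = ["Name", "Start", "End", "ASC_SOP", "ASC_SAT"]
        lines = ["Heading", "FIELD_DELIM\tTABS", "FPS\t23.976", "",
                 "Column", "\t".join(columns), "", "Data"]
        for clip in clips:
            lines.append("\t".join(str(clip.get(c, "")) for c in columns))
        with open(path, "w") as f:
            f.write("\n".join(lines) + "\n")

    write_ale("day_04.ale", [{
        "Name": "A001C001_231012_R1AB", "Start": "01:00:00:00", "End": "01:00:12:10",
        "ASC_SOP": "(1.02 1.00 0.98)(0.01 0.00 -0.01)(1.00 1.00 1.00)", "ASC_SAT": "0.95",
    }])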

At the live-grading stage, Live Looks provides a local web interface for clients on or near the set to browse grades, before-and-after snapshots and metadata via Wi-Fi during the shoot. At the same time, Live Looks allows streaming of all camera feeds through either an RTMP stream or via NDI directly into Zoom, Skype, SetStream.io or any other NDI-compatible software. At the dailies stage, Scratch separately outputs through RTMP and NDI for in-depth remote QC of all footage. After transcoding, Scratch allows for automatically publishing footage to the Assimilate Dailies Online web portal or to the COPRA dailies platform via direct script integration.

Assimilate’s DIT Pack is available at $1,399 for a permanent license, $799 annually or $99 monthly.

 

 

Rebel Fleet uses shutdown to train digital imaging pros

During the COVID-19 lockdown, digital imaging workflow specialists at Auckland, New Zealand’s The Rebel Fleet took the opportunity to train a new group in the art and science of digital imaging.

The Rebel Fleet provides all the services involved with on-set and near-set technology, monitoring, video playback, technical appraisal, backup, color grading and distributing the moving images that come off set each day. As digital cinema has grown, so have the challenges of keeping that data safe and of a high quality at lightning speed. The company specializes in these workflows.

Lockdown color palette

“It’s a relatively new department in film. As such, not many people know about what we do, so there often is a lack of skilled digital imaging technicians, dailies managers and video operators here in New Zealand,” says The Rebel Fleet GM Michael Urban. “With all the work potentially coming up and the borders closed to many, we had to look at ways to be proactive in training the people we have here in New Zealand to meet the future needs of the industry.”

Throughout lockdown, the team created a module-based online learning curriculum specific to dailies, DIT, video and color roles. The aim was to structure skills and responsibilities and provide research, learning material and practical training to participants.

“The idea is that each of our almost 20 trainees feels empowered to teach as well as to learn, so each trainee is assigned a research topic and made a subject matter expert in one small thing. From there, they present their research at one of our weekly meetings and really own the topic they have been researching,” says Isaac Spedding, who heads The Rebel Fleet’s training and development.

Because the industry relies heavily on practical learning and networking, The Rebel Fleet runs weekly “Senate” meetings, where specialists and trainees meet in the evening to learn about one aspect of the digital imaging and film industry. Senates have included topics such as creating a base color grade and advanced QTake operations, and there have been guest speakers talking about modern post workflows.

Once the lockdown in New Zealand was over, The Rebel Fleet turned its attention to giving trainees practical, on-set experience. With feature film footage requiring a high level of privacy and often hidden behind an NDA, such shoots are not practical for teaching people about high-level production issues. So The Rebel Fleet decided to film content specifically for in-house training. The company created unique production days on which trainees take the reins of the whole process and are mentored by specialists. The Rebel Fleet and production partner Metro Film have run three practical production shoots so far with three different themes, the first being camera testing.

The second training shoot, for a short film called Bracken Road, included 35 crew and trainees with director Harriett Maire. It had three setups, an aerial unit and a full crew aimed at replicating feature film-quality workflows.

“The on-set environment is an ever-evolving beast, with many different moving parts that all need to work symbiotically. The equipment is just one of those beasts, with new and improving cameras and rigs plus different workflows and pipelines on a shoot-by-shoot basis,” says Rebel Fleet co-owner/senior colorist Pete Harrow. “With our training, we are emulating this in the most realistic way possible, with the best cameras available at present and some complex pipelines. We are also deliberately throwing in some technical issues and real-world problems to make the training days simulate some of the worst days on set.”

Bracken Road’s list of intentional technical issues included multiple cameras, the wrong white balance, dead pixels, sound rolling late and actors staring into the camera.

On the third shoot, called Lockdown, trainees were responsible for delivering a rich and beautiful short film. With a crew of over 40, including 15 trainees, the shoot replicated a high-end production pipeline with an estimated rental value of $22,000 on the floor. Two DIT carts and two QTake rigs were set up in parallel to allow as many people as possible time on the carts. Metro Film provided two fully kitted-out Alexa Mini packages, staff and all things camera. Fat Lighting provided a truck with RGBW LED lighting, including Skypanels. The crew was fully catered by Carwyn’s Catering and Craft Services. And industry veterans provided their valuable time and support.

The training shoots are valuable not just for relative novices but for those who are already working in the industry. Making the transition onto an international feature is a big leap when there are almost no stepping-stone productions in New Zealand on which to hone the craft. The training shoots aim to offer those stepping stones, get more people into higher-paying roles and allow the industry to accept more locally crewed, high-end projects.

“As one of the department heads, I crewed the camera and lighting departments with experienced people, but in more senior roles than they usually occupy. For example, Bayley, our B camera operator, usually works as a focus puller, and her focus puller stepped up from her usual role as 2AC,” says DP Alex Campbell. “I can really see the advantages in this training program and think the visual quality that we delivered as a team bears that out.”

With productions starting up in September, it will be difficult to continue doing practical shoots like Lockdown without more industry support. Nevertheless, The Rebel Fleet has demonstrated a proven structure for training people into jobs that deviate from the traditional. The Rebel Fleet’s model enables a testing ground for ADs, camera operators, directors, actors, editors and post.

“When I was first approached by Isaac, I couldn’t quite believe the opportunity I was being offered. To be given a chance to create a piece of work with access to such a huge amount of expertise and resources? That’s the absolute dream,” says director Maire. “I’ve learned a lot about digital workflows on set that I didn’t have a true understanding of beforehand. Being a part of this process has helped my holistic knowledge and appreciation of what it takes to run a smooth set. That’s a really valuable understanding to have as a director.”

Main Image: Color grading tests for Lockdown.

DP Chat: Defending Jacob’s Jonathan Freeman

When the Apple TV+ crime drama Defending Jacob begins, viewers meet the seemingly perfect Barber family — assistant DA Andy, teacher Laurie and their teenage son, Jacob. Fairly quickly, things start falling apart after a local boy is found murdered in a park, and Jacob becomes the prime suspect.

Jonathan Freeman

Andy and Laurie both lose their jobs, and the family is ostracized as Jacob is presumed guilty before his trial even begins. The series, which stars Chris Evans, Michelle Dockery and Jaeden Martell, keeps viewers asking, “Did he or didn’t he?” until the very end.

For the most part, Defending Jacob takes place in winter, and the look of the show reflects that cold. To find out more about Defending Jacob’s look, we reached out to the show’s cinematographer, Jonathan Freeman, ASC, (Game of Thrones, Boardwalk Empire) to talk about working with the show’s director, Morten Tyldum, and showrunner Mark Bomback.

The show is set in an affluent suburb of Boston. Where did you shoot?
The series was shot in many of the locations where the story takes place. We were inspired by real locations and had tremendous support from our local crew. The lighting, grip and camera teams worked extremely fast, often shooting the rehearsals. We rarely had to shoot a take again for technical reasons. I can honestly say it was one of the best production teams I’ve ever worked with. And our cast was phenomenal. Capturing performance was the most critical aspect of our storytelling.

What cameras did you use, and did you do camera tests?
We used the Panavision DXL2. We also tested the Sony Venice and ARRI Alexa LF (both beautiful cameras as well), but the DXL2 provided the most resolution, which was needed for Apple’s delivery, once the anamorphic image was unsqueezed.

Can you talk about shooting with multiple cameras?
Working on television shows like Game of Thrones and Boardwalk Empire, we had to achieve quite a lot in a short period of time. On GoT we shot only 10-hour days with almost no overtime, so I got used to shooting with multiple cameras. That experience helped me when capturing the scenes in Defending Jacob, which is primarily a character-driven story.

It was important for director Morten Tyldum and me to have as many simultaneously running cameras as possible in order to capture performances. Shooting this without it feeling like conventional television was a challenge because we often wanted the camera to be physically close to the characters; finding a second camera angle when shooting a close-up of an actor was sometimes difficult.

When we were not able to get a strong camera angle for the B camera, they would either pick up a detail of that same performance or prep for the next setup. This leapfrogging helped us immensely, but one key motif we frequently used the B camera for was shooting close-ups, where the camera was just a few inches higher than the character’s eyeline. It created a very intimate feeling — almost as if we were sharing the character’s perspective.

Can you talk about lenses?
These internal close-ups became a critical element in our storytelling. For Morten and me, the optical quality of the glass, the lenses, was paramount. We chose to shoot with anamorphic lenses. Even though we composed for a 2:1 aspect ratio, we wanted the benefits anamorphic provides aesthetically.

Since so much of our storytelling would be close-ups of our actors, anamorphic served three critical purposes. The anamorphic bokeh (out-of-focus distortion) became a skewed backdrop, a subtle depiction of their deteriorating world. It also smoothed out the inherent crispness of digital cinematography. And, frankly, it just looked more cinematic.

Panavision was extremely helpful in getting us the G series, which are particularly beautiful and unique in character. And Apple was very supportive throughout the process, working with us to ensure we kept the aesthetic vision Morten and I had while also delivering the highest-quality image.

You brought up the characters’ perspectives earlier. Can you expand on that?
Because the story is such an internal piece, Morten wanted the audience to experience the story through the characters’ eyes. We became very committed to POV. We referenced films like Michael Clayton, Mystic River and the films of Bergman and Polanski.

For every scene, we determined whose perspective we wanted to take. So in a scene with Andy, we might have shot with the camera close to him and potentially wrapping around him, over his shoulder, to see the rest of the scene play out from his perspective. We would often take the same approach with Laurie. But the critical difference that Morten wanted to convey was how the audience saw Jacob.

As the story unfolded, we wanted to create an enigma around him, just as the characters in our story start to wonder whether Jacob is guilty or innocent. We maintained a less subjective perspective with Jacob by keeping the camera more distant. If we did occasionally come in for a close-up, it was to capture another beautifully ambiguous performance by our actor playing Jacob, Jaeden Martell. We hoped this approach translated a sense of uncertainty for the audience.

Can you talk about the look and tone?
Mark Bomback’s scripts were so compelling. I read almost the entire eight hours in one sitting. Even though it was set in contemporary Boston, in the most familiar settings, it had a somber, elegiac quality to it — like a requiem. For the look and tone, we were inspired by Nordic paintings and the films of Bergman — a cool, wintery chiaroscuro light. To amplify a sense of isolation, we framed our characters against windows showing the world they were increasingly being separated from. We also shot our characters through layers of glass or partially obscured them from view using architecture, emphasizing their prison.

What about the lighting?
We wanted to take a naturalistic approach but with a slightly heightened reality — slightly expressionistic. So a cold, rainy day might be pushed toward cyan a bit more and the color desaturated. And since much of our storytelling would be conveyed by the performances of our brilliant actors, it was important to capture performance but also reflect that tone in their close-ups. Light might fall off to shadow more dynamically, but it was always critical to retain detail in the eyes of the actors.

Defending Jacob was the first production where I shot almost entirely with LEDs. The advancement of LED lighting has been a game-changer for me. I often use mini dimmer boards, where I can adjust the key and fill light ratio on the fly. This was more challenging when shooting with tungsten — as the light dimmed, the color temperature shifted warmer. Before LED, I wasn’t able to do the dynamic adjustments that I can now. It also means that I feel more comfortable shooting a rehearsal wherein I can adjust to the actors’ positions immediately without disrupting the set by tweaking between takes.

ARRI SkyPanels were the workhorses for our lighting, often bouncing them through book lights or lighting sections of our night exteriors. We also used Litepanels through diffusion as key or fill in tight spaces. My gaffer, Josh Dreyfus, introduced me to Quasar tubes, which became very versatile. We would use them in the standard way one would use tubes for lighting, but Josh and our key grip, Woody Bell, built substantial softboxes made of eight-foot Quasars, which we used instead of 18K HMIs through diffusion in cherry pickers. They weighed slightly less, drew less power, were aesthetically more pleasing, and were fully RGB and dimmable.

Talk about the color workflow.
When setting a look, I like to keep the variables to a minimum. I feel that limiting the LUTs helps reduce inconsistency across the workflow. Luckily, I had a fantastic team of people who translated the look we captured on set all the way through to the final color. DIT Nic Pasquariello and I established a few basic LUTs during testing and tweaked them slightly on set from scene to scene.

Jonathan Freeman

One was slightly cool, another slightly warm, but we made them all denser than the standard Rec. 709. I prefer to have darker LUTs, like rating the ASA of a film stock lower to get more exposure in a negative. This ensures that we were capturing more detail in the shadows, so when we got to the final color, we could “print down” most of the image but still extract information we wanted through power windows.

The workflow was seamless between our on-set look and the dailies, which were graded by Rob Bessette from Finish Post in Boston. Rob and Nic were in constant communication, ensuring what we were seeing on set was delivered accurately to the editorial department. They were extremely consistent, which helped us greatly when it came to doing the final color timing with Joe Finley at Chainsaw in LA, with whom I have worked on numerous projects, including Game of Thrones.

Morten has a very strong eye, so for him, having great latitude in the color grade was as important as shooting, which was another reason why a dense capture was critical. One addition to the look that Morten made in post was creating a subtle color adjustment to the cool look we established in the dailies. He added yellow to the highlights, which gave it a gritty, almost aged quality and provided a color contrast to the overall cool tone.