
Category Archives: On-Set Dailies

DP Linus Sandgren on Saltburn’s Shoot, Dailies and Color

By Iain Blair

Swedish cinematographer Linus Sandgren, ASC, has multiple award noms and wins under his belt, including an Oscar for his work on the retro-glamorous musical La La Land. His new film, Saltburn, couldn’t be more different.

Written and directed by Emerald Fennell, Saltburn is a dark, psychosexual thriller about desire, obsession and murder. It follows Oxford University student Oliver Quick (Barry Keoghan), who finds himself drawn into the world of the charming and aristocratic Felix Catton (Jacob Elordi), who invites him to Saltburn, his family’s sprawling estate, for a summer never to be forgotten.

Linus Sandgren

I spoke with Sandgren, whose credits include First Man, Babylon and American Hustle, about making the film and his workflow.

What was the appeal of doing this film?
It was two things. The script was brilliant; it was very suspenseful and exciting. I was drawn in by the buildup, how Emerald had it constructed, and I couldn’t stop reading. It was also very exciting for me because I hadn’t really done this type of film before. It was a unique story with a unique approach to this sort of psychopathic character — how you feel an affection for him, a sort of sympathy. It’s also so dark and funny.

I was also excited to talk to Emerald because of her work on Promising Young Woman, which I loved. Her directing of that film was excellent, and she was making very bold decisions. Then we had a call, and I was very impressed by her. She’s just so brilliant when she explains her vision, and you’re really drawn into her storytelling.

Tell us a bit about how you collaborated on finding the right look.
I typically don’t find the look [based on] different films. It’s more abstract than that, and a good approach is to just talk about it and see what words come up. Emerald said things like, “Desire or unachievable desire. Beauty and ugliness. Love and hate.” Suddenly you get images in your head, and one was of vampires. The family are like vampires, and Oliver is obviously a vampire who loves them so much he just wants to creep inside their skin and become them.

So there was some sort of metaphorical layer I was attracted to, and Emerald had a lot of vision already in terms of visual references — from Hitchcock movies about voyeurism to silent horror movies and Caravaggio paintings. We grounded it in some sort of gothic vampire core, but the story couldn’t just start there. We had to fool the audience a little bit and not explain that right away but have imagery that could be in that vein. The language was basically that the days could be sunny and bright and romantic, while the nights would be dangerous and dark and sexy. It was these discussions we had early on that inspired the lighting style and the compositions.

Tell us more about the composition.
Emerald wanted it to feel like the house was a dollhouse that we could peek into, and she wanted it to have a square format. It all made sense to me with that in mind, as well as the voyeuristic approach, where you focus on one singular thing more than if you go scope. It feels like you can see much more that way, so that allowed us to do things in a more painterly style. As soon as we started shooting that way, we knew we were right to use an aspect ratio of 1.33:1 because we felt that we could be more expressive.

So compositions were a little bit as if you’re watching an oil painting, a classic type of composition, and we’d block the scenes within a frame like that without really cutting, or we’d go in really tight on something. It was sort of that “play with it a little bit” thing. Also, the approach is slightly artful more than cinematic. I feel like we thought of the shot list in another way here. It would be more, “How can we tell this story in a single shot, and do we need another shot, and if so, what is that?” Probably that’s just a really tight close-up. So we had a slightly different way of blocking the scenes compared to what I’ve done before. It was about creating that language, and the more you nail it before you shoot, then it solves itself while you start working on each scene.

What camera setup did you use, and what lenses?
We shot Super 35mm film in a 1.33:1 aspect ratio, which is the silent aspect ratio. We used Panavision Panaflex Millennium XL2s. It’s basically the same as silent movies in terms of perf, and we used Panavision Primo prime lenses.

Did you work with your usual colorist Matt Wallach in prep?
Yes, the team was Matt Wallach (Company 3 LA) and dailies colorist Doychin Margoevski (Company 3 London). The dailies software was Colorfront’s On-Set Dailies. I have worked with Matt on dailies for many movies and lately in the DI. We set this up together, but he wasn’t able to come over to London to do the dailies, so he was involved remotely and was watching stills from the dailies Doychin did.

Tell us about your workflow and how it impacts your work on the shoot.
My workflow is always that the film gets developed and then scanned in 4K, in this case at Cinelab in London. So it’s a final scan from the beginning, and we don’t touch the negative again. Then it goes to Company 3 for dailies. But before the dailies are distributed, the colorist sends me stills from his dailies grading suite so I can look at the color. It’s just a few stills from the different scenes, and it takes a week or two for us to dial it in. Matt gets the footage; he uses his instinct, and we apply a Kodak print emulation LUT. Then he works with the printer lights to see where he has the footage, and he does what he feels is right, with perhaps more contrast or lower blacks.

He then tells me what he did, and we look at it on the stills he sends me. That’s when I’ll say go a little colder or darker or brighter or whatever. But usually after a few days we dial it in and get the look down. But, as I said, we spend a little more time in the beginning to make sure we have it right, and it also has to do with me knowing that we’re doing the right thing with the lighting — perhaps I’ll need to add more light for the next scene.
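The printer-lights pass Sandgren describes comes from photochemical timing, where dailies color is set with per-channel RGB "points" before any creative grade. A minimal sketch of the digital equivalent is below; the value of one point (~0.025 log exposure) and the neutral setting of 25 are common rules of thumb used here as assumptions, not Company 3's actual calibration:

```python
# Hedged sketch: printer lights as per-channel offsets in log exposure.
# Assumptions for illustration only: one printer point ~= 0.025 log
# exposure, and 25-25-25 as the neutral timing light.

POINT_LOG_E = 0.025   # assumed log-exposure shift per printer point
NEUTRAL = 25          # assumed neutral timing-light value

def apply_printer_lights(log_rgb, lights):
    """Offset a log-encoded RGB triple by printer-light points.

    log_rgb: (r, g, b) values in a log encoding (e.g. Cineon-like).
    lights:  (R, G, B) printer points; points above NEUTRAL push that
             channel up by POINT_LOG_E per point, points below pull it down.
    """
    return tuple(
        value + (points - NEUTRAL) * POINT_LOG_E
        for value, points in zip(log_rgb, lights)
    )

# One point up on all three channels shifts each channel equally.
print(apply_printer_lights((0.5, 0.5, 0.5), (26, 26, 26)))
```

Because the offsets are applied in log space, a one-point change corresponds to a fixed exposure ratio rather than a fixed code-value step, which is why the same move "feels" consistent across shots of different brightness.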

This has been our way of working since Joy in 2015, which was the first thing Matt and I worked on together with dailies. That process is really good because nowadays the iPad is like P3 color space, and it looks really good when you have it at a certain exposure, and that becomes our look. That’s why it’s so important to set that in the dailies because once we’re in the DI, I don’t want to change it. I just want to adjust things, like match the shots to each other or fix a face or do something else without changing this sort of look. The look should be there already.

That’s what I like about film too; it adds something to it. I feel like I know exactly how it’s going to look, but it looks 5% better or different with film because it gives me things that surprise me when I see it. It’s like, oh, look at the halation there, or look at those blue shadows. There’s something always going on that’s hard to actually imagine, as you don’t see it with your eyes, even if you know it’s going to be there. So that’s a nice thing. Basically, if you looked at the dailies on any of my previous films, I didn’t touch them much. That’s why I usually like having the same colorist do the dailies as the DI, but it couldn’t be helped on this one.

Dailies colorist Doychin Margoevski was great. He’s also got a great eye for darkness, and he’s not afraid of letting it be dark. So as I noted, the three of us dialed it in together initially, and then he sent stills to me and Matt, and we looked at them. That way, Matt was very familiar with the footage when we came to the DI, and he’s used to working as a timer and keeping track of the whole project. Matt also did the trailers, so all of that is under solid control.

I heard that you shot all the stately home interiors on location at just one house?
Yes, it was a 47-day shoot, all done in the one country house and in a nearby country estate for some exteriors, like the bridge scene. Otherwise, all the exteriors and interiors are at the same house. Then we shot at Oxford and near Oxford for some interiors, and then London. We built only one set, which was the bathroom. That was built inside of a room, and the two rooms next to it were Oliver’s and Felix’s bedrooms. They were completely painted and dressed and made up as their rooms, as they didn’t look that way at all when we came in. It’s the red corridor that was important going into the bathroom, and then the bathroom and then the rooms.

 

I assume the huge maze was mostly all VFX?
Yes, the whole maze is visual effects combined with the practical. When we’re down there walking around, it’s all practical, and we had these hedge walls that were moved around so we could get through. The center of the maze with that big statue in the middle was built by production designer Suzie Davies and her team. It was all VFX for the big, wide exterior overhead maze shot and the wide shot from the windows. VFX supervisor Dillan Nicholls and Union did all the effects.

What was the most difficult scene to shoot and why?
That’s a good question. I think the scenes of Oliver’s party. We had to be careful with the property, so we couldn’t drive around too many condors or cherry pickers, and we had to shoot different scenes over a few nights all over the place — from one end of the house to another end of the garden. We would be inside of the maze and outside at the discotheque or inside at the red staircase. And all of that had to be prelit to work 360, basically.

It was daunting to light, but we could eventually position lights and condors and sneak them in from other angles. So it was a little complicated. We had to plan it out, but thanks to the really good special effects department, we could fog it all up. Suzie Davies helped with fire flames so we could send practical lights in there to make it all look like a big party.

Are you happy with the way it turned out?
Yes, I’m really proud of it. It’s a special film for sure, and it was a really fun shoot… and different. It’s so refreshing to have a director that dares to do what you think is right, just the way you want to, so you don’t have to restrict yourself. I love working with Emerald. She’s very fun and, I think, brilliant.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Scoop

FutureWorks Uses Next-Gen Color Workflow on Netflix’s Scoop

Mumbai, India’s FutureWorks created a new pipeline for the Netflix India courtroom drama Scoop, taking imaging data all the way from the set to the edit suite. As well as creating a new workflow that boosts efficiency while ensuring quality visuals, FutureWorks also covered the entire picture post process and rental services on Scoop – including dailies, online, grade, VFX and finish.

Produced by Matchbox Shots, the Hindi-language series was directed by Hansal Mehta, with Pratham Mehta serving as director of photography and Michele Ricossa as lead colorist. The show follows a prominent crime journalist implicated in the murder of a rival reporter.

FutureWorks began to develop this workflow following its live color grading work on the 2022 film Jersey. Based on the challenges experienced on Jersey, and taking advantage of the lull in productions during the pandemic, the studio started to evolve its DIT process so the team could work more efficiently. This is particularly important on set. The aim was to empower the colorist to work with the DP and director while they’re still on set so that any issues could be flagged before reaching the edit suite.

“We needed a process that would support everybody,” says Rahul Purav, head of color at FutureWorks. “So, we started to think about extending the imaging process beyond just color. We focused on creating an on-set monitoring process, as well as QC.”

“With Rahul, we managed the workflow of the DIT setup prior to the shooting,” explains Ricossa. “I was on set the first two or three days of the shoot to check with the DP and the DIT team on how to work on the dailies and whether the look was working as intended. After that, I had a short session at FutureWorks to review some of the footage on the HDR setup. A few weeks later, I went on set one more time, to a different location, to check that everything was holding up look-wise. Then I reviewed the dailies on the private cloud streaming service, giving minor notes to the DP and the DIT team. Everything went smoothly.”

Ricossa graded Scoop in FilmLight’s T-Log/E-Gamut color space. Only the VFX shots were converted to ACES to facilitate the VFX workflow. “The look of the show, and the grade of individual clips later on, was shared with the VFX team thanks to Baselight’s BLG system. When possible, we had a few back and forths between the DI and the VFX to fix issues and get the best out of it,” explains Ricossa. “The most challenging part of the show was to match some of the stock footage. Baselight’s tools and color management system helped a lot to achieve the grade I had in mind.”

In FutureWorks’ workflow, everything is done remotely. During shooting, the systems record and monitor video signals wirelessly. This means that there’s no distraction or interruption for the cinematographer, but when required, they can talk to FutureWorks’ on-set DIT who has a studio-grade monitor under their control. This helps the directorial team to verify that everything they’re shooting is correct, while they’re still on-set. This includes color, but also extends to other areas like lenses, focus, and exposure levels.

Transcoding is monitored throughout the process to highlight any areas of concern. If there are issues, such as reflections or unwanted props in the shot, these can be dealt with at the time of shooting or flagged for fixing by the VFX team. Everything is captured as a movie file with embedded metadata so that all of the data from the shoot ends up with editorial. “It’s like there’s a third eye watching you and helping you while you’re shooting and editing,” says Purav. “As a colorist, I think it’s imperative that everybody in the chain is aware of what’s happening on the shoot, right from the beginning to the very end. This makes communication much more efficient, as notes from the cinematographer can be embedded into the metadata of the particular shot, which is very helpful later on in the process.”

This new process was absolutely key for the shoot on Scoop, which lasted for 100 days. A team of four people tested the system first, with FutureWorks having since streamlined the crew to three — one experienced DIT technician for on-set QC and another two for data management. All team members are very experienced and have trained for a long time so that they can integrate with each other on the shoot, ensuring that all the necessary data is captured and transcoded.

“When you take that experience on location, it’s an asset to the cinematographer, the director, and the production as a whole,” explains Purav. “Throughout the shoot on Scoop, the director and cinematographer continually came over to verify shots on the imaging cart, demonstrating that our new pipeline is already proving to be useful for the directorial team.”

The revamped pipeline — which had to meet the specifications required by Netflix productions — includes Livegrade Studio, Codex and Silverstack for transcoding, and FilmLight Daylight for rendering dailies. One of the key challenges in implementing the new workflow was understanding the protocols of each camera. If certain protocols didn’t work with the new system, the team had to find different ways to sync the data. FutureWorks also collaborated closely with manufacturers and vendors, including Sony and Codex, to troubleshoot any problems.

“While we had a few teething problems initially, we were able to work them out within a couple of days, and it was smooth sailing from that point on,” says Purav.

Since its successful debut on Scoop, FutureWorks has rolled out the new imaging process on several other projects.

 


NAB 2023: Colorfront Demos 8K/HDR Streaming Features

During NAB, Colorfront is showing its Transkoder and its streaming and dailies systems at the Conrad Hotel. The company is also presenting live streaming of 8K footage using only hotel broadband in a setup that features Colorfront Transkoder running in AWS. The solution reads, debayers and processes 8K Sony Venice 2 and Red RAW camera footage directly from AWS S3 cloud object storage and streams 8K video to Colorfront Streaming Player. The Streaming Player runs on a Mac Mini with an HDMI 2.1 video connection, displaying the results in HDR on an 8K 77-inch LG OLED TV.

Streaming
Colorfront Streaming Player is video receiver software for critical remote viewing, with support for professional video I/O devices from AJA and Blackmagic Design. Streaming Player has undergone significant improvements, including the introduction of a shared pointer feature with minimal latency for real-time collaboration and a new browser-based viewer option for optimized user experience.

Additionally, there are two streaming server products available: the compact, 1RU Colorfront Streaming Server appliance and the software-only Colorfront Streaming Server Mini for Mac/PC laptops and workstations. The Advanced Streaming Gateway and Stream Manager ensure secure, predictable streaming to multiple destinations. With streaming built into all Colorfront products, the company’s On-Set Dailies, Transkoder and QC Player applications can all stream in 8K, whether locally or from the cloud.

Colorfront streaming products have been audited for security and certified by several major Hollywood studios to securely stream prerelease content via AES 256-bit encryption and enterprise single sign-on user authentication, with the ability to kick out individual viewers. Security features also include forensic audio/video watermarking, visible spoilers, and burn-ins of individual IP addresses and session IDs.

Along with 8K/HDR capability, the latest enhancements include AWS Cloud Digital Interface (CDI) input for AWS cloud workflows; an NDI interface that allows creative digital artists to live-stream material directly from popular NLE, compositing and color grading systems without needing additional hardware; built-in Dolby Vision content-mapping with 4.0 and 5.1 metadata; and a host of new tools to manage invitations and assign users to streams.

Colorfront Streaming Player was recently deployed in The Culver Theater, the first US theater to showcase a Samsung 8K HDR Micro LED IWA wall.

“We are thrilled to now have the ability to play out 8K camera RAW files to our 8K HDR cinema screens directly from our Virtual Private Cloud in AWS,” says Jonathon Lee, head of media engineering and innovation at Amazon Studios. “This has created a new paradigm for us and helps close the loop with our global, pure-cloud production environment. We can now post our shows anywhere in the world in full fidelity.”

At NAB 2023, Colorfront is showing its Transkoder mastering, QC and deliverables solution running on a Mac Studio desktop with dual 32-inch Retina 6K Pro Display XDR monitors and featuring Colorfront’s new second-head Analyzer. In this powerful, cost-effective configuration, Colorfront Transkoder optimizes DCP/IMF mastering workflows, accelerates HEVC H.264/H.265 and ProRes read/write, and delivers rapid handling of all new RAW camera formats.

The newest version of Transkoder also supports Dolby Vision 5.1, with a focus on metadata creation and trims for cinema targets as well as Dolby Vision validation. Additional capabilities include Dolby L1 check, side-by-side split view for SDR and HDR, and additional markers for 48-nit checks.

Colorfront is highlighting a host of additional productivity and user-experience enhancements to Transkoder, such as support for the latest DCP/IMF packages, high-throughput JPEG 2000 (HTJ2K), the OpenTimelineIO interchange format, an API for editorial cut information, new playhead controls, improved image analysis and HDR reporting, and auto-detection of blanking issues.

On the audio side, Transkoder supports Dolby Atmos and comes with new dialogue-gated loudness measurement and PDF reporting tools, greater abilities to manipulate individual channels in multi-channel audio tracks, as well as timeline editing and enhancements.

Transkoder integrates automatic speech detection, enabling AI to decipher spoken content and extract the corresponding text. This allows users to overlay text as subtitles or export it for other purposes, ultimately streamlining the workflow. For cloud-based workflows, Transkoder provides ultra-low-latency uncompressed video output via AWS CDI as well as NDI.

Colorfront On-Set Dailies and Express Dailies
Colorfront’s On-Set Dailies and Express Dailies have both been updated to support the latest digital camera formats, including ARRI Alexa 35, Red V-Raptor 8K VV, Sony Venice 2 8.6K and Blackmagic Design 12K RAW. Both also support ACES 1.3, the latest version of the Academy Color Encoding System, including ACES Metadata Files (AMF).

Support for third-party OpenFX has been extended to embrace Sapphire VFX plugins, which, along with Invizigrain and LiveGrain plugins, enable users to add textural qualities to content. There are also new tools for handling/retiming high-frame-rate clips, improved rendering and enhanced custom burn-in.


Autodesk Acquires Moxion’s Cloud Platform for Dailies

Autodesk has acquired Moxion, the New Zealand-based developer of a cloud-based platform for digital dailies. The solution has been used on such productions as The Midnight Sky, The Marvelous Mrs. Maisel and The Matrix Resurrections. According to Autodesk, the acquisition of Moxion’s talent and technology will expand Autodesk’s own cloud platform for media and entertainment, “moving it beyond post into production, bringing new users to Autodesk while helping better integrate processes across the entire content production chain.”

Moxion’s platform enables professionals to collaborate and review camera footage on-set and remotely with the immediacy required to make creative decisions during principal photography in 4K HDR quality and with studio-grade security. Moxion ensures data security with features like MPAA compliance, multi-factor authentication, visible and invisible forensic watermarking and full digital rights management.

Founded in 2015, Moxion has received an Engineering Excellence Award from the Hollywood Professional Association (HPA), a Workflow Systems Medal from the Society of Motion Picture and Television Engineers (SMPTE) and a Lumiere Award from the Advanced Imaging Society.

“As the content demand continues to boom with pressure on creators to do more for less, this acquisition helps us facilitate broader collaboration and communication and drive greater efficiencies in the production process, saving time and money,” says Diana Colella, SVP Media and Entertainment, Autodesk. “Moxion accelerates our vision for production in the cloud, building on our recent acquisition of Tangent Labs.”

Aaron Morton, a cinematographer who has worked on projects including Orphan Black, Black Mirror, American Gods and Amazon’s new The Lord of the Rings series, used Moxion for several projects. “It’s never fun when decisions are being formed about your work if the dailies aren’t the way you wanted them to look,” explains Morton, NZCS. “With Moxion, it’s what I see on the set, and the decisions I make with the dailies colorist always play out so that production people and producers are seeing what I want them to see. The images are very true to what we see while we’re shooting.”

 


Light Iron

Light Iron Ups Leadership Team: New Hires, Promotions

Light Iron, a Panavision company that provides post creative services, has bolstered its executive leadership team with recent promotions and hires.

In an evolution of her role with the company’s leadership team, Light Iron cofounder Katie Fellion has been promoted to SVP, business development and post production strategy. She has been with the company since it began operations in 2009 and was instrumental in establishing Light Iron’s Outpost mobile dailies systems. In her time with Light Iron, Fellion has also produced several firsts in file-based finishing, including the first 6K DI, the first studio feature cut on Final Cut Pro X and Amazon’s first HDR series. In her new role, she is responsible for global sales and business development, new market strategies, and strategic alignment with Panavision so clients can maximize the value proposition of the companies’ shared production-to-post offerings.

In addition, Megan Marquis has been promoted to VP of operations for Light Iron Los Angeles. She joined Light Iron in 2013 as a senior producer at the company’s New York location where she handled budgeting and oversaw projects from dailies through final delivery. After Light Iron’s acquisition by Panavision, she began to integrate resources between the two companies, assisting cinematographers in camera tests and joining conversations about production choices to smoothly translate decisions to the post side. With her latest promotion, Marquis is responsible for managing the overall operations of Light Iron’s two facilities in Los Angeles.

With more than 20 years of experience developing file-based workflows, best practices and future roadmaps for digital motion imaging, Eric Camp joins the company as director of operations for Light Iron New York. Camp brings wide-ranging expertise to the position and firsthand understanding of the client’s point of view, having worked in both production and post in a variety of creative and operational roles, including crew positions on 15 features and over 250 television episodes. With Light Iron, Camp oversees all of the New York facility’s operations, including the location’s on-site and remote offline editorial rental offerings.

Ken Lebre joins Light Iron as director of dailies, bringing experience in streamlining workflows with creative and technical talent. Through his background with boutique shops and larger post facilities — where he’s held positions including director of operations, senior director of client services and GM — he’s become an expert in dailies workflows while simultaneously interfacing with studios and production companies. At Light Iron, Lebre oversees the company’s global dailies operations, including at its regional facilities and its primary hubs in Los Angeles and New York, and for all near-set and remote deployments domestically and internationally.

In addition to Light Iron’s New York and Los Angeles facilities, each of which offers the full breadth of creative finishing services, the company also has locations in Atlanta, Albuquerque, Chicago, New Orleans, Toronto and Vancouver, offering dailies services and remote sessions. Light Iron’s unparalleled remote capabilities — including solutions for dailies, offline editorial rentals, and color and finishing — open the doors for filmmakers working anywhere in the world to partner with the company.

 

 


DITs and On-Set Storage

By Beth Marchant

In the world of on-set data management, having a talented DIT on set is an essential part of any successful production. The DIT’s roaming cart or more permanent setup in video village is a critical point in the pipeline, where dailies are screened and readied for the edit and additional post.

It can take an arsenal of solid state and spinning hard drives, computer hardware and monitors, cables and ports to grab all the camera and other data being generated on today’s sets, then swiftly copy, transcode, and share it with multiple post and archival departments. Depending on the camera and post workflow, how a DIT does all of this can vary widely from project to project.

On-Set Storage

B Kelley

Here, DITs B Kelley and Carmen Del Toro share their top tips for data-minding on set.

B Kelley
B Kelley — a data-based DIT, underwater camera operator, Phantom technician and former child actor who directs their own short films — is used to the pressure. “No matter what side they are coming from on set, they see the DIT as the bottleneck,” says Kelley, who often works in commercial production. “You have all this prep and production going into the DIT, who then sends it all out to post.” You learn quickly, says Kelley, to be flexible and change up your questions depending on the crew member or department. “If you’re talking to the DP, you can say, ‘OK, here are our storage needs and here are our speed needs, so that I can keep up with you and you aren’t slowed down by me.’”

Convincing production to invest in decent hard drives that will keep the bottleneck from happening and live up to expectations, however, is always a challenge. Most production companies don’t spend much time or money thinking about optimal storage workflows. “When I show up on a job and I’m told, ‘Oh, the production company has its own hard drives,’ this usually gets an eye roll. What that often means is they have awful hard drives that they spent next to nothing on.”


B Kelley’s DIT setup

But Kelley has also been pleasantly surprised, like the time they arrived on set to find a fleet of OWC ThunderBlade RAIDs ready to put into action. “They were very, very fast. It just makes sense to have a lot of very fast, reliable hard drives that you can send out with the crew on productions. I always recommend these drives to the commercial production companies I work with. They will pay for themselves after three jobs.”

Kelley says OWC drives, like those 16TB ThunderBlades, are a rare solid option in what was once a more crowded market. “We used to have a lot more to choose from, but it seems that, in general, the industry has eliminated the middle-ground spinning-disk hard drives. G-RAID, which used to be the standard, was bought by Western Digital, which changed their usability.”

For smaller storage needs that still need speed, Kelley’s go-to is the SanDisk Extreme Pro portable bus-powered SSD. “They are a reliable and affordable option for shuttle and transcoding drives,” Kelley says.


In addition to Red cameras and the Sony Venice, “all flavors” of the ARRI Alexa reign supreme on Kelley’s commercial shoots, especially the Mini and the large-format Mini LF. Kelley prefers the intraframe compression of ProRes to prepare files quickly and painlessly for the edit. “On set, the Mac is king, and as a DIT, I always try to push people toward ProRes,” Kelley says. “There are some cases, however, when you really need that extra latitude, and Raw is more relevant. But most of the time, shooting Raw doesn’t really make sense from a time and money standpoint, given what you’re actually shooting.” On Sony Venice shoots, Kelley says X-OCN, Sony’s version of Raw, is the default codec.

Although Kelley says some of the best DITs working today are those doing on-set color grading on feature films, a data-centric approach is well-suited to commercial and live-event production shoots. “At the higher level, DITs are most sought-after for their color abilities,” Kelley says. “I take more of a storage workflow design approach, so I am very much geared toward the total workflow. I would much rather deal with a seven-camera shoot than live-grade a three-camera shoot.”

B Kelley’s cart

For the Billboard Women in Music Awards, Kelley managed input from 27 cameras at once. “That was an insane, wild night. There were performances, and there were awards given. I ended up going back to the Billboard offices and co-opting 10 of their computers.”

For three years running, Kelley has also worked on the Billboard Hot 100 Music Festival at Jones Beach in New York. “We used somewhere around 250 cards in a three-day span,” Kelley says. “In the last year I did the festival, I had four computers running off of a server to try and keep up. That was an intense experience and such a great workflow. I had six editors sitting in the trailer with me taking what I was downloading live and cutting it together. It was a challenge but also a complete thrill. I loved it.”

Kelley says that when downloading footage, every DIT should run a common software calculation called a checksum that compares material on your original media to the copies for other departments. “Obviously, you scrub through the footage and make sure everything’s good, but you always should be using some sort of software that uses this checksum so that not only are you protected, but you can point to the checksum in the report and say, bit-by-bit and software-verified, that everything was there,” Kelley says.

“It’s worth taking the extra time to get the checksum rather than trying to deliver a faster file. This is why I find it crazy when they push off the storage and downloads onto a less experienced person,” Kelley says, one who might mistakenly think speed is the only objective when prepping and delivering files.
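
In practice, the checksum pass Kelley describes comes down to hashing both copies and comparing digests. Here is a minimal sketch using Python's standard library; real DIT tools typically use faster algorithms such as xxHash, so MD5 here is purely illustrative:

```python
import hashlib


def file_checksum(path, algo="md5", chunk_size=8 * 1024 * 1024):
    """Hash a file in chunks so multi-gigabyte camera files never sit in RAM."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def verify_copy(source, copy):
    """Bit-for-bit verification: the copy is good only if the digests match."""
    return file_checksum(source) == file_checksum(copy)
```

A matching digest is what lets the DIT state in the report, as Kelley puts it, that the material is "bit-by-bit and software-verified."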

Although the pandemic pushed clients to embrace more remote streaming in all aspects of production, Kelley says clients’ actual storage workflows haven’t changed much in the past two years. “What has changed, because of more remote streaming, is the number of people on set,” the DIT says. “Before, too many people on set would really slow us down. But having them on a stream connected to one person, who’s the voice of the client, is a much more seamless workflow. I really do hope this has changed for good.”

Other projects that Kelley has worked on include L’Oreal/Younger, Patrón Tequila and Red Carpet Makeup.

On-Set Storage

Carmen Del Toro

For Carmen Del Toro, aka “Data Lady,” creating optimal conditions for every piece of data-minding hardware on set should be the primary goal. “It is so important that the negative is in a stable environment,” she says. “I like to use anything that’s nonvolatile, like NVMe storage and solid-state drives, when the production has the budget. The more time you spend downloading data, the more all the components heat up, and there is a larger risk of failure. But if you have fast storage, transfers happen quicker, and the devices — computer, media, drives, readers, hubs, etc. — are not in extreme-heat situations for a prolonged time.”

Her friend Dane Brehm, a DIT at Cintegral, first introduced her to NVMe storage. “Dane is a mad, happy genius about everything storage and has this great motto: ‘Fast drives save lives.’ I think this is so true,” she says. “Just yesterday, I sent a list to the Apple TV+ show I’m starting next week in LA, a sci-fi/fantasy shot on greenscreen for preteens. They asked me, ‘How much time do you need after wrap?’” That’s when she knows she needs to push for buying faster cards and drives up front. “If it’s up to them, they would buy [cards with] 165Mb transfer speeds with a regular USB 3.0. So when we show up, we work for 12 or 14 hours, and then a DIT like me or the loader gets to stay to offload that last card that they’re shooting.” Slow, unreliable cards and drives make everything drag on. “On set they’re saying things like, ‘OK, one more, just keep rolling. Let’s let it go. Let’s reset. Reset. Reset.’ What could have been a 250-gig offload becomes a 700-gig offload.”
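
The time cost Del Toro describes is simple arithmetic: copy size divided by sustained transfer speed. A rough sketch, treating the figures quoted in this piece as sustained MB/s for illustration (an assumption; real-world throughput varies with file sizes and connections):

```python
def offload_minutes(size_gb, speed_mb_per_s):
    """Wall-clock estimate for one copy pass: GB to move over sustained MB/s."""
    return size_gb * 1000 / speed_mb_per_s / 60


# The 700-gig night described above, on a slow drive vs. a fast NVMe SSD
slow = offload_minutes(700, 165)  # roughly 71 minutes
fast = offload_minutes(700, 900)  # roughly 13 minutes
```

An hour's difference per copy pass, multiplied across the crew members waiting at wrap, is exactly the overtime money she argues could go into faster drives instead.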

Starz’s Shining Vale

At that point, adds Del Toro, it’s not just the DIT or the loader left standing around. “It’s the teamster. It’s the set medic, the PAs, the second AD. Everybody who’s waiting for that data sheet at the end of the night. By the time you get home, you’ve been out for 17 hours. I have the varicose veins to prove it! I push for productions to get out of that space because they think they’re saving money, but they could easily put that overtime money into faster hard drives. If they say, ‘I thought you guys liked overtime,’ I always say, ‘Well, I like life better!’”

When she can’t “do what I need to do in a half hour or less” after wrap, Del Toro gets creative. “I try to do crazy things, like individual camera offloads. It costs a little bit more money because I get two readers, and I am supposed to be reading two different cards to the same drive. I have an A camera drive and a B camera drive, and I send to two drives on my breaks.”

Carmen Del Toro on set of I Know What You Did Last Summer

Should the production company ask her for advice on which drives to buy, “I try to push OWC because I think their products are reliable,” she says. “They have a good warranty. If they fail, OWC is pretty good at replacing them quickly. If they say they are too expensive and ask for another option, I suggest Glyph drives, which are more affordable. This is the thing about hard drives. No two are alike, and there could be a snafu somewhere. A cable could be soldered wrong, for example, or it suddenly stops connecting.”

Del Toro has been using Glyph drives, which transfer at about 900MB/s, on a number of recent episodic shows shot with the Sony Venice. “The camera’s readers only go up to like 600MB/s or something,” she says. “The drive will still be faster, or it will pair up. There’s no lag or loss.” Del Toro’s cart workstation is built around a modified Apple Mac Pro “trash can” with a Xeon CPU. She recently added a 55-inch OLED monitor that she hopes her grip colleagues can help securely mount to her cart.

After running cables through extreme conditions on location for Netflix’s Narcos for three years, Del Toro says she now prefers to work on a sound stage. “I loved working amid the chaos of shows like that,” she says, “but now I find working in the studio so much better. I’ve run 200-foot lines through the Atacama Desert, buried them in the sand, then had a timer on to go and unbury them with ice because I didn’t want the cords to melt. I’ve worked in caves where, when I opened my rack, everything was calcified. Also in countries where there’s no RF control, and the video image suddenly disappears. I had to be the person saying, ‘We’ve got to do it like we did in 1985. Only the guy looking through the camera can see. So don’t fuck it up!’”

I Know What You Did Last Summer

Del Toro, like Kelley, has several parallel talents beyond her current role as a union IATSE Local 600 DIT. An art history major turned assistant editor, she spent her early career in post before transitioning to production in 2010. “When the HD camera came out, somebody in the post house where I worked needed to figure out how to convert HD to SD and use AJA Kona cards,” she says. “I read a lot of manuals, and then I got introduced to the Avid Symphony Nitris. There was a company that did what they call dead-end color correction for reality TV shows. They would color on the Nitris and then print out to tape.”

She was hooked, and now she does live color grading on nearly every project. She’s also a self-taught and FAA-certified drone pilot. “That’s a dream of mine, to just do color work — and fly my drone all the time,” she says.

Recent projects for Del Toro include I Know What You Did Last Summer, Them and Shining Vale.


Beth Marchant, a former staff and contributing editor of StudioDaily.com, writes about entertainment technology and craft for The Los Angeles Times and IndieWire. Follow her on Twitter @bethmarchant.


In the Heights DP Alice Brooks Talks Dailies, LUTs and Look

By Iain Blair

Cinematographer Alice Brooks is having a monster year. First came her inspired work on In the Heights, the film adaptation of the Lin-Manuel Miranda musical directed by Jon M. Chu (Crazy Rich Asians). Next up is Tick, Tick…Boom! for Netflix and Imagine Entertainment, which comes out later this year. Also a musical, it’s helmed by Miranda, in his feature directorial debut, and produced by Ron Howard.

Alice Brooks on set with director Jon Chu

Brooks (Queen Bees, Home Before Dark and The LXD: The Legion of Extraordinary Dancers) has worked with Chu since their days as film students at the University of Southern California nearly 20 years ago. Brooks describes filming In the Heights as the highlight of their long collaboration and her individual career. “Shooting it was my sueñito. It was the most magical summer of my career as a cinematographer.”

I talked to Brooks about the challenges of shooting In the Heights, the cinematography, lighting and working with the DIT, DI and visual effects.

Turning a stage musical into a film can be challenging. How did you approach the look of the movie?
There were so many challenges, the main one being how you connect emotionally to the characters in a very real way. Jon stressed that it had to be about real people in a real neighborhood with real dreams, and that’s how we began developing our visual language. So sometimes it would be very intimate, and then we’d have these grand musical numbers.

Alice Brooks

How long was the prep and shoot?
I had a 10-week prep, and then we shot for 49 days — 10 on stage and the rest on location.

How did you make all your camera and lens choices?
I did several rounds of tests. I went to ARRI and Panavision and tested different systems. I got to look at an early ARRI Mini LF, but it wasn’t quite ready for us to use, so we ended up shooting on the Panavision DXL2.

We wanted to shoot anamorphic, so we tested lots of different lenses, and it’s really an intuitive thing for me. I spent a good month testing different systems, going back and forth and projecting stuff, being immersed in the story and dancing and so on, until I decided on the right package.

Did you work with a colorist in prep on any LUTs?
Yes, we did all our dailies and then the DI through Company 3 in New York, and I worked very closely with the great dailies colorist Dustin Wadsworth in conjunction with Stephen Nakamura, who did the final DI. So I’d take all the tests into Dustin and we spent time playing with different parts of the frame and how the lenses rendered the color naturally with the camera and in-camera LUT. Then we’d start to tweak the LUT. Ultimately, we used a slightly tweaked version of the Light Iron film LUT, which comes with the camera.

Did you have a DIT on set?
Yes, Bjorn Jackson, and I loved working with him. We did CDLs for different scenes, but nothing too aggressive. I loved to watch the light during prep. We began prepping the movie in winter and when we got to spring, New York suddenly felt warm, and I could start to feel the quality and color of the light that ended up being what our summer would look like in Washington Heights.

It’s a very specific look and light, I think because the bricks are more silver-gray there, and it’s very small and sandwiched between two bodies of water, and you get a much cooler light and cooler shadows than the rest of Manhattan. So while we were doing our CDLs and then the DI, I kept stressing, “This is not an orange tropical summer look. You still feel the heat, but it doesn’t come from a warm color.”
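
The CDLs Brooks mentions are deliberately lightweight: an ASC CDL is just ten numbers per shot (slope, offset and power for each RGB channel, plus one saturation value), and the transform they define is standardized. A minimal sketch of that math, assuming normalized 0-to-1 RGB values:

```python
def apply_cdl(rgb, slope, offset, power, saturation=1.0):
    """ASC CDL transform: per-channel slope/offset/power, then saturation."""
    graded = []
    for v, s, o, p in zip(rgb, slope, offset, power):
        v = max(v * s + o, 0.0)  # clamp negatives before the power stage
        graded.append(v ** p)
    # The saturation step uses Rec.709 luma weights, per the ASC CDL definition
    luma = 0.2126 * graded[0] + 0.7152 * graded[1] + 0.0722 * graded[2]
    return [luma + saturation * (v - luma) for v in graded]
```

Because the whole look is ten numbers, a "nothing too aggressive" grade travels easily from set to dailies to the DI without any proprietary baggage.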

How did you do the big opening sequences technically? Doesn’t New York have a strict no-drone policy?
You’re right, and that really restricted what we could do. We had some helicopter shots in the opening, but not being able to fly drones posed a lot of problems, especially for the Busby Berkeley-style stuff, as shooting with a drone would have been so simple.

It was the same with the big pool scene, which was so challenging without a drone. There were all these tunnels there under the pool deck, which really restricted the weight of the crane we could use, and we also had to deal with all the underwater stuff.

How did you deal with the subway scenes?
That was so intense! Jon wanted to create this elegant immigrant journey — shot like a painting and a ballet — and he wanted this to be the only thing in the whole movie that could possibly be theatrical. So we scouted all these theaters and event spaces, but nothing felt right until we had the idea of making a character’s (Vanessa’s) journey using subway cars.

And then finding a platform was very difficult until our location manager and the MTA came up with an abandoned platform in Brooklyn on the D Avenue line. No one had been allowed to shoot there before. It was three stories underground with no elevator, and it was very dark and very hot, and we also had to bring all the equipment down and do a huge amount of cabling and rigging to light the cars and platforms. So it gave us this huge space, but it was like a week of lighting, and we only had one day to shoot it all.

Then we did the graffiti tunnel bit at 191st Street in Washington Heights on the hottest day — 110 degrees — of the whole schedule. It was also incredibly humid, and we had just a few hours at night to light and shoot it all. Amazingly, the walls were sweating with moisture, which gave us all these beautiful reflections down the tunnels. It was this happy accident that made it visually even better than I could have imagined.

How involved were you in all the VFX?
I’m very lucky since Jon, picture editor Myron Kerstein and I are a great team. And since I was in New York, I got to see all the VFX reviews; Company 3 would show me how things were progressing. VFX supervisor Mark Russell was great to work with and very collaborative, so any notes I had were listened to, and all my ideas were incorporated. The place it shows the most is the scene where the sun goes down. The sun was setting in the initial CG environment — we’d shot it for real, but when you’re looking east, it was full-CG, not plates — and I said, “When the sun sets in New York, the light never comes from just one direction because of all the reflective surfaces, so it should be bouncing off windows and cars and so on.” So we added in all those elements, and the scene came to life.

Tell us about the DI at Company 3 with colorist Stephen Nakamura. What was entailed?
When I began it in February 2020, I was in the middle of prep on Tick, Tick…Boom! in New York, so we’d meet either at night or on weekends, and then he’d work on my notes. We planned to take a break, as VFX still had a lot of shots to turn in, but then COVID hit, and we got shut down. That ended up being a good thing for the film, as I just felt something wasn’t quite clicking, and it gave me six months to really think about the DI.

So I didn’t go back till August, and we did two more weeks on the DI, and that ended up being remote. Stephen was at Company 3 in LA, but Jon couldn’t even get in there because of COVID. I was in New York on a high-speed live feed with Stephen, so I could see a [Resolve] Power Window in real time, and that’s how we did it.

When we restarted the DI, I told Stephen I didn’t want to start at the beginning of the film, but with the big Carnaval sequence, which is the heart and soul of the whole film. I felt if we could get that color right, then the rest would fall into place. But it was hard to do, as we shot a lot of day exteriors. Carnaval was shot over the course of 14 hours, and it has to feel like 8 minutes. But when we finally got it, and all the skin tones and coolness of the shadows felt right, and we’d pulled out saturation, then we went back to the opening number. We used Carnaval as a reference and kept checking it for tone, and then it all worked great.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Podcast 12.4

DIT Chat: Tyler Isaacson on Workflow, Best Practices

Over the years, Tyler Isaacson worked his way up the production ladder, first from PA to camera PA. Then, to broaden his production skills, he was trained by DIT Sam Kretchmar, one of the first DITs in Local 600 when the classification was created. He’s now been working as a DIT for seven years, with a focus on quick-turnaround commercials for brands such as Tide, Snickers, Ford, Progressive, Nintendo and Starbucks.

Ford Mustang Mach-E

In a recent chat, Isaacson talked about the intensity and demands of a daily shoot.

What do you consider to be your most significant issue when preparing for a shoot?
Setting up the right workflow with hardware and software tools that will get the best results, quickly. Once I know the location and how the shoot is set up, I organize my cart for maximum efficiency. I use modular components that I can easily combine in different arrangements. I build out my cart with everything on a Yaeger Junior cart for stage jobs, or I split it up and pare down to a vertical Magliner cart for tight locations.

In both situations, I use two core components — a video distribution/live grading kit and my transcoding workstation. I then add monitors (typically 17-inch and 25-inch Sony OLEDs), scopes, battery back-up, wireless receivers and other components the shoot requires.

What software are you using?
Right now, I’m primarily using Assimilate’s DIT Pack that combines Live Looks for on-set live grading and Scratch Dailies for transcoding. The seamless software integration streamlines my workflow for a huge boost in time savings. Because the software runs on both Windows and macOS, it gives me flexibility in my work and in building my DIT cart for different projects. Working this way is miles beyond just applying a CDL as a starting point. Not only do I have more control over the live image with curves, especially hue-hue and hue-sat curves, but being able to apply those exact same curves in Scratch Dailies and then being able to edit them is another time savings.

I also use Hedge for archiving media with checksums. Lattice is a handy app for converting LUTs between applications and viewing the LUT curves, which can be useful when evaluating an imported LUT.

How have you built your dailies cart?
I’m working on a custom Windows 10 PC that I have built into a Pelican 1510 rolling case. With a 14-core Xeon CPU, Nvidia RTX 2080 Ti GPU and 8TB of SSD RAID, the speed for rendering dailies is incredibly fast. For my external RAIDs, I like SanDisk USB-C SSDs. They offer great value for the performance. I also use external Glyph 4TB USB-C RAID drives for high-performance shuttles.

What do you enjoy most about working as a DIT?
My job is the most rewarding when I’m able to collaborate with the DPs to achieve their vision. Having the powerful color tools and curves editor available live not only helps me and the DP set looks faster, but it gives us more creative latitude too. A DP may not get a chance to sit in on the final grade, so achieving a look in the dailies is often the only chance the DPs get to review their work. 

How do you like to work with the DP and other departments?
Since I’m working directly for the DP, this is where most of the collaboration happens. Sometimes the gaffer and I will sort out a flickering light without bothering the DP, but for creative decisions, it’s important to follow the chain of command.

The relationship with post and production is important too. In commercials, we are usually making transcoded dailies as we work, so coordinating with post before a shoot ensures we generate the correct files for them. A big part of the DIT’s job is making sure all the deliverables for a project — source camera files, sound, transcoded dailies, LUT files, reference stills — are well-organized and quickly completed.

How are you handling the live grading?
While I have used a Mac for live grading, I also wanted reliable software that could run on other operating systems. Since I was familiar with the Scratch Dailies UI, Live Looks fit into my workflow. I can quickly pull up old grades and match grades to dailies as I work. Having a look “memory,” in addition to previously saved grades, makes it easy for me to bounce between grades while working on a particular shot. Also, exporting looks into Scratch is saving me time on my dailies grade.

My hardware is built around Live Looks and the FSI BoxIO, which captures the live camera image along with all the embedded metadata. I built all the gear into a small-form-factor 8020 rack for portability. It contains a 16×16 AJA Kumo router and two FSI BoxIO units to live-grade up to four cameras.

What specific aspects do you like about your tools?
Bottom line, it’s the speed, reliability and flexibility. Being able to manage resolutions, frame rates and color spaces individually by timeline (or reel) is hugely helpful, especially when dealing with multiple cameras and formats. Instead of just setting scaling “entire image to fit,” I can actually see and adjust how Scratch is managing the scaling on a per-timeline basis. I’m able to easily handle footage for the same project from a wide range of cameras, even phones, as well as different formats and resolutions on the same camera. It’s also easy to generate multiple export formats at different resolutions from the same material.

What are some of your best practices that you can share?
Because the DIT is solely responsible for all of the footage from a shoot, I think one of the best practices is to approach a job with a calm and organized mindset. If I allow myself to get stressed out or overwhelmed on a shoot, that’s when I’m most likely to make a mistake.

When there’s a hiccup on set — corrupt media, accidental reformat, camera issues, etc. — I always take a step back, assess and move forward with a level head.

I also like to use manual systems for rechecking my work. I make manual media reports, not because there aren’t great software tools that can automate this for me, but because it forces me to recheck card transfers one by one.

I also like to line up all of the original camera clips and transcoded dailies from a day on overlapping timelines to ensure they are frame-accurate. Ideally, I will compare every transcode to the source clip before I reshoot a card. Whenever I do catch a mistake while doing one of these manual reviews, it reinforces my confidence in the system overall.
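
Isaacson's frame-accuracy check boils down to comparing source clips against their transcodes clip by clip. A hedged sketch of that comparison, assuming per-clip frame counts have already been pulled from a tool such as ffprobe or the camera metadata:

```python
def mismatches(source_frames, dailies_frames):
    """Compare per-clip frame counts from two dicts {clip_name: frame_count};
    return clips that are missing from the dailies or differ in length."""
    bad = []
    for clip, count in source_frames.items():
        if dailies_frames.get(clip) != count:
            bad.append(clip)
    return sorted(bad)
```

Any clip this flags is a transcode that dropped or duplicated frames and needs to be rerendered before the card is cleared.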

What tips do you have for someone starting out?
Find mentors if at all possible. I was fortunate in my career to have people who were willing to take the time to teach me when I was just starting out. Not only Sam for the formal DIT training, but also countless assistants who answered my questions and showed me how a set is run.

Being tech-savvy is a must. I had a lot of experience with editing software and building computers, which helped me pick up DIT-specific tools faster. It’s important to understand how cameras work and how the files are encoded, as well as color theory and how people perceive images. I had formal photography training that was very helpful for this. There are a lot of resources online, but there’s also a lot of misunderstanding that has spawned misinformation, so it’s important to read lots of sources. Read about logarithmic encoding, color spaces, bit depth, Bayer patterns, 4:4:4, 4:2:2 and 4:2:0 chroma subsampling, latitude, dynamic range, display gamma … and keep going from there.

As for working on set and how digital cinema cameras work, nothing beats hands-on experience. Get on set any way you can, or get a job at a rental house, and be respectful of the work. You don’t want to get in anyone’s way, so wait for the right times to ask lots of questions and watch the ACs carefully. Being curious, helpful and kind will go very far.

Assimilate’s DIT Pack for Camera-to-Post Workflows

Assimilate has introduced its DIT Pack, a new product bundle that includes its advanced Scratch dailies software and Live Looks for live-grading single and multicam setups. The DIT Pack is designed for modern production workflows that require extensive previz on set to increase creative control and to streamline post workflows after the shoot.

The DIT Pack allows a seamless workflow that combines advanced live grading with dailies transcoding. Thanks to live streaming, it pushes everything out to remote and studio clients while capturing all camera metadata that will be used in VFX/post pipelines along the way. It’s available immediately on macOS and Windows.

The workflow starts with Live Looks for live-grading content from any number of cameras in real time. It’s also possible to add advanced effects like greenscreen background replacement and texture. When saving a grade, all the metadata — either input by the user or delivered via the live SDI signal from all cameras — will also be saved in the form of a readable text doc and an XML that is suited for further pipeline scripting in VFX/post. All grades and metadata are stored in an easy-to-approach folder structure that can simply be delivered to VFX/post.

In Scratch, all camera material is loaded, and the looks and metadata are matched and merged from the Live Looks folder structure, along with automated syncing of audio. All these tasks are automated and require almost no user interaction. Scratch will output in many formats as required by VFX/post:

– Offline DNX or Apple ProRes material, including all metadata for offline editing
– High-quality EXR VFX plates, including frame-based lens information and camera metadata
– Lightweight H.264/H.265 rushes for online review

The flexibility of Scratch allows the user to either consolidate all look and metadata files into a folder per day or copy the look and metadata next to each source media file, thereby easily linking in any other DI software. Producers and post supervisors receive an extensive clip report listing all relevant production information, and assistant editors get an ALE that contains all clip metadata, including dynamic CDL information to use in the target NLE.
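
An ALE is a plain tab-delimited text file with Heading, Column and Data sections, and the dynamic CDL values ride along in ASC_SOP and ASC_SAT columns. A minimal, hypothetical writer to illustrate the shape of the file (the field names and values here are illustrative, not Scratch's actual output):

```python
def write_ale(path, clips, fps="23.976"):
    """Bare-bones Avid Log Exchange writer. Each clip is a dict keyed by the
    column names below; ASC_SOP packs slope/offset/power as three triples."""
    columns = ["Name", "Start", "End", "ASC_SOP", "ASC_SAT"]
    lines = ["Heading", "FIELD_DELIM\tTABS", "FPS\t" + fps, "",
             "Column", "\t".join(columns), "",
             "Data"]
    for clip in clips:
        lines.append("\t".join(str(clip[c]) for c in columns))
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
```

A typical ASC_SOP value looks like "(1.02 0.99 1.00)(0.00 0.00 0.00)(1.00 1.00 1.00)", which is why the per-shot look can be re-created in the target NLE with no extra files.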

At the live-grading stage, Live Looks provides a local web interface for clients on or near the set to browse grades, before-and-after snapshots and metadata via Wi-Fi during the shoot. At the same time, Live Looks allows streaming of all camera feeds through either an RTMP stream or via NDI directly into Zoom, Skype, SetStream.io or any other NDI-compatible software. At the dailies stage, Scratch separately outputs through RTMP and NDI for in-depth remote QC of all footage. After transcoding, Scratch allows for automatically publishing footage to the Assimilate Dailies Online web portal or to the COPRA dailies platform via direct script integration.

Assimilate’s DIT Pack is available at $1,399 for a permanent license, $799 annually or $99 monthly.


Assimilate Releases v9.3 of Live Looks and Live Assist

Assimilate has updated its Live Assist and Live Looks on-set software to v9.3, making on-set workflows more streamlined and efficient and offering better price/performance value for video assist and DIT live-grading functions. Assimilate has also added support for AJA and Blackmagic routers and has integrated NDI output, among other highly anticipated features.

Router control

Live Assist 9.3 streamlines the video-assist tasks in its all-in-one software solution with an extensive list of features. Live Assist includes all the key functions of record, playback, color grading, compositing, NDI output and router support for AJA and Blackmagic. There are no software limitations and no resolution limits. Video Assist operators can hook up as many cameras as their hardware can handle, at any resolution. Annual and monthly subscription models are also available for this one-stop video-assist solution, running on both Windows and macOS.

Live Assist 9.3 features are:

  • Camera-triggered multicam recording and instant replay
  • Any number of cameras
  • Any resolution
  • Availability on Windows and macOS
  • Color grading, including video scopes
  • Compositing
  • NDI output
  • RTMP live streaming to YouTube, SetStream.io and more
  • Integrated web server for Wi-Fi client review on or near set
  • Support for AJA and Blackmagic video routers
  • Support for Flanders, TVLogic and Teradek LUT boxes

Live Looks is a subset of Live Assist, with the focus on live grading of multiple camera signals in real time. Its tight integration with Assimilate’s Scratch software for dailies allows users to quickly apply looks, metadata and notes created in Live Looks to the camera footage and forward it all to post. This can be either directly embedded into the dailies or as a PDF or ALE report. With the recent 9.3 release, Assimilate has added AJA and Blackmagic router support as well as NDI output.

Live Looks 9.3 features are:

  • Availability on Windows and macOS
  • Support for AJA and Blackmagic video routers
  • Support for Flanders, TV Logic and Teradek LUT boxes
  • SDI metadata capture
  • Any number of cameras
  • Any resolution
  • CDL- and LUT-based color grading
  • Greenscreen compositing
  • Skin enhancement toolset
  • Texture effects (Grain, Diffusion, Glow, Vignette)
  • NDI output
  • Integrated web server for Wi-Fi look review on or near set
  • Single-cam recording

Assimilate’s Live Assist 9.3 is available at $1,995 for a permanent license, $1,495 per year or $325 per month. Live Looks 9.3 is available at $995 for a permanent license, $595 per year or $89 per month. Both are available now.


Disney+’s Safety: An ‘Online-all-the-Time’ Pipeline

By Daniel Restuccio

You don’t have to be a college football fan to enjoy the Reginald Hudlin-directed film Safety, which tells the story of Clemson University football player Ray McElrathbey, who — aided by his teammates and the community — enjoys a successful career on the field, while also raising his 11-year-old brother. To tell this story for Disney+, the filmmakers took a fresh spin on remote workflows inspired by episodic productions.

Doug Jones

The workflow concept, designed by executive producer Doug Jones, applied episodic television technology to a feature film pipeline. Feature film production tends to be slower-moving, while episodic television comes with fast turnaround times. This led Jones to a super-efficient “online-all-the-time” pipeline that could save time and money.

What exactly does he mean by online-all-the-time? The production used a DAM cart during the shoot, along with Blackmagic DaVinci Resolve from set through post, so the project could always be sourcing original camera files, without the need for proxies. At any time — when reviewing shots in production, in editorial during production or through post — the project could be viewed at full resolution. There was no longer a need to conform, since the edit can, with a click of a button, source camera-original files without rebuilding the edit, while the VFX, color and sound departments have access to the same high-resolution version of the cut.

Jones implemented the system on Safety in collaboration with director Hudlin, cinematographer Shane Hurlbut, ASC, and editor Terel Gibson, ACE.

Filming began in September 2019 in South Carolina at Clemson University and in Atlanta. Principal photography, dailies production and editing happened concurrently. Dailies, processed in DaVinci Resolve Studio, were not transcodes but Red original camera negatives (OCNs) with LUTs applied. They were viewable within six hours after cameras started rolling, and a full day of dailies was available and uploaded within 16 hours from the start of the day.

The on-set production and editorial were done at Blackhall Studios in Atlanta, where they used a host of Blackmagic gear, including HyperDeck Studio Mini recorders, ATEM 1 M/E 4K switchers, Teranex Mini SDI distribution 12G boxes, Blackmagic 12G Audio Monitors, SmartScope Duo 4K monitors and Smart Videohub routers.

The movie was edited and conformed in DaVinci Resolve Studio 16, but the team also had access to some beta features that ended up in Resolve 17. Editing systems were iMac Pros with 10-core and 18-core processors networked to Open Drives for shared storage. Editorial was never more than six hours behind the actual shoot, sometimes creating edits of scenes while they were still being shot.

We caught up with cinematographer Hurlbut, digital asset manager and colorist Michael Smollin, first assistant editor Rahul Das and editor Gibson, who gave us the details on this new way of working.

DAM cart

What Red camera did you use, and what flavor of Redcode Raw did you master in?
Shane Hurlbut: Safety was shot on Red Gemini cameras. We shot Super 35 in 4K, using 5K to stabilize some shots; most of the time, it was captured in 4K. The Redcode Raw settings were Log3G10, RedWideGamutRGB, in legacy mode. We did not use IPP2 in-camera — it was legacy for all of what we were shooting — but when we finished in the post process, we used IPP2 for the final color grade.

During camera testing, did you set a look with the director and build LUTs?
Hurlbut: We shot a series of camera tests with the Leica Summilux-Cs. We also tested the Leica Summicrons, the Cooke S4s and the Zeiss Master Primes. We really felt like the spherical nature of the Leica Summilux-Cs was going to be good with the lightweight nature of how we had to move with football.

We also found that the wide-angle lenses did not distort much. I wanted to use the 8mm, 10mm, 12mm and 16mm, and we used those lenses a lot, along with the 18mm and 21mm. These lenses let us get up close and personal with our actors as well as the sports action.

The LUTs were all designed. Once we decided on the Summilux-Cs, we then built our LUTs based on what it would look like at Clemson University. On and off campus was going to be very colorful — super saturated. It had to have really deep and beautiful skin tones. It was designed to be a complete polar opposite of the Atlanta projects where Ray and his little brother lived. I also shot all of Atlanta at 3,200 ISO and used more of the noise and grittiness to get a raw feel for Atlanta.

Clemson was shot at 800 ISO with the most range. I loved the Red sensor because if you want colors to explode, they explode beautifully. If you want to desaturate them, then you can desaturate them very nicely. Having that base color to be able to use was one of the main reasons that we went with the Red Gemini, and because of its ability to work in low-light conditions.

How many LUTs did you have?
Hurlbut: We had 39 LUTs created for all different environments: backlight neutral, backlight warm, backlight cool, overcast warm, overcast neutral and overcast cool. Then we made saturated ones for Clemson and desaturated ones for Atlanta: side-lit, back-lit, overcast — all the different lighting conditions that you could imagine.

We built the LUTs, and they were then detailed on the slate. All those reports were filtered to Mike Smollin. He took those notes, graded the dailies and sent them off to Disney the same day we shot.
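The scheme Hurlbut describes — lighting condition crossed with color temperature, with a saturated treatment for Clemson and a desaturated one for Atlanta — can be sketched as a small condition-keyed LUT library. The filenames and the exact condition set below are hypothetical (the real show had 39 LUTs; this simplified grid yields 18), purely to illustrate how a slate note could map to a dailies LUT:

```python
from itertools import product

# Hypothetical reconstruction of a condition-keyed LUT library:
# lighting condition x color temperature, with a saturated set for
# Clemson and a desaturated set for Atlanta.
CONDITIONS = ["backlight", "sidelight", "overcast"]
TEMPS = ["warm", "neutral", "cool"]
LOCATIONS = {"clemson": "saturated", "atlanta": "desaturated"}

def build_lut_library():
    """Return a dict mapping (location, condition, temp) -> LUT filename."""
    library = {}
    for loc, treatment in LOCATIONS.items():
        for cond, temp in product(CONDITIONS, TEMPS):
            library[(loc, cond, temp)] = f"{loc}_{cond}_{temp}_{treatment}.cube"
    return library

def lut_for_slate(location, condition, temp):
    """Look up the LUT noted on the slate for a given camera setup."""
    return build_lut_library()[(location, condition, temp)]

print(lut_for_slate("atlanta", "backlight", "cool"))
# -> atlanta_backlight_cool_desaturated.cube
```

With a mapping like this, the LUT noted on the slate unambiguously identifies the grade the dailies colorist should start from.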

Colorist Michael Smollin

Shane, how involved were you in the final color grading, and what changes did you make from the look of the dailies? 
Hurlbut: I take a lot of time in preproduction to dial in the lookup tables and get them as close as possible. I want to be making those creative decisions on the day while I’m lighting and while I’m seeing the actors perform.

We were slammed with the pandemic, which gave me time to do a first round of color correction. I went through the whole movie with Mike Smollin, and we set the looks for every scene.

He had the LUTs that were on the slate and embedded in the camera notes. He also had images and a screen grab from my look book. These were associated with the LUTs, along with a description of what I felt the light should look like in the room.

How much footage was processed daily, and how did the footage get from the camera master RedCode Raw to your DAM? 
Michael Smollin: We had 45 shoot days and processed about 2TB per day. The camera Raw files got to my workstation via the Red mags/cards. Once the cards reached the DAM cart, they were loaded into the Red mag readers and copied to the DAM cart storage via the Resolve clone tool.

This happens at about 1,500MB to 2,000MB per second with the G-Tech G-Shuttle XL drives we used on Safety, but it can happen even faster with other flash/NVMe drives if necessary. On Safety, copy times averaged about 7 to 12 minutes per card.
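Smollin's numbers are easy to sanity-check: at the quoted sustained throughput, a card copy lands inside his 7-to-12-minute window. A quick sketch (the 960GB card size is an assumption, not a figure from the production):

```python
def offload_minutes(card_gb, throughput_mb_s):
    """Time to copy one camera card at a given sustained throughput (decimal units)."""
    return card_gb * 1000 / throughput_mb_s / 60

# An assumed 960GB camera mag at the quoted 1500-2000 MB/s range:
slow = offload_minutes(960, 1500)   # ~10.7 minutes
fast = offload_minutes(960, 2000)   # 8.0 minutes
print(f"{fast:.1f}-{slow:.1f} min per card")
```

The Resolve clone tool also checksum-verifies each copy, so real-world times include a verification pass on top of the raw transfer.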

How did the camera stream get to the Blackmagic HyperDeck?
Smollin: We were using Teradek Bolt 3000s to get the signal to the Blackmagic HyperDecks.

Editor Terel Gibson

When you made dailies, how did you stream them to people on set and to studio people off set?
Smollin: Disney uses Pix as its dailies delivery method. Once the renders were made, they were pushed to Pix so everyone on set, or executives back in Burbank, could see the dailies. If dailies needed to be reviewed on set, we used DaVinci Resolve, the ATEM 1 M/E production studio and the Teranex Mini SDI (32 SDI outputs on the DAM cart) to distribute dailies anywhere on set with a monitor.

Had you prebuilt LUTs with cinematographer Shane Hurlbut for the look of the dailies? 
Smollin: As Shane said, we had 39 prebuilt LUTs. However, because the cart has the capability to grade the live image using Resolve, we took advantage of this and tweaked the look of the show as we shot. Having the ability to color correct on set is an important advantage of having the DAM cart on set.

Who built the DAM cart, and what gear did it have? How was it set up on location? Was there a video village?
Smollin: The cart was built by the workflow team in Burbank. The DAM cart was shipped to Atlanta and reassembled by me (as digital asset manager) at the production offices at Blackhall and then sent to Keslow Camera for the camera tests.

The cart was built around a MacBook Pro; a Blackmagic Micro Panel, ATEM, Teranex, and Audio Monitor; and a Sony BVM-X300 monitor. After that, the cart was transported in the sound trailer to each location. There was a video village set up on set every day, and the DAM cart was able to feed the signal to the village.

Can you walk us through the post process on Safety?
Terel Gibson: In September, we were set up at the production office in Atlanta at the Blackhall Studios. We were basically in Atlanta for the first half of principal photography and then came back to Los Angeles. We got things set up at Disney around Halloween, so we had a seamless transfer into post to start the director’s cut — basically a week after we finished principal photography.

At Blackhall, it was myself, first assistant editor Rahul Das, Daria Fissoun, who was tech support from Blackmagic Resolve, and in-house tech support Ramon Huggins.

In Los Angeles, Daria and Rahul stayed on, and second assistant editor Mark Jones joined us about halfway through the director’s cut. Mike Smollin was downstairs doing color timing, and then we added visual effects editor Matt DeJohn (who worked freelance on this film before becoming a Blackmagic staffer).

Rahul, was this the first time you’ve cut a feature on DaVinci Resolve?
Rahul Das: Yes. My experience with Resolve was limited to occasional use for converting footage between different frame rates. I also knew it as powerful color correction software. I have been working on Avid Media Composer for years and was excited to learn more about Resolve, since it is increasingly being developed as a one-stop shop for editorial cutting and finishing, eliminating the need for a lab.

Assistant editor Rahul Das

When we started working on the project, I was immediately impressed by the different panels Resolve offers for in-depth work in VFX, sound and color. It was initially overwhelming because, when we are cutting in the editorial offline, we are usually just expected to do temp reference work for VFX or sound design. In Resolve, even simple VFX work, like greenscreen keying or animating, seemed to require a certain level of know-how. But because the interface is very user-friendly, the learning curve is short.

Since working on Safety, I have been using Resolve much more for my personal projects. I encourage all assistant editors to give it a try.

Gibson: It was the first time I cut a feature on Resolve. We did two weeks of training before production started, and then it was about just diving in.

When we got started, there were quite a few things that Blackmagic helped with [by giving access to the Version 17 beta], whether it was the trim mode, which is a feature in Avid, or the audio editing features.

Can you talk about your collaboration process with director Reginald Hudlin?
Gibson: This was our first time working together, but I was a huge fan of his work. It was one of those situations that was exceptional; there was a real connection in terms of our response to the material, our own personal taste.

He was also happy to offer some thoughts, go away for a couple of hours, come back and sit down while I’d show him things, which is a really nice way of working as well.

I remember when I started putting music in and showing him things during production, he was like, “I feel like you raided my iTunes library; this is exactly what I want.” When that happens, it’s sort of lightning in a bottle.

The serendipity that happens when your point of view and the director’s point of view sort of merge, that’s magic. And that was very much the case on this one. He came in a couple of times on weekends just to look at what I was up to, and he could rest assured that we were headed in a really good direction.

During the director’s cut, he was in every day until we had to quarantine. We were at the point where we had just shown the studio a couple of versions of the movie, and we were into the next phase, which was getting studio and audience feedback and incorporating that as well.

When COVID happened, we started working from home. We communicated via email and phone calls. We’d go through notes and suggestions, and I’d send him builds of things. We worked through the material that way and got through the entire finishing process all remotely.

How did the footage get from Michael Smollin’s system to your editing station?
Das: During production, I would get the Raw media on drives, the LUTs and a Resolve project with the master clips synced to production audio, which I would copy over to my main project. When I was ready to check sync, I would make groups and prep the footage for the editor.

DAM cart

What about visual effects and audio? Were those also done in Resolve? 
Das: The big football game shoot was a challenge with more than 20 cameras rolling, and we had to make optimized media for those scenes.

Resolve has great built-in audio effects that are easy to use. Since we were working with the Raw media, there was no need to send out the cut to the lab before studio screenings — the colorist came to our post facility and addressed color correction notes while the editor continued working. For VFX, we exported out EXRs and sent them to the VFX vendors.

Smollin: About 90 of the VFX shots were done in Fusion by Matt DeJohn, who is a Blackmagic staffer. The rest were done by Crafty Apes. The head of VFX for Disney, David Taritero, was working remotely in Denver, so the reviews for VFX were done in Resolve using remote grading between Denver and Burbank. Audio was exported to Pro Tools, and then the final audio deliverables were brought back into Resolve.

How was the final conform and grade of the movie done, and what deliverables did you send to Disney?
Smollin: There was no conform. This movie was cut in Resolve from the Raw Red OCN. No conform is necessary using this workflow. The final grade was done in quarantine due to COVID.

I worked in the cutting rooms by myself with Resolve and a Sony BVM-X300. Then I was also cleared to work on the Disney lot by myself in the Frank G. Wells building. I would grade a reel, and then Reggie would review and give notes. We used Streambox for studio review.

Deliverables were Disney IMF packages: HDR 4K 16-bit lossless and SDR UHD 10-bit lossless, NAM 16-bit TIFFs and QuickTimes.


Daniel Restuccio is a writer/director with Realwork Entertainment and part of the Visual Arts faculty at California Lutheran University. He is a former Disney Imagineer. You can reach him at dansweb451@gmail.com.
Assimilate Scratch Integrates NDI for Real-Time Streaming on Set

Assimilate has integrated NDI technology into Assimilate Scratch’s advanced tools for on-set and post workflows. NDI is part of the Vizrt Group along with the NewTek and Vizrt brands.

Using existing networks or WiFi, the combination of NDI and Scratch tools gives DITs and post artists the ability to stream video content in real time to other production pros or clients. This eliminates what was until now a cumbersome process involving cables and video I/O cards, which helps lower costs.

Users can now send video output in real time from Scratch or Play Pro to:

  • Chrome browser and from there on to SetStream.io.
  • Zoom or Skype for remote client-attended color sessions or conference calls.
  • Open Broadcaster Software (OBS) to stream to any streaming portal.
  • NDI Studio Monitor to view in a separate screening room. This is ideal for COVID-19 distancing guidelines.

Another time-saver: as an intuitive player, NDI Studio Monitor can scan the local network for NDI streams, and the user can choose which one to view. NDI Studio Monitor is a free tool that allows users to view or display any number of NDI video sources across a network from any compatible laptop or workstation.

Jeff Sousa, colorist at Brooklyn’s Dungeon Beach, has been testing out the workflow. “Scratch 9.3’s native NDI integration adds speed and flexibility to my streaming sessions without the need to purchase I/O hardware or LUT boxes. It allows Scratch to uplink directly to SetStream.io via the fast WebRTC protocol, giving me latency-free interactions with my clients and great image quality,” he says. “And since NDI is ‘just another output’ to Scratch, I can control exactly what the client sees, toggling masks on and off, and even apply a stream-only display LUT to calibrate their viewing device remotely. When streaming via NDI, I can also round-trip to Adobe After Effects for rotoscoping or Avid Media Composer for conform, and the client won’t lose signal on their program, making Scratch 9.3 a powerful finishing hub for remote workflows.”

Available immediately in Scratch 9.3, and via Assimilate’s Creative Reboot program, pricing starts at $59 monthly and $495 annually. Play Pro 9.3 starts at $10 monthly and $99 annually.
Moxion Offers Dolby Vision for Remote HDR Reviews

Moxion’s Immediates instant dailies service platform now integrates Dolby Vision, Dolby’s advanced imaging technology that combines high dynamic range (HDR) with wide color gamut (WCG) capabilities.

With the ability to monitor in Dolby Vision for remote cuts, color, dailies and VFX review, creatives can make decisions based on the image the director, colorist and cinematographer envisioned. Dolby Vision support will be available to users with compatible devices through the Moxion app on iOS, iPadOS and tvOS.

Increasingly, productions are shooting and monitoring on set in HDR, which is fueled by demand from leading content providers and streaming services. With many films and TV shows being shot simultaneously across multiple units and countries, being able to monitor in Dolby Vision enables robust feedback and certainty of creative intent across all units.

“Your Dolby Vision assets can be ingested, transcoded and played back with the secure, high-speed Moxion ecosystem,” says Hugh Calveley, CEO of Moxion. “Dolby Vision, plus Moxion’s ability to make frame-accurate annotation and comments, gives certainty to the colorist, editor and VFX team that feedback is against their original creation.”

Moxion’s integration of Dolby Vision — which offers lifelike picture quality with highlights that are up to 40 times brighter and blacks that are 10 times darker than a standard picture — into its platform will provide the production, post and visual effects communities with unmatched control over the image. Unlike HDR10, Dolby Vision allows DPs and colorists to adjust details for individual scenes, down to the exact frame.
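The contrast with HDR10 comes down to metadata granularity: HDR10 carries one static brightness description for the whole program, while Dolby Vision carries per-shot (or per-frame) analysis that the display uses to tone-map each shot on its own terms. A toy sketch of the distinction, using made-up nit values:

```python
# Per-frame peak luminance (nits) for three shots -- made-up values.
shots = {
    "int_dark": [48, 52, 60],
    "ext_day":  [900, 1100, 1000],
    "sunset":   [300, 420, 380],
}

# HDR10-style static metadata: one value describes the entire program,
# so the dark interior is tone-mapped against the brightest exterior.
all_frames = [nits for frames in shots.values() for nits in frames]
static_max_cll = max(all_frames)

# Dolby Vision-style dynamic metadata: per-shot analysis, so each shot
# keeps its own range when mapped to the display.
dynamic = {name: {"min": min(f), "max": max(f), "avg": sum(f) / len(f)}
           for name, f in shots.items()}

print(static_max_cll)              # 1100 -- drives tone mapping everywhere
print(dynamic["int_dark"]["max"])  # 60 -- the dark scene keeps its own range
```

The actual Dolby Vision metadata format is more involved than this min/avg/max sketch, but the per-shot-versus-whole-program difference is the core of it.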

The need for such a solution is even more urgent during production in the current pandemic. Reduced numbers on set and remote distributed workflows are the new normal, and Moxion’s support for Dolby Vision helps ensure that all participants are seeing identical images.

Calveley adds, “COVID has forced the industry into innovating ways to sign off on cuts, VFX and color grades in scenarios where you can’t bring people into the same room. Dolby Vision guarantees that everyone is making creative decisions based on the same image.”

Quick Chat: The Rebel Fleet’s Michael Urban talks on-set workflows

When shooting major motion pictures and episodic television with multiple crews in multiple locations, production teams need a workflow that gives them fast access and complete control of the footage across the entire production, from the first day of the shoot to the last day of post. This is Wellington, New Zealand-based The Rebel Fleet’s reason for being.

What exactly do they do? Well, we reached out to managing director Michael Urban to find out.

Can you talk more about what you do and what types of workflows you supply?
The Rebel Fleet supplies complete workflow solutions, from on-set Qtake video assist and DIT to dailies, QC, archive and delivery to post. By managing the entire workflow, we can provide consistency and certainty around the color pipeline, monitor calibration, crew expertise and communication, and production can rely on one team to take care of that part of the workflow.

We have worked closely with Moxion many times and use its Immediates workflow, which enables automated uploads direct from video assist into its secure dailies platform. Anyone with access to the project can view rushes and metadata from set moments after the video is shot. This also enables different shooting units to automatically and securely share media. Two units shooting in different countries can see what each other has shot, including all camera and scene/take metadata. This is then available and catalogued directly into the video assist system. We have a lot of experience working alongside camera and VFX on-set as well as delivering to post, making sure we are delivering exactly what’s needed in the right formats.

You recently worked on a film that was shot in New Zealand and China, and you sent crews to China. Can you talk about that workflow a bit and name the film?
I can’t name the film yet, but I can tell you that it’s in the adventure genre and is coming out in the second half of 2020. The main pieces of software are Colorfront On-Set Dailies for processing all the media and Yoyotta for downloading and verifying media. We also use Avid for some edit prep before handing over to editorial.

How did you work with the DP and director? Can you talk about those relationships on this particular film?
On this shoot the DP and director had rushes screenings each night to go over the main unit and second unit rushes and make sure the dailies grade was exactly what they wanted. This was the last finesse before handing over dailies to editorial, so it had to be right. As rushes were being signed off, we would send them off to the background render engine, which would create four different outputs in multiple resolutions and framing. This meant that moments after the last camera mag was signed off, the media was ready for Avid prep and delivery. Our data team worked hard to automate as many processes as possible so there would be no long nights sorting reports and sheets. That work happened as we went throughout the day instead of leaving a multitude of tasks for the end of the day.
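The background render Urban describes — one signed-off source fanned out to several deliverables in different resolutions and framings — can be sketched as command generation for a batch transcoder. The recipes, resolutions and ffmpeg flags below are illustrative assumptions, not The Rebel Fleet's actual pipeline (their renders come out of Colorfront, and the real deliverable specs aren't public):

```python
# Hypothetical deliverable recipes: name -> (scale filter, codec args).
DELIVERABLES = {
    "avid_dnx":   ("scale=1920:1080", "-c:v dnxhd -b:v 36M"),
    "exec_h264":  ("scale=1280:720",  "-c:v libx264 -crf 20"),
    "vfx_proxy":  ("scale=2048:1080", "-c:v prores -profile:v 0"),
    "web_review": ("scale=960:540",   "-c:v libx264 -crf 26"),
}

def render_commands(source, out_dir="renders"):
    """Build one ffmpeg command line per deliverable for a signed-off clip."""
    cmds = []
    for name, (scale, codec) in DELIVERABLES.items():
        cmds.append(f"ffmpeg -i {source} -vf {scale} {codec} {out_dir}/{name}.mov")
    return cmds

for cmd in render_commands("A001_C004.mov"):
    print(cmd)
```

Queuing jobs like these as each mag is signed off, rather than at wrap, is what lets the media be ready for Avid prep moments after the last card clears.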

How do your workflows vary from project to project?
Every shoot is approached with a clean slate, and we work with the producers, DP and post to make sure we create a workflow that suits the logistical, budgetary and technical needs of that shoot. We have a tool kit that we rely on and use it to select the correct components required. We are always looking for ways to innovate and provide more value for the bottom line.

You mentioned using Colorfront tools. What does that offer you? And what about storage? It seems like working on location means you need a solid way to back up.
Colorfront On-Set Dailies takes care of QC, grade, sound sync and metadata. All of our shared storage is built around Quantum Xcellis, plus the Quantum QXS hybrid storage systems for online and nearline. We create the right SAN for the job depending on the amount of storage and clients required for that shoot.

Can you name projects you’ve worked on in the past as well as some recent work?
Warner Bros.’ The Meg, DreamWorks’ Ghost in the Shell, Sonar’s The Shannara Chronicles, STX Entertainment’s Adrift, Netflix’s The New Legends of Monkey and The Letter for the King and Blumhouse’s Fantasy Island.

Colorfront at NAB with 8K HDR, product updates

Colorfront, which makes on-set dailies and transcoding systems, has rolled out new 8K HDR capabilities and updates across its product lines. The company has also deepened its technology partnership with AJA and entered into a new collaboration with Pomfort to bring more efficient color and HDR management on-set.

Colorfront Transkoder is a post workflow tool for handling UHD, HDR camera, color and editorial/deliverables formats, with recent customers such as Sky, Pixelogic, The Picture Shop and Hulu. With a new HDR GUI, Colorfront’s Transkoder 2019 performs realtime decompression/de-Bayer/playback of Red and Panavision DXL2 8K R3D material, displayed on a Samsung 82-inch Q900R QLED 8K Smart TV in HDR and in full 8K resolution (7680x4320). The de-Bayering process is optimized through Nvidia GeForce RTX graphics cards with Turing GPU architecture (also available on Colorfront On-Set Dailies 2019), with 8K video output (up to 60p) using AJA Kona 5 video cards.

“8K TV sets are becoming bigger, as well as more affordable, and people are genuinely awestruck when they see 8K camera footage presented on an 8K HDR display,” said Aron Jaszberenyi, managing director, Colorfront. “We are actively working with several companies around the world originating 8K HDR content. Transkoder’s new 8K capabilities — across on-set, post and mastering — demonstrate that 8K HDR is perfectly accessible to an even wider range of content creators.”

Powered by a re-engineered version of Colorfront Engine and featuring the HDR GUI and 8K HDR workflow, Transkoder 2019 supports camera/editorial formats including Apple ProRes RAW, Blackmagic RAW, ARRI Alexa LF/Alexa Mini LF and Codex HDE (High Density Encoding).

Transkoder 2019’s mastering toolset has been further expanded to support Dolby Vision 4.0 as well as Dolby Atmos for the home with IMF and Immersive Audio Bitstream capabilities. The new Subtitle Engine 2.0 supports CineCanvas and IMSC 1.1 rendering for preservation of content, timing, layout and styling. Transkoder can now also package multiple subtitle language tracks into the timeline of an IMP. Further features support fast and efficient audio QC, including solo/mute of individual tracks on the timeline, and a new render strategy for IMF packages enabling independent audio and video rendering.

Colorfront also showed the latest versions of its On-Set Dailies and Express Dailies products for motion pictures and episodic TV production. On-Set Dailies and Express Dailies both now support ProRes RAW, Blackmagic RAW, ARRI Alexa LF/Alexa Mini LF and Codex HDE. As with Transkoder 2019, the new version of On-Set Dailies supports real-time 8K HDR workflows, enabling a set-to-post pipeline from HDR playback through QC and rendering of HDR deliverables.

In addition, AJA Video Systems has released v3.0 firmware for its FS-HDR realtime HDR/WCG converter and frame synchronizer. The update introduces enhanced coloring tools together with several other improvements for broadcast, on-set, post and pro AV HDR production developed by Colorfront.

A new, integrated Colorfront Engine Film Mode offers an ACES-based grading and look creation toolset with ASC Color Decision List (CDL) controls, built-in LOOK selection including film emulation looks, and variable Output Mastering Nit Levels for PQ, HLG Extended and P3 colorspace clamp.
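The CDL controls mentioned here follow the ASC's published transfer function: per channel, the input is multiplied by slope, shifted by offset and raised to power, then a global saturation adjustment blends each channel toward Rec. 709 luma. A minimal reference implementation for a single pixel:

```python
# ASC CDL: per-channel slope/offset/power, then global saturation.
REC709_LUMA = (0.2126, 0.7152, 0.0722)

def apply_cdl(rgb, slope, offset, power, saturation=1.0):
    """Apply an ASC CDL to a single RGB pixel (values nominally in [0, 1])."""
    # Slope-offset-power, clamped to non-negative before the power function
    # (upper clamping behavior varies by implementation).
    sop = [max(0.0, c * s + o) ** p
           for c, s, o, p in zip(rgb, slope, offset, power)]
    # Saturation: blend each channel toward the Rec. 709 luma of the result.
    luma = sum(c * w for c, w in zip(sop, REC709_LUMA))
    return tuple(luma + saturation * (c - luma) for c in sop)

# Identity CDL leaves a pixel effectively untouched.
print(apply_cdl((0.18, 0.18, 0.18),
                slope=(1, 1, 1), offset=(0, 0, 0), power=(1, 1, 1)))
```

Because the math is this simple and standardized, a CDL set on an FS-HDR or in LiveGrade on set can be reproduced exactly by the dailies system and the final grade.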

Since launching in 2018, FS-HDR has been used on a wide range of TV and live outside broadcast productions, as well as motion pictures including Paramount Pictures’ Top Gun: Maverick, shot by Claudio Miranda, ASC.

Colorfront licensed its HDR Image Analyzer software to AJA for AJA’s HDR Image Analyzer in 2018. A new version of AJA HDR Image Analyzer is set for release during Q3 2019.

Finally, Colorfront and Pomfort have teamed up to integrate their respective HDR-capable on-set systems. This collaboration, harnessing Colorfront Engine, will include live CDL reading in ACES pipelines between Colorfront On-Set/Express Dailies and Pomfort LiveGrade Pro, giving motion picture productions better control of HDR images while simplifying their on-set color workflows and dailies processes.

Atomos’ new Shogun 7: HDR monitor, recorder, switcher

The new Atomos Shogun 7 is a seven-inch HDR monitor, recorder and switcher that offers an all-new 1500-nit, daylight-viewable, 1920×1200 panel with a 1,000,000:1 contrast ratio and 15+ stops of dynamic range displayed. It also offers ProRes RAW recording and realtime Dolby Vision output. Shogun 7 will be available in June 2019, priced at $1,499.

The Atomos screen uses a combination of advanced LED and LCD technologies that together offer deeper, better blacks the company says rival OLED screens, “but with the much higher brightness and vivid color performance of top-end LCDs.”

A new 360-zone backlight is combined with this new screen technology and controlled by the Dynamic AtomHDR engine to show millions of shades of brightness and color. It allows Shogun 7 to display 15+ stops of real dynamic range on-screen. The panel, says Atomos, is also incredibly accurate, with ultra-wide color and 105% of DCI-P3 covered, allowing for the same on-screen dynamic range, palette of colors and shades that your camera sensor sees.

Atomos and Dolby have teamed up to create Dolby Vision HDR “live” — a tool that allows you to see HDR live on-set and carry your creative intent from the camera through into HDR post. Dolby has optimized its target display HDR processing algorithm, which Atomos runs inside the Shogun 7. It brings realtime automatic frame-by-frame analysis of the Log or RAW video and processes it for optimal HDR viewing on a Dolby Vision-capable TV or monitor over HDMI. Connect Shogun 7 to the Dolby Vision TV, and AtomOS 10 automatically analyzes the image, queries the TV and applies the right color and brightness profiles for the maximum HDR experience on the display.

Shogun 7 records images up to 5.7Kp30, 4Kp120 or 2Kp240 slow motion from compatible cameras, in RAW/Log or HLG/PQ over SDI/HDMI. Footage is stored directly to AtomX SSDmini or approved off-the-shelf SATA SSD drives. There are recording options for Apple ProRes RAW and ProRes, Avid DNx and Adobe CinemaDNG RAW codecs. Shogun 7 has four SDI inputs plus an HDMI 2.0 input, with both 12G-SDI and HDMI 2.0 outputs. It can record ProRes RAW in up to 5.7Kp30, 4Kp120 DCI/UHD and 2Kp240 DCI/HD, depending on the camera’s capabilities. Also, 10-bit 4:2:2 ProRes or DNxHR recording is available up to 4Kp60 or 2Kp240. The four SDI inputs enable the connection of most quad-link, dual-link or single-link SDI cinema cameras. Pixels are preserved with data rates of up to 1.8Gb/s.
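That 1.8Gb/s ceiling translates directly into media consumption. A quick sketch of recording time per drive (the 1TB capacity is an assumed example, not an Atomos spec):

```python
def record_minutes(ssd_gb, rate_gbit_s):
    """Recording time on an SSD at a sustained bitrate (decimal units)."""
    seconds = ssd_gb * 8 / rate_gbit_s
    return seconds / 60

# An assumed 1TB SSD at the Shogun 7's quoted 1.8Gb/s peak data rate:
print(round(record_minutes(1000, 1.8)))  # ~74 minutes
```

Real capacity varies with codec, resolution and frame rate; the quoted figure is the worst case, so most setups record considerably longer per drive.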

In terms of audio, Shogun 7 eliminates the need for a separate audio recorder. Users can add 48V stereo mics via an optional balanced XLR breakout cable, or select mic or line input levels, plus record up to 12 channels of 24/96 digital audio from HDMI or SDI. Monitoring selected stereo tracks is via the 3.5mm headphone jack. There are dedicated audio meters, gain controls and adjustments for frame delay.

Shogun 7 features the latest version of the AtomOS 10 touchscreen interface, first seen on the Ninja V. The new body of Shogun 7 has a Ninja V-like exterior with ARRI anti-rotation mounting points on the top and bottom of the unit to ensure secure mounting.

AtomOS 10 on Shogun 7 has the full range of monitoring tools, including waveform, vectorscope, false color, zebras, RGB parade, focus peaking, pixel-to-pixel magnification, audio level meters and blue-only for noise analysis.

Shogun 7 can also be used as a portable touchscreen-controlled multi-camera switcher with asynchronous quad-ISO recording. Users can switch up to four 1080p60 SDI streams, record each plus the program output as a separate ISO, then deliver ready-for-edit recordings with marked cut-points in XML metadata straight to your NLE. The current Sumo19 HDR production monitor-recorder will also gain the same functionality in a free firmware update.

There is asynchronous switching, plus use genlock in and out to connect to existing AV infrastructure. Once the recording is over, users can import the XML file into an NLE and the timeline populates with all the edits in place. XLR audio from a separate mixer or audio board is recorded within each ISO, alongside two embedded channels of digital audio from the original source. The program stream always records the analog audio feed as well as a second track that switches between the digital audio inputs to match the switched feed.