Disney+’s Safety: An ‘Online-all-the-Time’ Pipeline

By Daniel Restuccio

You don’t have to be a college football fan to enjoy the Reginald Hudlin-directed film Safety, which tells the story of Clemson University football player Ray McElrathbey, who — aided by his teammates and the community — enjoys a successful career on the field while also raising his 11-year-old brother. To tell this story for Disney+, the filmmakers put a fresh spin on remote workflows, taking inspiration from episodic productions.

Doug Jones

The workflow concept, designed by executive producer Doug Jones, applied episodic television technology to a feature film pipeline. Feature film production tends to move slowly, while episodic television demands fast turnarounds. That contrast led Jones to a super-efficient “online-all-the-time” pipeline that could save time and money.

What exactly does he mean by online-all-the-time? The production used a DAM cart during the shoot, along with Blackmagic DaVinci Resolve from set through post, so the project could always source original camera files, without the need for proxies. At any time — when reviewing shots in production, in editorial during production or through post — the project could be viewed at full resolution. There was no longer a need to conform: with a click of a button, the edit could source camera-original files without being rebuilt, and the VFX, color and sound departments all had access to the same high-resolution version of the cut.

Jones implemented the system on Safety in collaboration with director Hudlin, cinematographer Shane Hurlbut, ASC, and editor Terel Gibson, ACE.

Filming began in September 2019 in South Carolina at Clemson University and in Atlanta. Principal photography, dailies production and editing happened concurrently. Dailies, processed in DaVinci Resolve Studio, were not transcodes but Red original camera negatives (OCNs) with LUTs applied. They were viewable within six hours after cameras started rolling, and a full day of dailies was available and uploaded within 16 hours from the start of the day.

The on-set production and editorial were done at Blackhall Studios in Atlanta, where they used a host of Blackmagic gear, including HyperDeck Studio Mini recorders, ATEM 1 M/E 4K switchers, Teranex Mini SDI distribution 12G boxes, Blackmagic 12G Audio Monitors, SmartScope Duo 4K monitors and Smart Videohub routers.

The movie was edited and conformed in DaVinci Resolve Studio 16, but the team also had access to some beta features that ended up in Resolve 17. Editing systems were iMac Pros with 10-core and 18-core processors networked to Open Drives for shared storage. Editorial was never more than six hours behind the actual shoot, sometimes creating edits of scenes while they were still being shot.

We caught up with cinematographer Hurlbut, digital asset manager and colorist Michael Smollin, first assistant editor Rahul Das and editor Gibson, who gave us the details on this new way of working.

DAM cart

What Red camera did you use, and what flavor of Redcode Raw did you master in?
Shane Hurlbut: Safety was shot on Red Gemini cameras. We shot in Super 35 in 4K, using 5K to stabilize some shots. Most of the time, it was captured in 4K. The Redcode Raw settings were a Log 3G 10, Red wide gamut, and it was legacy mode. We did not use IPP2. But when we got to post, we used IPP2 to finish the color grade. In-camera, it was legacy for everything we shot.

During camera testing, did you set a look with the director and build LUTs?
Hurlbut: We shot a series of camera tests with the Leica Summilux-Cs. We also tested the Leica Summicrons, the Cooke S4s and the Zeiss Master Primes. We really felt like the spherical nature of the Leica Summilux-Cs was going to be good with the lightweight nature of how we had to move with football.

We thought that the wide-angle lenses did not distort so much. I wanted to use the 8mm, 10mm, 12mm and 16mm. We used those lenses a lot, along with the 18mm and the 21mm lenses. These were the lenses that got us up close and personal with our actors as well as the sports action.

The LUTs were all designed in advance. Once we decided on the Summilux-Cs, we built our LUTs based on what it would look like at Clemson University. On and off campus was going to be very colorful — super saturated. It had to have really deep and beautiful skin tones. It was designed to be the polar opposite of the Atlanta projects where Ray and his little brother lived. I also shot all of Atlanta at 3,200 ISO and used more of the noise and grittiness to get a raw feel for Atlanta.

Clemson was shot at 800 ISO with the most range. I loved the Red sensor because if you want colors to explode, they explode beautifully. If you want to desaturate them, then you can desaturate them very nicely. Having that base color to be able to use was one of the main reasons that we went with the Red Gemini, and because of its ability to work in low-light conditions.

How many LUTs did you have?
Hurlbut: We had 39 LUTs created for all different environments: backlight neutral, backlight warm, backlight cool, overcast warm, overcast neutral and overcast cool. Then we made ones for saturation for Clemson and desaturated for Atlanta: side-lit, back-lit, overcast, all the different lighting conditions that you could imagine.

We built the LUTs, and they were then detailed on the slate. All those reports were filtered to Mike Smollin. He took those notes, graded the dailies and sent them off to Disney the same day we shot.

Colorist Michael Smollin

Shane, how involved were you in the final color grading, and what changes did you make from the look of the dailies? 
Hurlbut: I take a lot of time in preproduction to dial in the lookup tables and get them as close as possible. I want to be making those creative decisions on the day while I’m lighting and while I’m seeing the actors perform.

Then the pandemic slammed us, but I was still able to do a first round of color correction. I went through the whole movie with Mike Smollin, and we set the looks for every scene.

He had the LUTs that were on the slate and embedded in the camera notes. He also had images and screen grabs from my look book. These were associated with the LUTs, along with a description of what I felt the light should look like in the room.

How much footage was processed daily, and how did the footage get from the camera master Redcode Raw to your DAM?
Michael Smollin: We had 45 shoot days and processed about 2TB per day. The camera Raw files got to my workstation via the Red mags/cards. Once the cards reached the DAM cart, they were loaded into the Red mag readers and copied to the DAM cart storage via the Resolve clone tool.

This happens at about 1,500MB to 2,000MB per second with the G-Tech G-Shuttle XL drives we used on Safety, but it can happen even faster with other flash/NVMe drives if necessary. On Safety, copy times averaged about 7 to 12 minutes per card.
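As a sanity check, those quoted transfer rates line up with the 7-to-12-minute copy times if the mags held roughly a terabyte. A quick back-of-the-envelope sketch (the 960GB card size is an assumption for illustration, not a figure from the production):

```python
# Back-of-the-envelope check of the card-copy times quoted above.
# The 960GB card size is an assumed mag capacity, not stated in the article.

def copy_minutes(card_gb: float, rate_mb_per_s: float) -> float:
    """Estimated time to clone one camera mag, ignoring checksum/verify overhead."""
    return (card_gb * 1000) / rate_mb_per_s / 60

print(round(copy_minutes(960, 1500), 1))  # slower quoted rate -> ~10.7 minutes
print(round(copy_minutes(960, 2000), 1))  # faster quoted rate -> ~8.0 minutes
```

Both estimates fall inside the 7-to-12-minute range Smollin describes, which suggests the cart was cloning at close to the drives' rated throughput.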

How did the camera stream get to the Blackmagic HyperDeck?
Smollin: We were using Teradek Bolt 3000s to get the signal to the Blackmagic HyperDecks.

Editor Terel Gibson

When you made dailies, how did you stream them to people on set and to studio people off set?
Smollin: Disney uses Pix as its dailies delivery method. Once the renders were made, they were pushed to Pix so everyone on set, or executives back in Burbank, could see the dailies. If dailies needed to be reviewed on set, we used DaVinci Resolve, the ATEM 1 M/E production studio and the Teranex Mini SDI (32 SDI outputs on the DAM cart) to distribute dailies anywhere on set with a monitor.

Had you prebuilt LUTs with cinematographer Shane Hurlbut for the look of the dailies? 
Smollin: As Shane said, we had 39 prebuilt LUTs. However, because the cart has the capability to grade the live image using Resolve, we took advantage of this and tweaked the look of the show as we shot. The ability to color correct on set is an important advantage of having the DAM cart there.

Who built the DAM cart, and what gear did it have? How was it set up on location? Was there a video village?
Smollin: The cart was built by the workflow team in Burbank. The DAM cart was shipped to Atlanta and reassembled by me (as digital asset manager) at the production offices at Blackhall and then sent to Keslow Camera for the camera tests.

The cart was built around a MacBook Pro; a Blackmagic Micro Panel, ATEM, Teranex, and Audio Monitor; and a Sony BVM-X300 monitor. After that, the cart was transported in the sound trailer to each location. There was a video village set up on set every day, and the DAM cart was able to feed the signal to the village.

Can you walk us through the post process on Safety?
Terel Gibson: In September, we were set up at the production office at Blackhall Studios in Atlanta. We were basically in Atlanta for the first half of principal photography and then came back to Los Angeles. We got things set up at Disney around Halloween, so we had a seamless transfer into post to start the director’s cut — basically a week after we finished principal photography.

At Blackhall, it was myself, first assistant editor Rahul Das, Daria Fissoun, who was tech support from Blackmagic Resolve, and in-house tech support Ramon Huggins.

In Los Angeles, Daria and Rahul stayed on, and second assistant editor Mark Jones joined us about halfway through the director’s cut. Mike Smollin was downstairs doing color timing, and then we added visual effects editor Matt DeJohn (who worked freelance on this film before becoming a Blackmagic staffer).

Rahul, was this the first time you’ve cut a feature on DaVinci Resolve?
Rahul Das: Yes. My experience with Resolve was limited to occasional use for converting footage between frame rates. I also knew it as powerful color correcting software. I have been working on Avid Media Composer for years and was excited to learn more about Resolve since it was increasingly being developed as a one-stop shop for editorial cutting and finishing, eliminating the need for a lab.

Assistant editor Rahul Das

When we started working on the project, I was immediately impressed by the different panels Resolve offers for in-depth work in VFX, sound and color. It was initially overwhelming, because when we are cutting in the editorial offline, we are usually just expected to do temp reference work for VFX or sound design. In Resolve, even simple VFX work, like greenscreen keying or animating, seemed to require a certain level of know-how. But because the interface is very user-friendly, the learning curve is fast.

Since working on Safety, I have been using Resolve much more for my personal projects. I encourage all assistant editors to give it a try.

Gibson: It was the first time I cut a feature on Resolve. We did two weeks of training before production started, and then it was about just diving in.

When we got started, there were quite a few things that Blackmagic helped with [by giving access to the Version 17 beta], whether it was the trim mode, which is a feature in Avid, or the audio editing features.

Can you talk about your collaboration process with director Reginald Hudlin?
Gibson: This was our first time working together, but I was a huge fan of his work. It was one of those exceptional situations: there was a real connection in terms of our response to the material and our own personal taste.

He was also happy to offer some thoughts, go away for a couple of hours, come back and sit down while I’d show him things, which is a really nice way of working as well.

I remember when I started putting music in and showing him things during production, he was like, “I feel like you raided my iTunes library; this is exactly what I want.” When that happens, it’s sort of lightning in a bottle.

The serendipity that happens when your point of view and the director’s point of view sort of merge, that’s magic. And that was very much the case on this one. He came in a couple of times on weekends just to look at what I was up to, and he could rest assured that we were headed in a really good direction.

During the director’s cut, he was in every day until we had to quarantine. We were at the point where we had just shown the studio a couple of versions of the movie, and we were into the next phase, which was getting studio and audience feedback and incorporating that as well.

When COVID happened, we started working from home. We communicated via email and phone calls. We’d go through notes and suggestions, and I’d send him builds of things. We worked through the material that way and got through the entire finishing process all remotely.

How did the footage get from Michael Smollin’s system to your editing station?
Das: During production, I would get the Raw media on drives, the LUTs and a Resolve project with the master clips synced to production audio, which I would copy over to my main project. When I was ready to check sync, I would make groups and prep the footage for the editor.

DAM cart

What about visual effects and audio? Were those also done in Resolve? 
Das: The big football game shoot was a challenge with more than 20 cameras rolling, and we had to make optimized media for those scenes.

Resolve has great built-in audio effects that are easy to use. Since we were working with the Raw media, there was no need to send out the cut to the lab before studio screenings — the colorist came to our post facility and addressed color correction notes while the editor continued working. For VFX, we exported EXRs and sent them to the VFX vendors.

Smollin: About 90 of the VFX shots were done in Fusion by Matt DeJohn, who is a Blackmagic staffer. The rest were done by Crafty Apes. The head of VFX for Disney, David Taritero, was working remotely in Denver, so the reviews for VFX were done in Resolve using remote grading between Denver and Burbank. Audio was exported to Pro Tools, and then the final audio deliverables were brought back into Resolve.

How was the final conform and grade of the movie done, and what deliverables did you send to Disney?
Smollin: There was no conform. This movie was cut in Resolve from the Raw Red OCN. No conform is necessary using this workflow. The final grade was done in quarantine due to COVID.

I worked in the cutting rooms by myself with Resolve and a Sony BVM-X300. Then I was also cleared to work on the Disney lot by myself in the Frank G. Wells building. I would grade a reel, and then Reggie would review and give notes. We used Streambox for studio review.

Deliverables were Disney IMF packages: HDR 4K 16-bit lossless and SDR UHD 10-bit lossless, NAM 16-bit TIFFs and QuickTimes.


Daniel Restuccio is a writer/director with Realwork Entertainment and part of the Visual Arts faculty at California Lutheran University. He is a former Disney Imagineer. You can reach him at dansweb451@gmail.com.


3 thoughts on “Disney+’s Safety: An ‘Online-all-the-Time’ Pipeline”

  1. Gavin Greenwalt

    “The Redcode Raw settings were a Log 3G 10, Red wide gamut, and it was legacy mode. We did not use IPP2.”

    I don’t believe it’s possible to decode Redcode to Red Wide Gamut through legacy. RWG color space by definition is IPP2. I believe he misremembers a detail.

  2. Daniel J Restuccio

    According to Red Tech Support: “You can select both REDWideGamutRGB and Log3G10 within the Legacy Pipeline (if the firmware is up to date), but it will not have the Output Transform that is applied when using IPP2. REDWideGamutRGB (RWG) is our latest color space that IPP2 uses but is just one component of IPP2. You can find more information about IPP2 HERE.”

