Light Iron and the series’ post producer on shooting the show to make it look like a Zoom project.
By Daniel Restuccio
Netflix’s Social Distance is a new COVID-themed series brought to you by creator and executive producer Hilary Weisman Graham (Orange Is the New Black). For this eight-part fictional anthology series, Graham and her team focused on how lives have been changed during the isolation of the pandemic. It shows how people and their families are coping during this very stressful time, and while there is sadness and loneliness, there are laughs and acts of love.
Each episode features people interacting on what appears to be a Zoom call, but there are no blurry pictures or dropped audio. In fact, the series features production values that rival any traditionally produced show, thanks to some slick and clever producing and post workflows.
The series started shooting with the Canon C300 Mark II (XF-AVC, 4096×2160, 23.98fps) and then switched to iPhone 11s (3840×2160) using the Filmic Pro app with its log recording add-on. Episodes 101 and 108 were shot mostly on the C300, with some iPhone footage. The rest were shot almost entirely on iPhones.
Adding to the series’ production value was a significant amount of visual effects: 388 VFX shots and 51 unique graphic interfaces across the eight parts. Light Iron supervising colorist Ian Vertovec and Mr. Wolf VFX supervisor Mike Pryor used a unique workflow in which the final VFX were dropped into the conform and then graded.
To find out how this show got made, we caught up with Light Iron’s head of workflow strategy, Katie Fellion, Vertovec and the show’s post producer, Ashley Glazier, to get the lowdown on some of the behind-the-scenes technology that made the eight 22-minute episodes possible during a tight June-October production schedule.
How did DPs Mark Schwartzbard, Alison Kelly and Pedro Luque give instructions to the actors on how to shoot and light themselves?
Ashley Glazier: The prep was handled over many Zoom sessions with the actors and each department. During location scouts over Zoom, our DPs would judge the best light within the actors’ houses to determine if any small lights needed to be sent to them.
When recording on the Canon C300, many of the camera’s settings could be controlled remotely by our DPs or camera assistants. When we shot on the iPhone, we used Filmic Pro, and the DP would have the phone set up for the actor and give any instructions on settings over Zoom.
How was location audio accomplished?
Glazier: We started off sending actors a small audio kit that could be monitored by our sound mixer. Once restrictions eased somewhat in places like New York, we were able to send a sound mixer to set up outside the location and monitor sound quality there.
How did the actors get their footage to Light Iron?
Glazier: We received our dailies over Aspera. Light Iron would upload footage there for all of our editors and AEs to have access to. All editorial was working from home and they were cutting on Avid Media Composer.
Fellion: Our LA, New York and Atlanta facilities received cards (or phones) shipped in from the physical production location. At each Light Iron location, we media-managed to our local servers (Quantum StorNext SAN volume) and then digitally transferred the OCF to a consolidated dataset based at the LA facility. That spinning-disk online storage was subsequently used to expedite VFX pulls and conform episodes.
What computer system and software did Light Iron dailies colorist Greg Pastore use to process dailies?
Fellion: Our dailies colorist Greg Pastore used Colorfront’s Express Dailies to color and process dailies. Archive and data management was handled by YoYotta.
How did those dailies get to editors Tyler Cook, Amy Fleming and Liza Cardinale?
Fellion: We delivered dailies electronically to the editorial team via a Signiant MediaShuttle link.
The show appears to take place over Zoom. How much of this is VFX illusion? Were all the computer interfaces prebuilt composites?
Vertovec: Yes. Mr. Wolf built all the computer interfaces. I built an input LUT for each camera so they could work in a single Log working space, and an inverse sRGB transform so graphics made in sRGB could also be comped in Log. We worked this way for about two episodes, but when they switched to all Rec. 709 cameras (iPhones), we did those episodes’ comps in pure Rec. 709.
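For readers curious what that kind of conversion involves, here is a minimal sketch of bringing a display-referred sRGB graphic into a log working space. The sRGB decode follows IEC 61966-2-1; the log curve below is a deliberately generic stand-in (not the show's actual camera log or LUT), purely to show why an inverse transform is needed before graphics can sit next to log camera footage:

```python
import numpy as np

def srgb_eotf(v):
    """Decode display-referred sRGB code values (0-1) to linear light (IEC 61966-2-1)."""
    v = np.asarray(v, dtype=np.float64)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def toy_log_encode(lin, mid_gray=0.18, stops_below=8.0, stops_above=8.0):
    """Generic stand-in log curve: map scene-linear values onto a 0-1 log signal.
    A real pipeline would use the camera's own log encoding here."""
    stops = np.log2(np.maximum(lin, 1e-6) / mid_gray)
    return np.clip((stops + stops_below) / (stops_below + stops_above), 0.0, 1.0)

# An sRGB graphic's "white" (code value 1.0) decodes to linear 1.0 and lands well
# below log "white", so it composites plausibly alongside log-encoded footage.
graphic_srgb = np.array([0.0, 0.5, 1.0])
print(toy_log_encode(srgb_eotf(graphic_srgb)))
```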
Were those composites already baked before they were handed off to color correction?
Vertovec: Everything was flattened by Mr. Wolf and delivered as EXR with embedded mattes. There were a lot of mattes — every video layer and graphic was its own element. I think the most on one shot was 54 mattes.
One of the main challenges was managing the Dolby Vision SDR downconvert. We were working in HDR, as it is a Netflix original, but Dolby Vision is designed to downconvert natural-looking photography from HDR to SDR. However, because most people’s computer interfaces have a lot of white space and bright areas, the Dolby Vision algorithm was constantly trying to darken the whole frame. It took a lot of tweaking to get the balance just right on the SDR.
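As a rough illustration of why bright interface white space fights an HDR-to-SDR downconversion, consider simple per-frame luminance statistics on made-up numbers. This is not Dolby's actual analysis; it only shows how a mostly-white frame reads as "very bright" and gets pulled down:

```python
import numpy as np

def frame_stats(nits):
    """Min / average / max luminance of a frame, in nits."""
    return float(nits.min()), float(nits.mean()), float(nits.max())

# Hypothetical frame: a small face tile next to a large, bright white browser window.
face = np.full((400, 400), 40.0)     # skin tones sitting around 40 nits (assumed)
ui = np.full((400, 1200), 350.0)     # white interface chrome pushed bright in HDR (assumed)
frame = np.concatenate([face, ui], axis=1)

# The white UI dominates the average, so a naive HDR->SDR mapping wants to
# darken the whole frame (including the face) to fit it into 100 nits.
print(frame_stats(frame))   # (40.0, 272.5, 350.0)
```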
Can you talk about the Mr. Wolf composite of the first and last shot of Episode 101 — a 30-person Zoom call with approximately 54 mattes?
Vertovec: They did a wonderful job. There was a lot of grading on the Zoom call scenes, but the scenes with overlapping computer-screen windows (surfing the web or social media) were always much more complicated, as they usually involved adding and subtracting multiple mattes to get the combination just right.
Because some windows were overlaid by another window or graphic, Mr. Wolf couldn’t give me every combination of every matte. So mostly they would deliver the basic shapes, and I would add, subtract and invert those elements into the exact combination I needed. I was able to build most of the mattes that way; however, one or two combinations were too complex, and I needed Mr. Wolf to make them specially.
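A minimal sketch of that kind of matte arithmetic (illustrative NumPy only, not Baselight's actual tools), assuming mattes are single-channel 0-1 arrays:

```python
import numpy as np

def union(a, b):
    """Combine two mattes: select pixels covered by either."""
    return np.maximum(a, b)

def subtract(a, b):
    """Knock matte b out of matte a, e.g. remove an overlapping window."""
    return np.clip(a - b, 0.0, 1.0)

def invert(a):
    """Select everything outside the matte."""
    return 1.0 - a

# Example: isolate one Zoom tile that is partially covered by a pop-up graphic.
tile = np.zeros((10, 10)); tile[2:8, 2:8] = 1.0       # the tile's basic shape
popup = np.zeros((10, 10)); popup[5:10, 5:10] = 1.0   # the graphic overlapping it
visible_tile = subtract(tile, popup)                  # grade only what is actually seen
```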
What is your setup at home?
Vertovec: I have a FilmLight Baselight One with a 250TB SAS RAID (from Maxx Digital), a FilmLight Blackboard 2 and a Sony X300 monitor. For the most part, this is a great system for grading at home. However, Social Distance was quite “heavy” in terms of playing back the dozens of mattes on every shot, so I set up a socially distant suite at the Light Iron office. That let me use a Baselight Two (more horsepower) for the review sessions with the producers, which ran over Streambox with up to 10 people joining from their own homes on iPads.
For the uninitiated, what’s the difference between a standard color grade and an HDR color grade?
Vertovec: An HDR grade is done in a color space designed for display systems capable of much higher maximum brightness. Instead of just showing an overall brighter picture on a brighter display, HDR displays are meant to show more image detail in those brighter areas, while the normal and darker areas of the image remain the same.
The term “dynamic range” refers to the difference between the darkest and the brightest image detail. Modern cameras actually capture more dynamic range than most SDR TVs can display; HDR is a display technology that delivers a much closer representation of what the camera actually recorded.
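For a concrete sense of the numbers, dynamic range can be expressed in photographic stops, i.e. doublings of light. The figures below are illustrative only, not measurements from this show:

```python
import math

def dynamic_range_stops(brightest_nits, darkest_nits):
    """Dynamic range in stops between the brightest and darkest displayable detail."""
    return math.log2(brightest_nits / darkest_nits)

# An HDR grade peaking at 1000 nits with detail down to ~0.005 nits:
print(round(dynamic_range_stops(1000, 0.005), 1))   # ~17.6 stops
# An SDR display peaking at 100 nits with a ~0.1-nit black:
print(round(dynamic_range_stops(100, 0.1), 1))      # ~10.0 stops
```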
Production started in late June. When did you get the pilot and subsequent episodes to grade, and how long did it take to grade them?
Vertovec: I think we graded in July and August. I had six unsupervised hours per episode, and each week we had a half-day review where we played down two episodes for the producers.
Who set the “look” of the show? The director? The cinematographer?
Vertovec: I had a few conversations with Pedro (Luque), as he was the first DP on board. I built a show LUT from those conversations and sent it to Light Iron dailies colorist Greg Pastore. Pedro and I talked about trying to get the show to still look “real” but also a little bit special, so I added a slight softness to the skin tones and softened the tone with some cool shadows. Just because we wanted the show to look “real” didn’t mean we had to make it look stressed or deteriorated in some way. People shoot wonderful-looking material on their phones every day.
For mastering, Netflix recommends that monitors be set to, and masters stored in, the P3-D65 color space rather than Rec. 2020. Again, for the uninitiated, why do that?
Vertovec: There are actually no monitors that can fully display the entire Rec. 2020 gamut. What most displays do is show as much of the information as they can and naturally clip any image information beyond their capabilities.
So while Rec. 2020 is a specification designed to be future-proof for many years to come, it would be bad practice to grade in a color gamut that you could not actually visualize on your device; there could be artifacts in your image that your display is clipping, which the colorist would not be able to control. So best practice is to grade in P3-D65 and deliver either P3-D65 files or Rec. 2020 files limited to P3-D65.
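One way to picture a “Rec. 2020 file limited to P3-D65” is that the colors are graded and monitored in P3-D65 and only re-encoded with Rec. 2020 primaries for delivery, so nothing outside P3 ever exists in the master. A sketch using the colour-science Python package, assuming its built-in 'P3-D65' and 'ITU-R BT.2020' colourspace definitions:

```python
import numpy as np
import colour  # pip install colour-science

p3_d65 = colour.RGB_COLOURSPACES['P3-D65']
bt2020 = colour.RGB_COLOURSPACES['ITU-R BT.2020']

# A linear P3-D65 pixel from the grade, re-expressed with Rec. 2020 primaries
# for the deliverable. Values stay inside [0, 1] because P3 fits inside 2020.
rgb_p3 = np.array([0.80, 0.25, 0.10])
rgb_2020 = colour.RGB_to_RGB(rgb_p3, p3_d65, bt2020)
print(rgb_2020)
```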
Were there multiple deliverables to Netflix? Was there a separate grade for non-HDR folks?
Vertovec: This is the whole idea behind Dolby Vision — we only have one deliverable. We grade the HDR master at PQ 1000 nits and create trim metadata for dynamic ranges below 1000 nits. So if your TV is capable of doing only 100 nits or 200 nits, the Dolby Vision embedded metadata will tell your TV how to display the show in the most optimum way. Regular non-HDR (SDR) is standardized at 100 nits.
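Those nit levels map to specific code values through the PQ (SMPTE ST 2084) curve the HDR master is encoded with. Here is a small sketch of the standard PQ inverse EOTF, mapping absolute luminance to a normalized signal:

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    """Map absolute luminance in nits (0-10,000) to a normalized PQ signal."""
    y = np.asarray(nits, dtype=np.float64) / 10000.0
    return ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2

# SDR reference white (100 nits) and this show's HDR peak (1000 nits)
print(pq_encode([100, 1000]))   # roughly [0.508, 0.752]
```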
Were you grading mixed-footage episodes? Can you talk about grading the C300 footage versus iPhone Log footage?
Vertovec: Baselight has a very in-depth color-management architecture, so no matter what input color space or luma encoding it’s given, it will transform that data to a chosen working color space and luma encoding.
With this color-management system in place, it is very easy to work with material from different sources and set looks across formats — with the main difference being the iPhone footage. Even though the Filmic Pro app was set to Log recording, I found the files behaved much more like flat-pass Rec. 709 than true scene-referred Log. So even though we were working in HDR, they were effectively SDR source files that we upconverted to HDR as part of the grade.
How does a display format like HDR make for better storytelling in general and on this show in particular?
Vertovec: I am a huge fan of HDR as a medium and of its ability to create a more immersive experience. Having said that, this show in particular is about people connecting — or failing to connect — over their devices. That is, over SDR devices. So I think we downplayed our extended range and tried to stay honest to the experience the characters in our show would have had.
Dan Restuccio is a writer/director with Realwork Entertainment and part of the Visual Arts faculty at California Lutheran University. He is a former Disney Imagineer. You can reach him at dansweb451@gmail.com.