
Turning LA into NYC for Apple TV+’s The Morning Show

By Daniel Restuccio

If you watched the Apple TV+ series The Morning Show and believed the New York City scenes were shot in the Big Apple, that makes co-producer and visual effects supervisor Marc Côté very happy. In fact, the vast majority of the show was shot in Los Angeles, accounting for a big chunk of the 3,300-plus visual effects in the first season. “That was the goal from the beginning,” says Côté. “We wanted to give the impression that everything was completely done in New York.”

L-R: Marc Côté and Steve Bannerman

For those of you who haven’t seen the series, it follows a Today-like morning show whose male co-host (Steve Carell as Mitch) is embroiled in a sexual harassment scandal. This leaves his popular co-host (Jennifer Aniston as Alex) scrambling to control the damage while using the opportunity to fill the empty seat next to her with another woman (Reese Witherspoon as Bradley).

Côté and his partner Steve Bannerman recently merged their two DI, editorial and VFX companies — Montreal’s Real by Fake and LA’s Local Hero — which will soon be known as Real by Fake LA. Their credits include Sharp Objects, Little Fires Everywhere, Watchmen, Runaways: Season 3, Black Summer and Barkskins. The duo was tapped for The Morning Show by production designer John Paino and executive producer/actor Witherspoon, having previously worked with them on Big Little Lies and the feature film Wild.

For its work on the series, Real by Fake provided the production with turnkey post by bundling editorial, audio post and visual effects services. To facilitate the workflow, Real by Fake connected its two main locations by a high-bandwidth Teradici PCoIP system, making all its servers seamlessly accessible to each other. They also used an ftrack/cineSync setup for client review, as well as a Haivision Makito Rec. 2020 4K review system for DI and realtime, full-quality reviews. (Check out the studio’s The Morning Show reel here.)


The show’s 3,300-plus VFX shots broke down into categories:
• 3D/2.5D — set extensions for the replacement of landscapes, windows, buildings, streets, cars, etc.
• CG and 2D elements — enhancing the environment, fire, sky, cityscape, matte paintings.
• An LED car rig — using stock footage and VFX plates.
• EditAdvance — Real by Fake’s proprietary technology for morphing, invisible cuts, combined performance layers, repositions and retimes.
• Location and period cleanup and enhancement, plus monitor/screen replacements.
• LED moiré pattern effect cleanup.

All shots were addressed in the offline. This gave the director’s cut pre-comps to enhance creative decision-making and to prevent viewers of the cut from being thrown out of the story. All plate shoots were planned and shot as needed throughout principal photography. Heavy CG shots were organized by deadline, followed by comp-only shots (those not needing CG or matte-painting elements), then shots with mattes. Shots needing more stages of approval or defined looks were given higher priority.


The VFX team used a combination of Autodesk Maya, Side Effects Houdini, Foundry Nuke and Blender to build all the shots. It took a team of over 100 artists five months — from April to August 2019 — to deliver.

“We found something very interesting in Blender,” explains Côté. “You can do previews in real time, so the fluidity and the simplicity of the interface reminds me of my time working with Softimage. Blender has come a long way since its creation, and it has reached the necessary level of maturity to be used in a production pipeline.”

One of director Mimi Leder’s requirements was that the locations be as visually accurate as possible. So Côté flew to NYC, where he supervised photogrammetry of all locations (taken at different times of day) and helicopter flyovers of the Manhattan skyline. He wanted to create 3D environments in Maya that matched the scripted locations architecturally. This process allowed accurate dimensions and data to be imported into the real-time engine (Blender) to previz all the scenes. The team completed a full day of helicopter flights over Manhattan to shoot establishing shots (using an ARRI Alexa Mini) and to capture background plates and photo surveys. All external locations (Alex’s penthouse, the UBA building and Bryant Park) were surveyed to build an accurate 3D representation of the surrounding buildings and to capture the different lighting ambiences.
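For a sense of what that previz step might look like, here is a minimal, hypothetical Blender Python sketch; the survey file path, focal length and sensor width are placeholders, not production data.

```python
import bpy

# Hypothetical previz setup: bring in a photogrammetry survey mesh and add a
# camera with real-world sensor/lens values so framing reads correctly.
# (Blender 4.x OBJ importer shown; older versions use bpy.ops.import_scene.obj.)
bpy.ops.wm.obj_import(filepath="/surveys/bryant_park_block.obj")

cam_data = bpy.data.cameras.new("previz_cam")
cam_data.sensor_width = 36.0     # mm, placeholder sensor width
cam_data.lens = 50.0             # mm, placeholder focal length

cam_obj = bpy.data.objects.new("previz_cam", cam_data)
bpy.context.scene.collection.objects.link(cam_obj)
bpy.context.scene.camera = cam_obj

# Keep the previz light enough to play back in the real-time viewport
scene = bpy.context.scene
scene.render.resolution_x = 1920
scene.render.resolution_y = 1080
```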

One of the big challenges with this show was that it was shot in 8K on the Panavision Millennium DXL2 with Panavision Primo 70 lenses. Normally for episodics, visual effects are produced at 1920×1080 and routinely upscaled for a 4K master. The visual effects workflow on The Morning Show was full 4K HDR.

4K Editing
“To simplify the multi-location workflow and the realtime render-sharing of all editors (six editors and six assistants), we decided to downscale all the raw footage to DNX36 during the dailies process,” says Bannerman.
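For readers unfamiliar with that step, here is a minimal sketch, not the show’s actual dailies tool, of what a DNX36 proxy transcode can look like using ffmpeg driven from Python; the file names and frame rate are placeholders.

```python
import subprocess

# Hypothetical dailies helper: transcode a camera original to an Avid-friendly
# DNxHD 36 offline proxy (1920x1080, 8-bit 4:2:2, ~36 Mbps) with PCM audio.
def make_dnx36_proxy(src: str, dst: str, fps: str = "24000/1001") -> None:
    subprocess.run(
        [
            "ffmpeg", "-i", src,
            "-c:v", "dnxhd", "-b:v", "36M",      # DNxHD 36 bitrate target
            "-s", "1920x1080", "-pix_fmt", "yuv422p",
            "-r", fps,                           # 23.976 fps placeholder
            "-c:a", "pcm_s16le",                 # uncompressed audio for the Avid
            dst,
        ],
        check=True,
    )

# Placeholder file names for illustration only
make_dnx36_proxy("A001_C001_camera_original.mov", "A001_C001_proxy.mxf")
```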

“There are lots of times that the viewer sees cameras on the set of TMS,” explains Bannerman. “Many times they see an image on the screen of the camera itself (as though they were the operator). Those cameras were Ursa Mini Pros. Sometimes the feed from the Ursa Mini Pros would also show up on monitors that viewers can see around the studio.”

All six editors — Carole Kravetz Aykanian, Vikash Patel, Ron Rosen, Elliott Eisman, Peter Ellis and Aleshka Ferrero — worked from the Sony lot connected to the two main hubs in Montreal and Los Angeles. Both hubs had the same footage.

4K VFX Workflow
Côté and Bannerman had previously used a 4K VFX workflow on Netflix’s Black Summer. “What we learned on that show,” says Bannerman, “was how to set up a full Dolby Vision 4K HDR VFX and DI pipeline. For The Morning Show, all VFX were done in the original format of the cameras (8K, 4.6K, 2K) using our own pipeline IP, so the compositors could work on the material in the correct color space. Only at the time of output was the material reformatted into the final 4K (4096) resolution and ACES color space.”
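As a rough illustration of that kind of output color conversion (and not Real by Fake’s actual pipeline code), the sketch below uses PyOpenColorIO with an ACES OCIO config assumed to be active via the OCIO environment variable; the colorspace names come from the stock ACES config and would differ in a studio-specific setup.

```python
import PyOpenColorIO as ocio

# Hedged sketch: express a working-space value in ACES2065-1 for delivery.
# The colorspace names are assumptions taken from a stock ACES config.
config = ocio.GetCurrentConfig()            # reads the config pointed to by $OCIO
processor = config.getProcessor("ACES - ACEScct", "ACES - ACES2065-1")
cpu = processor.getDefaultCPUProcessor()

pixel = [0.18, 0.18, 0.18]                  # mid-grey in the working space
print(cpu.applyRGB(pixel))                  # the same value in ACES2065-1
```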

During the entire process, Real by Fake had all the source material available at its office in Montreal. All the pre-comps for the offline were done directly on the original source rather than on the DNX36 media in Avid Media Composer, which was used to edit the show (with Avid Nexis for storage). This ensured the team was always moving forward on a shot and not doubling the work.

Combining its offline and VFX expertise allowed Real by Fake to create temp VFX shots requested by editorial and replace the effects directly in the Media Composer bins, making the process seamless for the editors and facilitating creative decision-making. Unlike a normal VFX production, where shots are started only once the picture is locked and there is no way to see the global dynamics of a shot, Real by Fake constantly fed the offline editors pre-comps while proposing cost-saving alternatives. For example, the team routinely combined takes to reconstruct a scene that never actually existed in camera to aid the storytelling process.

“Having access to all the raw footage was a must for this project,” says Côté. “On another more recent project, we were saved during the COVID quarantine because we had all the raw footage, and we were able to do our VFX pulls using Foundry’s Hiero while Company 3 was closed.”


Car Shots
One of the most challenging categories of shots for The Morning Show was the driving-in-the-car scenes. “New York City at night is a light show,” explains Côté. “Traditionally, you put people in the car on a greenscreen background and have grips with flashlights trying to put some light inside the car. Instead, we set up computer-controlled LED panels all around the car. We also decided to use greenscreen because of the contrast ratio in the night shots and mostly because we wanted to give production as much latitude in the offline process to choose the right background for their needs. What you see behind the glass inside the car is greenscreen, but the light that was coming inside was from the LED panels. We had this nice interaction and movement flashing the light inside on the faces of the actors and on all the reflective material.”

Matching Lenses
The team did many tests to try to match the lenses and the bokeh that the DP was getting in-camera. “If you have a shallow depth of field, obviously your eye is drawn toward the actor or the action you want. This allows you to take the background out of focus. But if you have bokeh, they need to be sharp, so you need to match them accurately. We created a recipe that matched all of the lenses in Foundry’s Nuke. The script was directly connected to the 3D depth, defocus and lens size to perfectly match its sharpness. We added the lens aberration and dust to maintain the illusion. What we see is always imperfect, and to reproduce it makes it real.”
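The relationship Côté describes, blur that scales with scene depth, focus distance, focal length and aperture, is thin-lens optics. Below is a small, self-contained sketch of that math; the sensor width and lens values are illustrative approximations, not the show’s actual Nuke recipe.

```python
# Thin-lens circle of confusion: how large an out-of-focus point renders,
# as a function of scene depth. All camera values below are placeholders.
def coc_pixels(depth_m, focus_m, focal_mm, f_stop, sensor_w_mm=41.0, image_w_px=4096):
    """Blur diameter in pixels for an object at depth_m metres."""
    f = focal_mm / 1000.0                     # focal length in metres
    aperture = f / f_stop                     # entrance pupil diameter in metres
    coc_m = aperture * (f / (focus_m - f)) * abs(depth_m - focus_m) / depth_m
    return coc_m * 1000.0 / sensor_w_mm * image_w_px

# Example: 75mm lens at T2.0 focused at 2m, with background lights at 12m
print(round(coc_pixels(12.0, 2.0, 75.0, 2.0), 1), "px of defocus")
```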

California Burning
Besides the sheer scale of the show and its limited turnaround time, one of the greatest challenges was burning California in Episode 6. Part of the problem was that just a few weeks before the shoot, there was an actual wildfire at the shooting location. By the time they started shooting, the formerly charred ground was already turning green again.

“All the actors you see, all the surrounding landscape, everything is a matte painting or 3D CG ground,” explains Côté. “We also comped in a few fire rigs, but most of the material is either footage we shot with cinematographer Dave Stump or CG particles with Houdini, creating fire and smoke — whatever needed to be done to be able to make it look like real fire. When you work on these types of shots, elements are all affected in different ways by the depth of particles. We applied 3D photogrammetry depth-tracking to create a more realistic smoke haze through all shots. We were lucky that John Paino provided us a ton of references and surveys from the actual wildfires so we could have as much info as possible. The beach scene was inspired by actual images from the famous Woolsey wildfire.”
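As a rough illustration of depth-weighted haze (and not the team’s actual Houdini or compositing setup), the sketch below blends a plate toward a smoke color based on a matching depth pass, using a simple exponential falloff; the arrays and constants are placeholders.

```python
import numpy as np

# Hypothetical depth-driven haze: the farther a pixel is from camera, the more
# it is pulled toward a flat smoke color. Plate and depth pass are dummy data.
def apply_haze(rgb, depth, haze_color=(0.55, 0.50, 0.45), density=0.02):
    fog = 1.0 - np.exp(-density * depth)[..., np.newaxis]   # 0 near camera, approaches 1 far away
    return rgb * (1.0 - fog) + np.asarray(haze_color) * fog

# A four-pixel strip whose depth runs from 1m to 200m
rgb = np.full((1, 4, 3), 0.2)
depth = np.array([[1.0, 25.0, 100.0, 200.0]])
print(apply_haze(rgb, depth))
```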

Realistic VFX
Côté says the challenge of creating realistic VFX is being able to observe the world on a day-to-day basis, so that when you’re looking at visual effects you can say, “This makes sense. This has the right perspective. How can I help with technology or with physics to achieve something that will look great?”

“After that, the director comes and says, ‘Ah, we’re making a movie, so let’s push the envelope.’ The ultimate experience is to combine production and VFX together in harmony during preproduction, production and post,” says Côté. “We need producers and unit production managers that understand that cost-effectiveness comes from the harmony between production design and VFX. Technologies move fast, and everyone needs to keep an open mind to be able to bring the vision of the directors and screenwriters to life.”

There will be a Season 2 of The Morning Show but, like all productions right now, no one knows when.


Dan Restuccio is a writer/director with Realwork Entertainment and part of the Visual Arts faculty at California Lutheran University. He is a former Disney Imagineer. You can reach him at dansweb451@gmail.com.

