Sony Pictures’ Missing is a story told almost entirely through computer screens and smartphones. This mystery thriller, streaming now, was directed by Nicholas D. Johnson and Will Merrick and follows a teen girl named June (Storm Reid), whose mother (Nia Long) goes missing while on vacation with her new boyfriend (Ken Leung). Stuck thousands of miles away in LA, June uses technology to find her mom before it’s too late.
The filmmakers relied on cloud-based and AI-powered tech to tell their story. Editors Austin Keeling and Arielle Zakowski chose Adobe Premiere Pro, After Effects and Frame.io to edit, build shots and design thousands of graphics in parallel. The complex and VFX-heavy workflow was custom-built to make the audience feel as if they’re logging in, clicking and typing along with the characters in real time.
Let’s find out more from the editors…
How early did you get involved in the film?
Austin Keeling: We both got started about six months before principal photography began, so we were some of the earliest crew members involved. We spent those first months creating a previz of the entire film by taking temp screenshots of apps on our own computers (we were working from home at the time) and building each scene from scratch.
The directors would take pictures of themselves and record themselves saying all the lines, and we would slot those into the previz timeline to create a sort of animated storyboard of each scene. By the time we were done, we had a completely watchable version of the entire movie. This was a great time to test out the script to see what was working and improve things that weren’t. Nick and Will were still writing at the time, so they were able to incorporate discoveries made in the previz stage into the final script.
This is not your typical film. What were the challenges of telling the story through screens and smartphones?
Arielle Zakowski: This film was unlike anything either of us had ever worked on before. At first the challenges were mostly technical ones. We hadn’t had much experience with Adobe After Effects, so we had to teach ourselves to use it pretty quickly. And none of the film is actually screen-recorded — it’s all built manually out of layered assets (desktop background, Chrome windows, various apps, mouse, etc.), so in some scenes, we were juggling up to 40 layers of graphics.
Once we became comfortable with the technical side of the process, we really dove into the challenges imposed by the unique screen perspective. It gave us a whole new set of tools and cinematic language to play with — building tension with nothing more than a mouse move, for example, or conveying a character’s emotion simply through how they type a message. Ultimately the limitations of the computer screen pushed us toward increasingly creative storytelling choices along the way.
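For the technically curious, the layered build Zakowski describes can be pictured as a simple compositing stack: every element of the fake desktop sits on its own layer and gets composited bottom to top. Here is a minimal Python/Pillow sketch of the idea. It is our illustration, not the production’s After Effects setup, and the asset file names and positions are hypothetical.

```python
# Minimal sketch of a "screen" frame assembled from stacked assets,
# in the spirit of the 40-layer builds described above.
# Asset names and positions are hypothetical placeholders.
from PIL import Image

# Bottom-to-top layer stack: (asset file, top-left position).
LAYERS = [
    ("desktop_background.png", (0, 0)),
    ("chrome_window.png", (120, 60)),
    ("facetime_window.png", (980, 160)),
    ("cursor.png", (640, 410)),
]

def composite_screen(size=(1920, 1080)):
    """Alpha-composite each layer over the ones below it."""
    frame = Image.new("RGBA", size, (18, 18, 18, 255))
    for path, pos in LAYERS:
        layer = Image.open(path).convert("RGBA")
        frame.alpha_composite(layer, dest=pos)  # in-place paste with alpha
    return frame

if __name__ == "__main__":
    composite_screen().save("screen_frame.png")
```

Scale that stack up to dozens of app windows, notifications and cursors per scene and you get a sense of what the editors were juggling.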
What direction were you given by Will and Nick?
Keeling: They were very much involved in the post process from day one. They had already edited the previous film in this series, Searching, so we leaned heavily on them in learning the screen-film workflow.
In the previz stage, each of us would take a scene and build it from scratch and then send it to the directors for notes.
From that point on, it became a constant collaboration, and when we moved into a traditional office after principal photography, the directors were with us in the editing rooms every day. They wanted this film to feel bigger than Searching in every way, so they really encouraged us to try new things in the pacing, coverage, transitions, etc. They had a wealth of knowledge about how to tell a screen-life story, so working with them was creatively inspiring.
Was the footage shot traditionally and then put into the screens? If traditionally, was it then treated to look like it’s on phones?
Zakowski: All the footage was shot traditionally and then added into the screen graphics in post. Our cinematographer Steven Holleran used a total of eight different cameras to create a realistic feel for the multiple video outputs we see in the news footage, FaceTime calls, security cameras and iPhones.
Once the footage was incorporated into the graphical elements, we added compression and glitches to some of the footage to further replicate the experience of seeing footage on a laptop screen.
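A rough way to picture that degrade pass: round-trip clean footage through a starved, low-bitrate H.264 encode so the compression artifacts do the work. The sketch below shells out to ffmpeg from Python; it is our own approximation of the look, not the film’s actual treatment, which was built in After Effects.

```python
# Illustrative only: approximate "seen on a laptop" compression by
# round-tripping footage through a harsh low-bitrate H.264 encode.
# Requires ffmpeg on the PATH; the film's real pass was done in After Effects.
import subprocess

def crunch(src: str, dst: str, bitrate: str = "250k") -> None:
    """Re-encode src at webcam-grade quality and write it to dst."""
    subprocess.run(
        [
            "ffmpeg", "-y", "-i", src,
            "-vf", "scale=640:-2",   # drop resolution like a video call
            "-c:v", "libx264",
            "-b:v", bitrate,         # starve the encoder for blocky artifacts
            "-maxrate", bitrate,
            "-bufsize", "125k",      # tiny buffer forces visible quality swings
            "-an",                   # audio is out of scope for this sketch
            dst,
        ],
        check=True,
    )

if __name__ == "__main__":
    crunch("facetime_take.mov", "facetime_take_crunched.mp4")
```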
There is a lot happening on the screen. How did you balance all of it to make sure it wasn’t distracting to the viewer?
Keeling: This is partly why editing a screen movie takes so much time. We built the entire computer desktop in a wide shot for each scene and then used adjustment layers to create pans, zooms and close-up shots.
We essentially got to choose how to “cover” each scene in the edit, which allowed for nearly endless possibilities. We were able to tweak and alter the scenes in tons of ways that aren’t possible in a traditional film. We relied a lot on feedback to make sure that the audience wouldn’t get lost along the way. Through multiple test screenings, we were able to figure out which beats were distracting or unclear and then push to find the simplest, most effective way of telling the story.
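To make the “coverage in the edit” idea concrete: a punch-in from the wide desktop master is, at its core, an animated crop that gets rescaled to delivery resolution. Here is a toy Python/Pillow version; the easing curve, frame count and framings are our own stand-ins for the adjustment-layer moves Keeling describes.

```python
# Toy version of "covering" a scene from one wide desktop master:
# a camera move is just a crop that eases from a wide framing to a
# close-up, with each crop rescaled to the output size.
from PIL import Image

def ease(t: float) -> float:
    """Smoothstep easing so the move settles instead of stopping dead."""
    return t * t * (3 - 2 * t)

def lerp(a: float, b: float, t: float) -> float:
    return a + (b - a) * t

def punch_in(master_path: str, start_box, end_box, frames: int = 48) -> None:
    """Render a zoom from start_box to end_box (left, top, right, bottom)."""
    master = Image.open(master_path).convert("RGB")
    for i in range(frames):
        t = ease(i / (frames - 1))
        box = tuple(int(lerp(a, b, t)) for a, b in zip(start_box, end_box))
        master.crop(box).resize((1920, 1080)).save(f"frame_{i:03d}.jpg")

if __name__ == "__main__":
    # Hypothetical framings: full desktop down to a single chat window.
    punch_in("desktop_master.png", (0, 0, 1920, 1080), (600, 300, 1240, 660))
```

Because the master stays untouched, every “shot” can be reframed at any time, which is what makes the endless re-coverage the editors mention possible.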
Do you think the story could have been told in a more traditional way? How did the use of screens and phones help ramp up the drama/mystery/suspense?
Zakowski: The mystery at the core of this movie is thrilling enough that it could probably work in a traditional format, but we think the screen-life storytelling elevates this film into something unique and timely. Watching someone dig around on the internet isn’t inherently thrilling, but by putting the audience in June’s POV and letting them find the clues along with her, we’ve created a fully immersive and intimate version of the story.
Probably everyone has felt some dread before opening an email or anticipation while waiting for a phone call to go through. This format allowed us to really explore the relatable emotions we deal with as we use technology every day.
You edited in Premiere. Why was this system the right one to tell this story?
Keeling: We used Adobe Creative Cloud from start to finish. We edited in Premiere Pro using Productions so we could easily move between projects and share timelines with each other and with our assistant editors. All of the final graphics were made in Illustrator and Photoshop. We used Dynamic Link to send the locked film to After Effects, where we added tons of finishing details. And we used Frame.io to share cuts with the directors and the studio, which made it so easy to get notes on scenes.
We needed programs that were intuitive and collaborative, ones that made it possible to move the film seamlessly from one stage to the next.
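For pipeline-minded readers, Frame.io also exposes a REST API alongside the review app the team used. Below is a minimal Python sketch of registering a new cut for upload; the token and asset IDs are placeholders, and the v2 endpoint shape reflects our reading of the public docs rather than anything specific to this production.

```python
# Aside: a minimal Frame.io v2 REST sketch using requests.
# Token and IDs are placeholders; endpoint shape per the public docs,
# not a description of this film's actual review workflow.
import requests

API = "https://api.frame.io/v2"
TOKEN = "fio-u-your-developer-token"  # placeholder
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

def whoami() -> dict:
    """Sanity-check the token against the account endpoint."""
    resp = requests.get(f"{API}/me", headers=HEADERS)
    resp.raise_for_status()
    return resp.json()

def register_cut(parent_asset_id: str, name: str, filesize: int) -> dict:
    """Create a file asset under a project folder; the response
    includes upload URLs for the actual media transfer."""
    resp = requests.post(
        f"{API}/assets/{parent_asset_id}/children",
        headers=HEADERS,
        json={"name": name, "type": "file", "filesize": filesize},
    )
    resp.raise_for_status()
    return resp.json()
```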
Can you talk about using cloud tech and AI on the shots and graphics, and about the tools you used? What was your workflow?
Zakowski: Because we edited the previz during the pandemic, we relied heavily on cloud-based servers to share projects and assets while working from home. We actually used surprisingly few AI tools during the edit — most of the film was created with straightforward, out-of-the-box Adobe products. The unique nature of this film allowed us to use a lot of morph cuts in the FaceTime footage to combine takes and adjust timing.