By Ben Mehlman
Since the news broke in late 2017, Amazon’s gargantuan return to J. R. R. Tolkien’s The Lord of the Rings universe has been one of television’s most anticipated events. Instead of setting The Rings of Power during a familiar era, showrunners Patrick McKay and John D. (JD) Payne set their series during the Second Age of Middle-earth, thousands of years before the events of The Hobbit and The Lord of the Rings. This decision gave the story room to breathe while staying true to Tolkien’s beloved lore.
Helping craft the epic saga’s visual palette is veteran producer Ron Ames, whose long list of credits includes Avatar, Star Trek Into Darkness, Avengers: Age of Ultron and The Aviator. To meet the series’ sophisticated technical needs, Ames knew he had to embrace the cloud for every aspect of the workflow, so he reached out to Blackmagic Design to partner on pipeline options.
Ames recently took time out to talk to us about tackling a show with over 9,000 VFX shots, the partnership with Blackmagic, how he and his team were able to get unlimited cloud storage from Amazon and more.
Tell us about your role as a producer on the project and when you came on board.
I was one of the first hires. Creative executive producer Lindsey Weber and I had worked at J.J. Abrams’ shop, Bad Robot. I was there for a number of years as a VFX producer and post supervisor and had worked with J.J. building methodology. I also knew Patrick and JD as young writers because they worked on Star Trek III for a bit. So when Lindsey got this job, I was one of her first hires.
I was a producer of all the technical departments, everything from camera capture to exhibition — meaning color, VFX, sound, music, editorial were all departments in my purview. Weirdly (laughs), that included prosthetic makeup as well.
Can you talk about your team?
I immediately hired ILM VFX supervisor Jason Smith, post supervisor Jake Rice (who became one of our producers) and Jesse Kobayashi, a super-talented VFX producer. Together that became the core of our technical team.
A big benefit at the beginning was that we were a really small team, and the writers’ room was down the hall from us in the production and technical offices. On top of that, Patrick and JD were really open to the technical filmmaking side, and they truly had faith in the process. They’d write a scene and come talk to me, Jason or the team about how we should do something. We’d then start our research and figure it out.
Scale was one of the biggest things we had to deal with. How were we going to show scale procedurally on-set between tall and short characters in the same frame? We decided to use motion control as a first unit tool, even though no one had ever done that before.
That’s quite the undertaking.
It was. We also had to ask ourselves, “How are we going to deal with 9,000 visual effects shots?” Previously, in terms of VFX, the biggest movie I’d ever done was Avatar, which was about 2,600 VFX shots. Avengers: Age of Ultron was about 2,300. The size and scope of this show was jaw-dropping.
We knew we had to lean into technology and modern filmmaking tools while also staying true to the heart of the story. This included creating real characters and having magic that had physics and rules to it. We even had a person whose entire job was to be our Tolkien expert. We wanted to make sure the creative process wasn’t siloed and that everyone was truly collaborating.
You had to oversee over 12 VFX studios and 1,500 VFX artists to complete those 9,000 shots. What’s that workflow look like? How do you foster collaborative creativity on that scale?
We knew the only way this would work was if we created a standard where we were all in this together. The price of admission to the vendors was that they had to work collaboratively with everyone.
ILM and Weta knew that if they were going to work together, they had to be able to share assets. They created this standard where every asset could be shared, and every other vendor followed their lead. If somebody got stuck, we could help. We created an “all for one and one for all” Lord of the Rings idea of friendship.
Everybody wanted to come to Middle-earth. Some artists even traveled across the world to work at the vendors’ studios. It was a fun way of working, and everyone was so open and collaborative, but we also purposefully cast each company for their particular skillset.
Getting back to your question of how you do that: you build an industrial system that encourages creativity, doesn’t close off artistic discovery, and mechanizes the ability to share, store and keep these processes alive. To do this, we had to be fully cloud-based.
Where did all the assets live?
Again, we knew this production was going to be unlike anything else. So in my first two weeks, I met with the production technology team at Amazon Studios and asked them to help us become fully cloud-based. They thought for a moment and said yes. I then asked, “Can we have unlimited cloud-based storage?” Once again, they said yes.
We were given the keys to the kingdom. AWS (Amazon Web Services) and Amazon Studios knew this workflow would be the wave of the future, but many of the tools weren’t built yet. So we built the tools to do this in the cloud, and everybody had their own S3 bucket they could push and pull their assets to and from. It was 100% secure, and we didn’t have one corrupt frame or lost image.
A small team of engineers and people at Moxion [which was recently purchased by Autodesk] built automated systems to push and pull those assets. We didn’t know how to do it; they didn’t know how to do it. We just said we were going to figure it out, and it was fantastic. They made something that could be managed by a small group of people from afar. Then COVID hit, but we were already built to work collaboratively from home.
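To make that concrete, here is a minimal sketch of what a vendor-side push/pull against an S3 bucket can look like, written in Python with boto3 (the AWS SDK). The bucket name, key layout and checksum step are hypothetical; the production’s actual Moxion-built automation was far more elaborate.

```python
import hashlib
from pathlib import Path

import boto3  # AWS SDK for Python

s3 = boto3.client("s3")
BUCKET = "vendor-assets"  # hypothetical per-vendor S3 bucket


def md5(path: Path) -> str:
    """Checksum a file so corruption in transit can be detected."""
    return hashlib.md5(path.read_bytes()).hexdigest()


def push_asset(local: Path, key: str) -> None:
    """Upload an asset, recording its checksum as S3 object metadata."""
    s3.upload_file(str(local), BUCKET, key,
                   ExtraArgs={"Metadata": {"md5": md5(local)}})


def pull_asset(key: str, dest: Path) -> None:
    """Download an asset and verify it against the stored checksum."""
    dest.parent.mkdir(parents=True, exist_ok=True)
    s3.download_file(BUCKET, key, str(dest))
    stored = s3.head_object(Bucket=BUCKET, Key=key)["Metadata"].get("md5")
    if stored and stored != md5(dest):
        raise IOError(f"Checksum mismatch for {key}")
```

Verifying a checksum on every pull is one simple way a pipeline like this can back up a claim of never delivering a corrupt frame.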
How much storage did you need?
The idea of putting it only in the cloud and it being safe was nerve-wracking to us, but we tested it many times over and it never failed. We still backed it up eight times, which is something I wouldn’t do again. It came out to 8 petabytes of material. We were so insanely redundant.
Let’s talk about your partnership with Blackmagic and what that brought to the workflow.
Something I’ve wanted to do for a long time was to color-correct from afar. The first time I did it was on Shutter Island. Martin Scorsese, DP Robert Richardson (ASC) and I were working from Technicolor NY, while the files were in Burbank with the colorist Yvan Lucas. It was a nightmare because it was impossible to tell if the color was correct.
For this show, I asked Blackmagic and Company 3 to come up with a cloud-based solution where, for security, the files wouldn’t need to live in any one place. Instead, we would keep them in an S3 bucket and point to it. This meant Blackmagic Design basically had to come up with a virtual DeckLink card that took a signal and routed it into a monitor. I then asked if it could be up and running in nine months. They said yes, and it was ready for testing literally nine months to the week.
By the time we started doing color, it was working. In New Zealand, we had the same Sony X310 reference monitor and lighting conditions as our colorist, Skip Kimball, who was based in Idaho and working on Resolve. Plus, we had everyone from Company 3 updating everything in Los Angeles. We did color shot by shot in real time, knowing that we were all looking at the exact same color with no latency or concern about calibration. It was insane.
To my knowledge no one has ever done this, where your original camera files stay securely in the cloud, and you never have to touch them. No local files or local storage for Company 3, which is a giant cost savings. And when we finished the show, we did everything with virtual computers using everyday laptops.
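Blackmagic and Company 3’s actual plumbing is proprietary, but the general pattern of pointing software at media that never leaves the cloud can be sketched with an S3 presigned URL, which grants a client temporary, read-only access to a single object. The bucket and file names below are hypothetical:

```python
import boto3  # AWS SDK for Python

s3 = boto3.client("s3")

# Hypothetical example: give a remote grading station one hour of
# read-only access to a single camera original, without copying it
# out of the bucket or handing out long-lived credentials.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "show-ocf", "Key": "ep101/A001_C002.braw"},
    ExpiresIn=3600,  # seconds
)
print(url)  # the client streams the media directly from S3
```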
I’ve been working with Blackmagic Design for pretty much my whole career, helping design gear, equipment and methodologies. They’ve always been really open to testing and trying stuff, and it wasn’t only color on this one. I told them I wanted to use Blackmagic Design tools in our pipeline from camera capture to finish. We used Blackmagic Pocket Cinema Camera 6Ks as witness cameras, eyeline cameras and even as principal capture cameras on-set for sprite and VFX work. All of our review rooms were completely wired with Blackmagic Design studio equipment, including UltraStudio 4K for playback and Smart Videohubs for routing formats.
We also used Resolve for our VFX editing and conform, allowing us to conform all eight hours of our show as we went without having to wait. Our editors (Cheryl Potter, Jochen FitzHerbert, Stefan Grube, Jaume Martí and Bernat Vilaplana) were able to look at finished episodes. Our whole pipeline was built upon the idea that at any time, we could conform a scene, sequence or episode and show it to the showrunners with appropriate color temps.
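For a rough sense of what scripted conform looks like, DaVinci Resolve exposes a Python scripting API that can import an editorial cut against media already indexed in the media pool. This is a generic sketch, not the show’s pipeline; it assumes Resolve is running with scripting enabled, and the cut file path is hypothetical:

```python
# Sketch only: DaVinciResolveScript ships with Resolve itself,
# not via pip, and requires a running Resolve instance.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
media_pool = project.GetMediaPool()

# Conform a cut by importing the editorial timeline (AAF/EDL/XML)
# against media already indexed in the media pool.
timeline = media_pool.ImportTimelineFromFile("/cuts/ep103_v12.aaf")
if timeline:
    print("Conformed:", timeline.GetName())
```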
That’s the other thing: in addition to the 9,000 VFX shots cooking, we also did thousands of temps through Epic Games’ Unreal Engine and previz. Creating all this seamlessly on the technical side allowed the creative to flow. The director and cinematographer could come into our virtual production room, which we built with The Third Floor, and preshoot their scenes with the previz team acting everything out.
Any other hardware or software that was especially helpful?
We had to track everything in a way that could be easily searched. If you looked up “red bearded dwarf,” you could find everything about Durin. So we used Moxion and ShotGrid. Moxion is a publishing tool, which is one of the reasons Autodesk bought it. We then used ShotGrid as an asset-tracking and VFX-sharing tool. Between those two products, we created a whole system that allowed us to link assets on a number of key parameters, such as day, sequence and scene number.
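As an illustration of that kind of keyword lookup, here is a short sketch against ShotGrid’s Python API (shotgun_api3). The site URL, credentials and searchable fields are hypothetical and would vary with a show’s schema:

```python
import shotgun_api3  # ShotGrid's official Python API

sg = shotgun_api3.Shotgun(
    "https://example.shotgunstudio.com",  # hypothetical site
    script_name="asset_search",           # hypothetical script user
    api_key="REPLACE_ME",
)

# Find every asset whose description mentions the search phrase,
# e.g. "red bearded dwarf" surfacing everything related to Durin.
# Fields beyond "code" and "description" depend on a show's schema.
assets = sg.find(
    "Asset",
    filters=[["description", "contains", "red bearded dwarf"]],
    fields=["code", "description"],
)
for asset in assets:
    print(asset["code"], "-", asset["description"])
```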
The reported budgets on each episode were large but not infinite. How do you balance creating epic VFX while staying on budget?
This is exactly the point. If you’re tracking everything so exquisitely, and the director or showrunner asks about doing something, you can immediately say, “If we do that, you’re going to have to give something else up because we don’t have the budget to finish everything.”
That’s why this kind of control is so important to the creative process. It allows us to say in advance, within a percentage point of accuracy, what the cost of something will be. This then allows the creator to make the most informed decision. It’s not our job to say what’s most important. It’s all important to us. We were tracking where we were across every episode at any given time. Jake Rice was using Airtable to create mosaics of which scenes were completed in an episode, allowing us to move priorities around between episodes.
What we’re doing is a new form of TV, where our VFX has to be as good as any theatrical film. And the proof was in the pudding when we released Episodes 1 and 2 as free theatrical IMAX events. It looked and sounded fantastic in the theater.
There is obviously a lot of CG but also a lot of practical work. How did you decide which to use?
Anything we could do practically, we did — though the bar is very high when doing prosthetic makeup in 4K UHD. We’re up against every seam or set crack. Everything that doesn’t look perfect might need to be touched up with VFX as well.
For prosthetics of the Orcs, we didn’t want them to be cookie cutter. We had the guiding principle that nobody here is fully evil; this was a race of beings, and Tolkien told us they procreate like humans. We had women Orcs, baby Orcs, male Orcs, and we created a whole culture… a harsh culture with its own rules.
The design was then created with that aesthetic in mind, knowing that things like a mask in the background can be animated later to look more real, while somebody in the foreground would have to have makeup as good as we could possibly do it. That’s the approach we took to each character’s look and tone.
What was the most difficult VFX sequence and why?
It was probably Lindon for me. We originally found a fantastic redwood forest that had been planted, with trees 200 feet tall. And since those trees aren’t native to New Zealand, there were no animals. So we had this immaculate forest with amazing lighting that was perfect, but it was three hours away.
So we had to build Lindon on a stage. We came up with the idea of a white tree golden forest. We modeled it after birch trees, quaking aspen and small brush, but you can only build so much on a set. This meant we had to extend in every direction, and that was hard. We had to make the trees shimmer, with light coming through in a way that doesn’t look like studio lights. Every shot, every leaf, every detail was handmade, and it was done spectacularly by ILM.
Another extremely complex sequence, done by Weta, was Khazad-dûm. The exterior was real… the giant cliff with all the waterfalls. We went there in helicopters to shoot background plates. Inside was a partial set of maybe 20 feet, with the rest being a digital extension. We even went underground to do photogrammetry and still photography of these fantastic caves in New Zealand and used that to create the rock formations and everything that is in Khazad-dûm.
Ben Mehlman, currently the post coordinator on the Apple TV+ show Presumed Innocent, is also a writer/director. His script “Whittier” was featured on the 2021 Annual Black List after Mehlman was selected for the 2020 Black List Feature Lab, where he was mentored by Beau Willimon and Jack Thorne.
Content Sponsored by Blackmagic Design.