By Karen Moltenbrey
There was a lot of buzz before — and after — this summer’s release of Disney’s remake of the animated classic The Lion King. And what’s not to love? From the animals to the African savannas, Disney brought the fabled world of Simba to life in what is essentially a “live-action” version of the beloved 1994 2D feature of the same name. Indeed, the filmmakers used tenets of live-action filmmaking to create The Lion King, and themselves call it a visual effects film. However, there are those who consider this remake, like the original, an animated movie, as 2019’s The Lion King used cutting-edge CGI for the photoreal beasts and environments.
Whether you call it “live action” or “animation,” one thing’s for sure. This is no ordinary film. And, it was made using no ordinary production process. Rather, it was filmed entirely in virtual reality. And it’s been nominated for a Best Visual Effects Oscar this year.
“Everything in it is a visual effect, created in the same way that we would make a visual effects-oriented film, where we augment or create the backgrounds or create computer-generated characters for a scene or sequence. But in this case, that spanned the entire movie,” says VFX supervisor Rob Legato. “We used a traditional visual effects pipeline and hired MPC, which is a visual effects studio, not an animation house.”
MPC, which created the animals and environments, crafted all the CG elements and handled the virtual production, working with Magnopus to develop the necessary tools that would take the filmmakers from previz through shooting and, eventually, into post production. Even the location scouting occurred within VR, with Legato, director Jon Favreau and others, including cinematographer Caleb Deschanel, simultaneously walking through the sets and action using HTC Vive headsets.
The Animals and Environments
MPC, known for its photorealistic animals and more, had worked with Disney and Favreau on the 2016 remake of The Jungle Book, which was shot within a total greenscreen environment and used realistic CG characters and sets with the exception of the boy Mowgli. (It also used VR, albeit for previsualization only.) The group’s innovative effort for that work won an Oscar for visual effects. Apparently that was just the tip of the spear, so to speak, as the team upped its game with The Lion King, making the whole production entirely CG and taking the total filmmaking process into virtual reality.
“It had to look as believable as possible. We didn’t want to exaggerate the performances or the facial features, which would make them less realistic,” says Legato of the animal characters in The Lion King.
The CG skeletons were built practically bone for bone to match their real-life counterparts, and the digital fur matched the hair variations of the various species found in nature. The animators, meanwhile, studied the motion of the real-life animals and moved the digital muscles accordingly.
“Your eye picks up when [the animal] is doing something that it can’t really do, like if it stretches its leg too far or doesn’t have the correct weight distribution that’s affecting the other muscles when it puts a paw down,” says Legato, contending that it is almost impossible to tell the CG version of the characters from the real thing in a non-talking shot or a still frame.
To craft the animals and environments, the MPC artists used Autodesk’s Maya as the main animation program, along with SideFX Houdini for water and fire simulations and Pixar’s RenderMan for rendering. MPC also used custom shaders and tools, particularly for the fur, mimicking that of the actual animal. “A lion has so many different types of hair — short hair around the body, the bushy mane, thick eyebrow hairs and whiskers. And every little nuance was recreated and faithfully reproduced,” Legato adds.
MPC artists brought to life dozens and dozens of animals for the film and then generated many more unique variations — from lions to mandrills to hyenas to zebras and more, even birds and bugs. And then the main cast and background animals were placed within a photoreal environment, where they were shot with virtual cameras that mimicked real cameras.
The world comprises expansive, open landscapes. “There were many, many miles of landscapes that were constructed,” says Legato. The filmmakers would film within pockets that were dressed and populated for different scenes, from Pride Rock to the interior of a cave to the savanna to the elephant graveyard — all built in CGI.
“Everything was simulated to be the real thing, so the sum total of the illusion is that it’s all real. And everything supports each other — the grounds, the characters, what they are physically doing. The sum total of that adds up to where your brain just says, ‘OK, this must be real. I’ll stop looking for flaws and will now just watch the story,’” says Legato. “That was the creative intent behind it.”
Virtual Production
All the virtual camera work was accomplished within Unity’s engine, so all the assets were ported in and out of that game engine. “Everyone would then know where our cameras were, what our camera moves were, how we were following the action, our lens choices, where the lights were placed … all those things,” says Legato.
Magnopus created the VR tools specific to the film, which ran on top of Unity to get the various work accomplished, such as the operation of the cameras. “We had a crane, a dolly and other types of cameras encoded so that each physical rig drove its virtual mate in the computer. For instance, we created a dolly and then had a physical dolly with encoders on it, so everything was hand operated, and we had a dolly grip and a camera assistant pulling focus. There was someone operating the cameras, and sometimes there was a crane operator. We did Steadicam as well through an actual Steadicam with a sensor on it to work with OptiTrack [motion capture that was used to track the camera],” explains Legato. “We built a little rig for the Steadicam as well as one for a drone we’d fly around the stage, and we’d create the illusion that it was a helicopter shot while flying around Africa.”
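The idea Legato describes — a hand-operated physical rig whose encoders drive a matching camera in the engine — can be sketched in a few lines. This is a minimal illustration, not the production code; the class names, encoder resolution, and straight-track geometry are all assumptions made for the example.

```python
# Hypothetical sketch: a physical dolly's wheel encoder drives a virtual
# camera's position along a straight track in a game-engine scene.
# All names and numbers here are illustrative, not from the actual rig.

WHEEL_CIRCUMFERENCE_M = 0.30   # meters of track travel per encoder revolution
TICKS_PER_REV = 4096           # assumed encoder resolution

def encoder_to_track_position(ticks: int) -> float:
    """Convert raw encoder ticks to meters traveled along the dolly track."""
    return (ticks / TICKS_PER_REV) * WHEEL_CIRCUMFERENCE_M

class VirtualDollyCamera:
    """Mirrors a hand-operated dolly as a camera moving along a straight track."""

    def __init__(self, track_start, track_direction):
        self.start = track_start          # (x, y, z) world-space track origin
        self.direction = track_direction  # unit vector along the track

    def update(self, encoder_ticks: int):
        """Return the virtual camera's world position for the current encoder reading."""
        d = encoder_to_track_position(encoder_ticks)
        return tuple(s + d * v for s, v in zip(self.start, self.direction))

cam = VirtualDollyCamera((0.0, 1.5, 0.0), (1.0, 0.0, 0.0))
print(cam.update(8192))  # two full wheel revolutions -> (0.6, 1.5, 0.0)
```

The point of the pattern is that the grip pushes real hardware; the software only translates encoder readings into transforms, so the virtual move inherits all the human imperfection of a live-action move.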
Because the area within VR was so vast, a menu system was created so the filmmakers could locate one another within the virtual environment, making location scouting much easier. They also could take snapshots of different areas and angles and share them with the group. “We were standing next to each other [on stage], but within the virtual environment, we could be miles apart and not see each other because we’re maybe behind trees or rocks.”
As Legato points out, the menu tool is pretty robust. “We basically built a game of film production. Everything was customizable,” he says. Using iPads, the group could play the animation. As the camera was in operation, they could stop the animation, wind it backward, speed it forward, shoot it in slow motion or faster motion. “These options were all accessible to us,” he adds.
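The playback controls Legato lists — stop, wind backward, speed up, shoot in slow motion — amount to a scrubbable clock over a baked animation take. The sketch below shows one way such a controller could work; the `TakePlayback` class and its API are hypothetical, written only to make the concept concrete.

```python
# Illustrative sketch of a scrubbable animation clock: the crew can pause,
# reverse, or change the speed of a baked take while filming it with the
# virtual camera. The class name and API are hypothetical.

class TakePlayback:
    def __init__(self, duration_s: float):
        self.duration = duration_s
        self.time = 0.0   # current position within the take, in seconds
        self.rate = 1.0   # 1.0 = real time, 0.25 = slow motion, -1.0 = reverse

    def set_rate(self, rate: float):
        """Change playback speed/direction; 0.0 freezes the animation."""
        self.rate = rate

    def tick(self, dt: float):
        """Advance the take by one wall-clock step, clamped to the take's length."""
        self.time = min(max(self.time + dt * self.rate, 0.0), self.duration)
        return self.time

take = TakePlayback(duration_s=10.0)
take.tick(2.0)          # play 2 s at normal speed -> t = 2.0
take.set_rate(0.5)      # half-speed "slow motion"
take.tick(2.0)          # 2 s of wall time advances the take by 1 s -> t = 3.0
take.set_rate(-1.0)     # wind the animation backward
print(take.tick(1.0))   # -> 2.0
```

Decoupling the take's internal clock from wall-clock time is what lets the same animated performance be reshot at any speed, in either direction, without touching the animation data itself.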
Legato provides the following brief step-by-step overview of how the virtual production occurred. First, the art department created the sets — Africa with the trees, ponds, rivers, mountains, waterfalls and so forth. “Based on the script, you know somewhat where you need to be [in the set],” he says. Production designer James Chinlund would make a composite background, and then they — along with Favreau, Deschanel and animation supervisor Andrew Jones — would go into VR.
“We had built these full-size stationary chess pieces of the animals, and in VR, we’d have these tools that let us grab a lion, for instance, or a meerkat, and position them, and then we’d look through the lens and start from there,” says Legato. “We would either move them by hand or puppeteer a simple walk cycle to get the idea of the blocking.”
Jones and his team would animate that tableau and port it back into the game engine as an animation cycle. “We’d find camera angles and augment them. We’d change some of the animation or slow it down or move the animals in slightly different positions. And then we’d shoot it like it’s on a live-action stage,” explains Legato. “We’d put a dolly track down, cover the action with various types of lenses, create full-coverage film dailies… We could shoot the same scene in as many different angles as we’d wish. We could then play it out to a video deck and start editing it right away.” The shots they liked might get rendered with more light or motion blur, but a lot of the time, they’d go right off the video tap.
Meanwhile, MPC recorded everything the filmmakers did and moved — every leaf, rock, tree, animal. Then, in post, all of that information would be reconverted back into Maya sets and the animation fine-tuned.
“In a nutshell, the filmmakers were imparting a live-action quality to the process — by not faking it, but by actually doing it,” says Legato. “And we still have the flexibility of full CGI.”
The Same, But Different
According to Legato, it did not take the group long to get the hang of working in VR. And the advantages are many — chief among them, time savings when it comes to planning and creating the sequence editorially, and then instantly being able to reshoot or iterate the scene inexpensively. “There is literally no downside to exploring a bold choice or an alternate angle on the concept,” he points out.
Yes, virtual filmmaking is the future, contends Legato.
So, back to the original question: Is The Lion King a VFX film or an animated film? “It’s perhaps a hybrid,” says Legato. “But, if you didn’t know how we did it and if the animals didn’t talk, you’d think it was done in the traditional manner of a live-action film. Which it is, visually speaking. You wouldn’t necessarily describe it as looking like ‘an animated film’ because it doesn’t really look like an animated film, like a Pixar or DreamWorks movie. By labeling it as such, you’re putting it into a hole that it’s not. It’s truly just a movie. How we achieved it is immaterial, as it should be.”
Legato and his colleagues call it “live action,” which it truly is. But some, including the Golden Globes, categorized it as “animation.” (They also called 2015’s The Martian and 2010’s The Tourist “comedies.”)
Call it what you will; the bottom line is that the film is breathtaking and the storytelling is amazing. And the filmmaking is inventive and pushes traditional boundaries, making it difficult to fit into a traditional category. Therefore, “beautiful,” “riveting,” “creative” and “innovative” might be the only descriptions necessary.
Karen Moltenbrey is a veteran writer, covering visual effects and post production.