The Evolution of ICVFX: ILM StageCraft and Dimension

By Beth Marchant

Although it might seem like a newer phenomenon, virtual filmmaking has been quietly idling on the sidelines of film and TV production for a very long time. In tech circles, it has been called many things and taken many forms. It’s been used for everything from simple, one-off shot setups like The Matrix’s “bullet time” sequences to entire films, like James Cameron’s Avatar. Both examples were groundbreaking at the time, technically and culturally. But few really understood what virtual production could mean to the industry as a whole.

When Jon Favreau’s live-action masterpiece The Mandalorian burst into the world with Disney+ in 2019, suddenly everyone and their brother knew what virtual production was and what its broader potential could be.

Even as virtual production is more widely adopted, many misconceptions about the still-evolving platform persist — namely that virtual production only makes sense for those with the biggest budgets. As ILM and Dimension Studios show here, there are many ways to scale a virtual stage. Virtual production is, above all, about flexibility, as suitable and scalable for short-form commercial spots and virtual sports avatars as it is for TV series and big-budget movies.

ILM StageCraft Leads the Way

If you’d asked anyone 10 years ago if visual effects powerhouse Industrial Light & Magic would one day transform TV, they might have answered, “Isn’t making billion-dollar movies enough?” The trailblazing studio founded by George Lucas and now a subsidiary of Disney has largely pioneered and dominated the category in tentpole films for nearly 50 years. ILM is legendary for its R&D, having given us everything from film scanning, motion control and morphing to in-camera effects and Photoshop. (ILM CCO John Knoll was one of Photoshop’s original creators.) Is it any wonder that when the streaming revolution and pandemic upended Hollywood production, ILM was there to meet the new demand for effects-heavy content?

We might not have expected something as mind-blowing as The Mandalorian, the first live-action Star Wars show ever made for the small screen, but the industry was more than ready for a paradigm shift across episodics and film. Once again, ILM ushered in a new era of end-to-end virtual filmmaking that is tailor-made for these times.

StageCraft, the studio’s long-in-development answer to virtual production, is both a proprietary platform and a set of LED stages. On each stage, known as “the volume,” wall-to-wall screens display real-time backgrounds and effects — driven by a game engine — that work in tandem with synchronized practical lights on a practical set. Gone are the goofy, dotted mocap suits and blank, desolate greenscreens. Instead, cast and crew are immersed in the show’s world from start to finish.

Directors love the way they can execute their vision in the volume. “The tech is amazing because even if you were on location, you couldn’t control things to that degree,” says The Mandalorian and Obi-Wan Kenobi director Deborah Chow, the first woman director in Star Wars franchise history. Cinematographers love the continuity and the quality of light the screens and practical lights emit. And actors draw inspiration from their immersive surroundings without ever leaving the soundstage.

Ian Milham

Ian Milham, StageCraft’s virtual production supervisor, who joined ILM in the summer of 2018, spent much of his early career steeped in real-time, game-engine-driven environments. A video game developer and matte painter by training, he earned BAFTA and Visual Effects Society nominations for his art direction of the seminal survival-horror series Dead Space and was an artist for two Star Wars games in the early aughts. “I spent a long time in the world of real-time environments,” he says. “A lot of what you’re doing on-set is having to sort of translate and navigate a 3D world that is actually presenting in two dimensions. When you’re standing on-set, people tend to think of this giant thing as a big screen. But of course, it’s more like a portal or a window to a full-3D world.”
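
That “window” idea has a concrete graphics underpinning: if you know where a screen’s corners sit in physical space and where the tracked camera is, you can build an off-axis projection so the wall shows exactly what a real window onto the 3D world would show from that vantage point. The sketch below, written in Python with NumPy and following Robert Kooima’s well-known “generalized perspective projection” formulation, is purely illustrative; it is not StageCraft code, and every coordinate in it is invented.

```python
# Illustrative off-axis ("window") projection: given a flat screen's corner
# positions and a tracked eye/camera position, compute an OpenGL-style
# frustum so the screen behaves like a window into a 3D scene.
# Follows Robert Kooima's "Generalized Perspective Projection" note;
# all coordinates are invented for this example.
import numpy as np

def window_projection(pa, pb, pc, pe, near=0.1, far=1000.0):
    """pa, pb, pc: screen lower-left, lower-right, upper-left corners
    (world space, meters). pe: tracked eye/camera position."""
    pa, pb, pc, pe = (np.asarray(p, float) for p in (pa, pb, pc, pe))

    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal

    va, vb, vc = pa - pe, pb - pe, pc - pe            # eye-to-corner vectors
    d = -np.dot(va, vn)                               # eye-to-screen distance

    # Near-plane frustum extents; asymmetric whenever the camera is off-center.
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard glFrustum-style projection matrix built from those extents.
    # (A full pipeline would also rotate into the screen's basis and
    # translate the eye to the origin, per Kooima's note.)
    return np.array([
        [2*near/(r-l), 0.0,          (r+l)/(r-l),            0.0],
        [0.0,          2*near/(t-b), (t+b)/(t-b),            0.0],
        [0.0,          0.0,          -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0,          0.0,          -1.0,                   0.0],
    ])

# A 4m-wide, 2.25m-tall wall section, camera 1.5 m back and 0.5 m left of center:
P = window_projection(pa=(-2.0, 0.0, 0.0), pb=(2.0, 0.0, 0.0),
                      pc=(-2.0, 2.25, 0.0), pe=(-0.5, 1.0, 1.5))
```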

Landing at ILM right before The Mandalorian came out was “pretty amazing timing,” he adds. “People were really excited, not just about the show itself but how it was done. And then production paused on pretty much everything due to the COVID lockdown, so many people had to improvise. So we reached out to various productions and started to share how this technology might help them.”

Milham and his colleagues still had to convince them that StageCraft wasn’t just reserved for epic productions in the Star Wars and Marvel universes. “That’s the beauty of a system like this,” he says. “You can do a bunch of different shows on a bunch of different levels this way. If a production couldn’t fly to New York to finish a movie on location because the city was on lockdown, suddenly there was a way to do it safely in Los Angeles or San Francisco.”

Driving that 3D world is ILM’s proprietary game engine, a successor to the Epic Games Unreal Engine used on Season 1 of The Mandalorian. With it, Milham helps directors and DPs see what they need to see to make decisions swiftly in the moment. “What’s been most helpful to me is to be able to translate what the DP or director wants by mentally navigating for them in three-dimensional space. A DP will say, ‘Hey, can you move that left?’ or ‘Can we do that?’ Well, sometimes you don’t actually want to move it left. It just needs to feel like it’s being moved left.”

The conversation on-set still goes both ways. “All the questions, ideas and requests from directors, DPs and production designers really helped us figure out how to do things on the LED,” he says. “After Season 1 of The Mandalorian, we wrote a bunch of our own tools as well as our own renderer for them, mostly based on filmmaker requests about how it could better align with traditional production workflows and also how it could be more robust and more filmic. Today, it is much, much more powerful and completely rock-stable.”

With many more shows under its belt, ILM is now putting all that collaborative experience in the volume into practice. “If you look at more recent StageCraft work we’ve done, there are, of course, lots of technological advances,” says Milham. “But what you are really starting to see are a lot more creative visual motifs and the more seamless way we’ve learned to use StageCraft. That’s more ambitious, but now we’re in concert with DPs, art directors, production designers and everybody else on-set. It’s not just a background anymore.”

Though many still find the volume daunting at first, most creatives and crew learn to love it very quickly. “There’s still that immediate, natural reaction when a DP or production designer who learned to do things a certain way arrives in the volume and says, ‘Oh my God, what is this? How the heck am I going to learn to work in a whole new way?’ They think they have to learn a bunch of new technical skills and make a bunch of compromises,” he says. “What I try to tell them is, don’t do the things that are going to force a lot of compromises. Instead, think of all the compromises you don’t have to make by doing some part of your show this way. Use it for what it’s good at. Then it’s suddenly about possibilities and preserving your creative vision, and that’s exciting.”

Once filmmakers are open to virtual production’s new ideas, Milham says the sky’s the limit. “When I’m met with any skepticism or pushback, I always say, ‘Think of the things that you didn’t think would even be possible on location, like literally breaking with reality. StageCraft is just another creative expression tool that lets you manipulate the physics of light and do other really cool stuff that you never could do on location or on a greenscreen stage.’”

Mandalorian

Footage shot in real locations can also be brought back into the volume and manipulated while filming a scene, even in complex shots like time lapses. “Normally, after shooting your time lapse on location, you would be tied to it because you’re shooting that time lapse along with your foreground elements and everything else at the same time. But here you could shoot that time lapse, play it back on the stage as that location — let’s say a sunset or something like that — and then move your actors and the camera at whatever speed you wanted, which doesn’t have to have anything to do with the speed at which you shot that original time lapse. And it’ll all still work.”

Filmmakers can also animate the stage within that world, he says. “If you wanted camera travel and time of day to be separate ideas, you can literally play with the time scales of either in a photorealistic way,” Milham says. “We’ve already done that. And that’s not just to get to off-speed frame rates or things like that. You can layer these mind-bending ideas on top of each other in a way that, if you were actually doing it on location, you would be locked into whatever choices you’re making at that moment. You’re doing the layering in real time. You’re not handing it off to post, where they have to spend so much time layering it together at the back end. We can brighten up a section of the sky on the screen by making a cloud transparent or even by grabbing a whole different section of sky, almost as if you’re in post with Nuke or something. The difference is you’re doing it within the world created on-set.”
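
Strip away the specifics and what Milham describes is two independent clocks: one for the plate playing on the wall and one for the live camera and actors. Here is a toy sketch of that decoupling, with names and rates invented for illustration rather than drawn from ILM’s pipeline:

```python
# Toy sketch of decoupled time scales: the background plate and the live
# camera each advance on their own clock. All names and rates here are
# invented for illustration; this is not ILM's implementation.

PLATE_FPS = 24        # rate the time-lapse plate plays back at
PLATE_FRAMES = 2400   # total frames in the plate (a whole sunset, say)

def plate_frame(scene_seconds: float, plate_speed: float) -> int:
    """Which time-lapse frame the wall shows at a given moment of the live
    take. plate_speed=1.0 plays in real time; 0.1 stretches the sunset 10x;
    0.0 freezes it while actors and camera keep moving."""
    frame = int(scene_seconds * PLATE_FPS * plate_speed)
    return min(frame, PLATE_FRAMES - 1)

def camera_position(scene_seconds: float, speed_mps: float) -> float:
    """Camera dolly position (meters) along its track, on its own clock."""
    return scene_seconds * speed_mps

# Ten seconds into the take, the camera has traveled 5 m, but the sunset
# plate has advanced only 24 frames because it is running at 0.1x speed.
print(camera_position(10.0, 0.5), plate_frame(10.0, 0.1))
```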

Obi-Wan Kenobi

On location during traditional production, if a scene was too bright in natural light, you might add a big silk or scrim to compensate. “We can actually make the clouds overhead denser,” says Milham, “so you’re flagging off with an actual cloud instead of some arbitrary, man-made material you had to add to the scene.”

The best part, he says, is watching the sense of relief flood over the set when the cast and crew realize how much better it is to film in the volume than on a green or bluescreen. “In the volume, they are in the thick of it,” he says. “They no longer have to pay that imagination tax of having to conjure whatever is going on around them. On greenscreen shoots, the director might say, ‘Imagine there’s a boat out there.’ But what kind of boat is it? You would get a lot of people telling you they could picture what was happening. But not all of them were imagining the same thing at the same time, and the eyelines were never right.”

Obi-Wan Kenobi

On a StageCraft set, what you see is what you get. “We can put a boat up on the screen, and now the actors are reacting to the size, the shape, the glamor and gleaming sides as it cuts across the water,” he says. “They all get it instantly, and it shows up in their body language. The director doesn’t have to explain it to everybody, and they don’t have to ask a lot of follow-up questions, either. They’re like, ‘Oh, I see; this is an aspirational boat. Oooo, who is that?’ They’re all on the same page.”

Long-form narrative isn’t the only thing being shot on the five StageCraft stages ILM runs simultaneously in Los Angeles, London and Vancouver. Milham says the stages are also now getting booked often for one-day commercial shoots and TV segments. “It’s so much more scalable than people think,” he says. “End-to-end virtual production is complicated. I like to compare it to aerial photography. We’re still in the helicopter stage. To fly that helicopter, you need expensive, specific equipment and someone who knows when it’s too windy. It’s complicated. It is getting more democratic, for sure, but we aren’t quite at the drone stage yet.”

Obi-Wan Kenobi

In these early days, Milham sees a tendency toward hyperbole, with some overselling the technology or the concept just to be part of a larger trend. “Some folks out there want to jump into the game and are overselling facilities but just don’t have the experience yet to do this well,” he says. “To those who want to start making content virtually, do your homework and work with people who have a record of having pulled this off, because it’s hard! But there’s also no need to be too intimidated by it. It’s an amazing new tool that is great at solving some complex problems in a way that can be a lifesaver. That doesn’t mean it’s the best thing for everything.”

But despite its challenges, virtual production does have one very big upside, Milham says. To those who’ve already experienced it, it’s an undeniably energizing way to work. “It’s more like the kind of moviemaking that we all got in this business to do,” he says. “It’s definitely bringing back so much of that ‘Hey gang: let’s make a movie!’ energy, where we’re suddenly all kids in our backyards again making our creative visions come true.”

Dimension’s Jim Geduldick

Jim Geduldick

Cinematographer Jim Geduldick has always gravitated toward bleeding-edge technology like a moth to a flame. But since becoming SVP of virtual production at Dimension North America, the Los Angeles branch of London’s Dimension Studio, he’s become an expert on all things virtual.

Known mostly for its volumetrically captured virtual humans, like the hologram of a golfer that Sky Sports broadcast live from the British Open a few years ago, Dimension is well-versed in creating all kinds of virtual worlds for film, TV, VR, augmented reality and the metaverse.

Disney’s Pinocchio

On the virtual production side of the business, Geduldick has had his hand in most of the studio’s real-time production for the past 18 months, including serving as virtual production supervisor on Robert Zemeckis’ upcoming live-action version of Pinocchio. “We’ve been really, really busy lately with a lot of tentpole stuff,” he says. While he can’t say much about the Zemeckis film, he can say that the techniques they used to capture and animate the action are game-changing.

“As a whole, I think every project we do has pioneered something new,” says Geduldick, who also has been deep-diving into all things AI in his spare time. “I know for a fact Pinocchio did. That’s the goal every time: We’re trying to push the envelope. On every set, we have to ask ourselves how to refine our process and how to take our lessons learned and apply them. After every project, we do an internal debrief and ask ourselves, ‘What can we change? What can we make better? What worked, and what didn’t work?’ Then we build upon that.”

Where does he see virtual filmmaking headed next? “Now that the industry has obviously taken notice, virtual production is accelerating so fast,” he says. “We’ve moved way past that moment when The Mandalorian made us sit up and pay attention. Now all kinds of storytellers beyond film and TV, including non-entertainment-related corporate brands, are looking seriously at real-time technologies. It’s so much more than just LED walls.”

Once you move beyond the term’s narrow definition, you can start to redefine virtual production in ways that suit your project’s needs, he adds. “Virtual production could encapsulate volumetric production, or it could encapsulate mocap. From our perspective at Dimension, it’s really a mashup of film, games and broadcast technology. You can still do virtual production without an LED wall and as a component of a greenscreen shoot.”

Geduldick says Epic Games, the creator of Unreal Engine, has “stepped up like no one else has in terms of support, education and just putting the tools into the hands of the right creatives. It’s still the most popular game engine and the fullest-featured.” Dimension was one of Epic’s early partners, and Geduldick has been an Unreal Fellow.

The Virtual Production Innovation Project

“The terminology is confusing for those who don’t come from visual effects or games or from a CG background, so education is key at every step of the process,” he says. “As a virtual production supervisor, a job that’s part VFX and part DP, I feel like I’m always playing the role of educator to help break down the technical terms for people and get them back to work. I’m always asking myself, ‘How can I help another DP or VFX supe or production designer or writer or director?’”

The first step, he says, is to make the tools as transparent as possible. “We need to get out of our own way and let the tools get out of our way, and the only way we can do this is with a full team on-set. It doesn’t get done without teams. That’s the real benefit of these real-time productions as a whole: They open up so many more doors for collaboration and exploration. Creatively, it opens a million doors…infinite doors, really. That’s so cool, but to some people, it can also be overwhelming.”

Virtual production can also be different things to different people, depending on your part of the production pipeline. “As a DP, I look at all the different things real-time technology can help me do,” he says. “For example, I can jump into the engine and set up my own cameras and my own lighting, and I can basically visualize my storyboards in real time. As an editor, you could use the game engine to string out a really nice 3D animatic really quickly. If the editor needs to reverse the same shot, I can just rotate the entire world, render out another camera or another perspective, and add a bunch of things in there. In this way, the game engine or whatever real-time engine you’re using becomes the hub that you plug all your different projects into.”
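
Geduldick doesn’t detail Dimension’s setup, but in an Unreal-based pipeline, “jump into the engine and set up my own cameras” can literally be a few lines of editor scripting. Below is a minimal sketch using Unreal’s Python API; it assumes the Python Editor Script Plugin is enabled, and the placement values are invented for the example.

```python
# Minimal sketch: adding another cine camera to an Unreal scene via the
# editor's Python API, e.g. to render a second perspective of the same
# world. Assumes the Python Editor Script Plugin; coordinates are invented.
import unreal

# Spawn a cine camera 5 m back from the origin and 1.6 m up (units are cm).
location = unreal.Vector(-500.0, 0.0, 160.0)
rotation = unreal.Rotator(0.0, 0.0, 0.0)
camera = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor, location, rotation)

# Give it a 35 mm lens; from here it can drive a Sequencer shot or a
# render pass like any hand-placed camera.
cam = camera.get_cine_camera_component()
cam.set_editor_property("current_focal_length", 35.0)
```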

But how do you draw in traditionally trained filmmakers who still haven’t used it? “There are DPs that I work with, and even some visual effects folks, who have yet to work on an LED volume,” Geduldick says. “For those who are adventurous, whether they are DPs, production designers, producers or whatever, my goal is always just to get them over to the stage and show them what an LED volume really looks like. The quickest way to sell people on an idea is to show them something visually.”

Dimension

Budgets remain a barrier to entry, however. “For the foreseeable future, there will still be a large cost associated with virtual production because if you’re shooting on an LED volume, if you’re building one, your build costs are definitely up there,” he says. “Do you need to shoot a guy in a reflective helmet like in The Mandalorian? If not, then you don’t need an LED ceiling like ILM’s $250 million StageCraft. You could still use practical lighting, load up your SkyPanels or videos, whatever your lighting is.” The costs can then scale down significantly, he says, from project to project.

“These days, we’ve got previz rendering at a level that is almost 90% of a finished shot by the time it even gets to an LED volume,” he says. “That’s the gateway to these tools. We will never get to the point where we can push a one-button renderer and have it look as good as The Mandalorian or Pinocchio or Star Trek. But with a seasoned team — and in this business, you live or die by the quality of your team — you’re all working together to get the highest render out there.”


Beth Marchant writes about entertainment technology and craft for The Los Angeles Times, IndieWire and other industry outlets. Through her content consultancy HudsonLine Media + Marketing, she is a ghostwriter for Fortune 500 companies including FedEx and Verizon. Follow her on Twitter and Instagram @bethmarchant.