DPs and VFX: David Stump and Ollie Rankin

By Beth Marchant

Anyone working in virtual production today is, by default, a problem-solver. Adept at pushing boundaries and fluent in an ever-expanding set of technologies and techniques, these on-set magicians are also likely to see possibilities from behind the lens that most have never considered before.

To many, however, virtual production and virtual and mixed reality are still the Wild West, a place where only those with the biggest budgets and the most technical savvy dare to go. But that’s not how the visionary developers who are mapping out its side streets and real estate see it. They are turning the digital desert into thriving business districts that everyone can visit, a place where most of the creative work and otherworldly effects can now be done in preproduction instead of in post.

How are they making it happen? We spoke with two early adopters of virtual techniques — David Stump, ASC, and Pansensory Interactive’s Ollie Rankin — about how they’ve leveraged their wide-ranging skills and rock-star visual effects experience on films like the X-Men and The Lord of the Rings franchises to push visual storytelling in bold new directions at every scale.

L-R: David Stump and author Neil Gaiman

David Stump, ASC: A Virtual Production Mix Master
Virtual production is hardly a new concept to cinematographer David Stump. A technologist, visual effects supervisor and DP, Stump began experimenting with real-time visual effects and camera tracking long before LED screens ever appeared on-set. In the early aughts, he shot with a zero-gravity virtual crane alongside visual effects pioneer Doug Trumbull (2001: A Space Odyssey, Silent Running, Close Encounters of the Third Kind). In 2003, he made an all-virtual film with Grease director Randal Kleiser. “We were generating Maya backgrounds and driving them with camera data from encoders through a Kuper system,” he recalls. “I’ve used virtual assistants on and off throughout my career.”

Stump has also worn many hats during that career, sometimes on a single show. While his credits run long and deep (Beetlejuice, Mars Attacks!, X-Men, X2: X-Men United), he resists categorization. “Hollywood dearly loves a pigeonhole,” he admits. “I guess I’m a little bit of an enigma because I discovered in the middle of my career that if I didn’t tell anybody what I did, I was free to do anything I wanted.”

On some shows, he’s literally done everything except direct. “There was one movie I did three years ago where I helped the director to write the script. Then I shot the movie. And as I shot it, I shot all the visual effects work.” He also supervised the compositing, conformed the film, then moved into the color suite to finish it. “I even did the sound, then married them up and delivered the DCP.”

All this makes him an indispensable asset on any virtual set or volume, still very much terra incognita for the vast majority of the industry. “Virtual production is simply enabling technology,” he says. “But it took forever for people to realize that. The metaphor that I like to use is the walkie talkie. Even the lowliest PA on-set now has a headset, but before this handy military technology trickled down to the film industry, it was nearly impossible to cue anybody without a lot of extra effort. Think about it: On Lawrence of Arabia, [David] Lean’s team had to resort to semaphore signals from mountaintops to cue and film those battle scenes!”

American Gods

Most of his recent work has involved the artful manipulation of reality to create an on-set environment that just wouldn’t be possible on location. For a sequence he supervised on American Gods, the Starz television series based on the Neil Gaiman novel, Stump combined virtual assistants, greenscreen and practical lighting to suspend the rising sun so cast and crew could bask in the dawn glow for the entirety of the scene. “Adam Kane, who was directing that episode, wanted to do an entire four-minute sequence with the sun just starting to peek over the horizon,” he says. “To do that on location, with the real sun exactly on the horizon for that long, was physically never going to happen.”

Stump used a Lightcraft Technology Previzion virtual studio system that blends real-time camera tracking, keying and compositing to achieve what Kane (Star Trek: Discovery) envisioned. The proto VP system, developed by Eliot Mack and Phil Mass and sourced from Stargate Studios’ Sam Nicholson, helped bring more sophisticated effects to the small screen on shows like Once Upon a Time and Pan Am.

American Gods

For the upcoming sci-fi epic Gods of Mars, Stump is working the way most who create films in virtual production do these days: inside a wall-to-wall LED volume on a soundstage. With start-to-finish virtual production driven by Epic Games’ Unreal Engine, the goal, however, is to deliver the film’s tentpole-worthy effects at well under the stratospheric budgets previously needed to produce them. “Gods of Mars is pretty much a basic game engine-to-LED screen show because we’re generating backgrounds almost exclusively in the engine,” says Stump, who is shooting plates in 12K with an array of Blackmagic URSA Mini Pro cameras.

The workflow he’s most excited about, though, is one that has never been done before: shooting in wide-screen anamorphic onto LED screens inside a massive water tank.

The upcoming World War II action drama is certainly worthy of the sweeping, wide-screen treatment. Il Comandante (The Commander), an Italian/French co-production directed by Edoardo De Angelis, tells the true and unlikely story of an Italian submarine captain who risks his own life and that of his crew to rescue the 26 survivors of a Belgian merchant ship that they have just sunk during battle. The commander became an Italian national hero after ferrying the men to safety on the deck of his submarine.

The Commander previz

Even though the movie is still in preproduction, buzz began to build late last year when its workflow was discussed at the Real-Time Conference. “In technical circles it’s now simply referred to as ‘the Comandante workflow,’” Stump says. He and VFX designer Kevin Tod Haug are wrangling the effects for the entire film. “It’s a pretty complex project. We have a pretty large Italian production team (it will shoot in either Toronto or Rome), and several other VFX teams are spread out all over the world.”

After realizing the cold waters of the North Atlantic presented too many challenges to shoot the film practically on location, the filmmakers considered their soundstage options. “We had to figure out how we were going to do the deck of a submarine at night in open water,” Stump says. “That’s what led us down the path to an LED wall. Then we had to figure out how to track everything, especially with anamorphic lenses. And once we figured that out, we also ironed out all of the problems of reading lens metadata in real time from anamorphic lenses like Cooke and Zeiss.”

The two cardinal points inside the anamorphic lens, one horizontal and the other vertical, Stump explains, are what really complicate things in a virtual setup. “Those two raytrace crosses, as opposed to the single point in the middle of a standard lens, just throw a spanner in the works,” he says. “With a standard lens, you simply center the camera on that point, and everything stays where it needs to be from shot to shot when you pan and tilt, avoiding parallax errors.”
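
A rough numerical sketch makes the problem concrete. The Python snippet below is an illustration only, not anything from the Comandante tracking pipeline: it models an anamorphic lens as two pinholes sitting at different depths on the optical axis and compares it with a spherical lens that has a single pupil. The focal lengths, pupil offsets and test points are all invented.

```python
# Illustrative sketch only (nothing from the Comandante pipeline): why an anamorphic
# lens has no single no-parallax point. We model the lens as two pinholes at
# different depths on the optical axis, one governing horizontal rays and one
# governing vertical rays. All numbers are invented.
import numpy as np

F_H, F_V = 50.0, 40.0  # assumed effective focal lengths (mm), horizontal / vertical

def project(p, pupil_h, pupil_v):
    """Project a camera-space point (x, y, z in mm, z forward) to image-plane mm,
    using separate pinhole depths for the horizontal and vertical axes."""
    x, y, z = p
    return np.array([F_H * x / (z - pupil_h), F_V * y / (z - pupil_v)])

def pan_about(p, pivot_z, degrees):
    """Rotate a point about a vertical axis passing through (0, 0, pivot_z)."""
    a = np.radians(degrees)
    x, y, z = p
    dz = z - pivot_z
    return np.array([x * np.cos(a) + dz * np.sin(a), y,
                     -x * np.sin(a) + dz * np.cos(a) + pivot_z])

for name, (pupil_h, pupil_v) in {"spherical": (0.0, 0.0),
                                 "anamorphic": (0.0, 25.0)}.items():
    # Build a near/far pair that starts perfectly aligned through both pupils.
    near = np.array([100.0, 80.0, 1000.0])
    far_z = 5000.0
    far = np.array([near[0] * (far_z - pupil_h) / (near[2] - pupil_h),
                    near[1] * (far_z - pupil_v) / (near[2] - pupil_v), far_z])

    # Pan 10 degrees about the horizontal-ray pupil, then re-project both points.
    n = project(pan_about(near, pupil_h, 10.0), pupil_h, pupil_v)
    f = project(pan_about(far, pupil_h, 10.0), pupil_h, pupil_v)
    print(f"{name:10s} residual parallax (u, v) after pan: {n - f}")
```

The spherical case comes back aligned in both axes after the pan; the anamorphic case stays locked horizontally but drifts vertically, which is exactly the error a tracking and lens solve has to account for.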

The Commander previz

With help from engineers at Cooke in Leicester, England, Stump and Haug could help usher in a new era in photoreal visual effects on-set. “No one has worked out yet how to use lens distortion and lens shading in CGI compositing, so the metadata out of these lenses is kind of the holy grail of visual effects,” he says.
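
What using that metadata might look like, in heavily simplified form: the sketch below applies two hypothetical per-frame lens values, a single radial distortion coefficient and a cos^4 shading falloff, to an undistorted CG frame so it can sit more convincingly on the photographed plate. The function name and numbers are assumptions, and real manufacturer lens models carry far more than one term.

```python
# Purely illustrative: applying per-frame lens metadata (a single radial distortion
# coefficient and a cos^4 shading falloff) to an undistorted CG frame so it sits
# more convincingly on the photographed plate. The function name and values are
# hypothetical; real manufacturer lens models are far richer than this.
import numpy as np

def apply_lens_metadata(cg, k1, focal_mm, sensor_w_mm):
    """Warp and shade an undistorted CG frame (H x W x 3 float image)."""
    h, w = cg.shape[:2]
    # Pixel grid in centered coordinates, normalized by image width.
    y, x = np.mgrid[0:h, 0:w].astype(np.float64)
    xn = (x - w / 2) / w
    yn = (y - h / 2) / w
    r2 = xn**2 + yn**2

    # One-term radial distortion: sample the undistorted render at the displaced
    # position for every output pixel (nearest-neighbor for brevity).
    xs = xn * (1 + k1 * r2)
    ys = yn * (1 + k1 * r2)
    src_x = np.clip(np.round(xs * w + w / 2).astype(int), 0, w - 1)
    src_y = np.clip(np.round(ys * w + h / 2).astype(int), 0, h - 1)
    warped = cg[src_y, src_x]

    # Lens shading: simple cos^4 illumination falloff with field angle.
    field_angle = np.arctan(np.sqrt(r2) * sensor_w_mm / focal_mm)
    return warped * (np.cos(field_angle) ** 4)[..., None]

# Hypothetical metadata values, as they might arrive with a frame.
frame = apply_lens_metadata(np.ones((540, 960, 3)), k1=-0.08,
                            focal_mm=50.0, sensor_w_mm=27.0)
```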

With an expanded version of Nuke, Blackmagic 12K cameras and a lower-res LED wall that can withstand splashes of water, their near-real-time workflow is both powerful and flexible. “We’re [putting] what we shoot on the LED and acquiring proxies,” he says. “Then, in near-real time near the set, we are going to hand off those proxies, along with the data from the game engine and the camera tracking system, and flow them through Nuke.” The Foundry is writing an AI application that can distinguish between what is on the LED wall and what is in the foreground. “Once you can do this, you can separate what’s on the wall, which is lighting your scene and your actors and giving them beautiful backlight and interactive light as all the bombs burst, from what’s happening in the foreground. And you can replace that in near-real time in Nuke, using their AI algorithm, and composite it directly to the output of the game engine or the photographic plate that you used.”
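
Stripped to its arithmetic, the recomposite step Stump describes is a matte operation once the AI separation exists. The following sketch is a stand-in, with dummy arrays for the proxy plate, the foreground matte (treated as already produced by some segmenter) and the refreshed engine render; it is not The Foundry’s application, just the compositing math underneath it.

```python
# A bare-bones stand-in for the recomposite step: once some segmenter has produced
# a foreground matte for the proxy frame, keeping the photographed foreground and
# swapping the LED-wall region for a fresh engine render is a single matte operation.
# All names and arrays here are dummies, not The Foundry's tool.
import numpy as np

def recomposite(plate, fg_matte, new_background):
    """plate, new_background: H x W x 3 float arrays; fg_matte: H x W in 0..1."""
    a = fg_matte[..., None]                 # broadcast the matte across RGB
    return plate * a + new_background * (1.0 - a)

h, w = 1080, 1920
plate = np.random.rand(h, w, 3)             # stands in for the camera proxy
new_bg = np.random.rand(h, w, 3)            # stands in for the re-rendered background
fg_matte = np.zeros((h, w))
fg_matte[400:800, 600:1300] = 1.0           # pretend this is the AI's actor matte
frame = recomposite(plate, fg_matte, new_bg)
```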

That’s not all metadata can do. During crucial battle scenes, even the lights will be perfectly in sync. “If there’s machine gunfire and tracer rounds going off, and there’s an explosion up there, we can take that and not just put it up on the screen, but we can DMX that to lights that actually augment the effect on stage. DMX is just metadata over Wi-Fi, after all.”
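
DMX over a network is well-trodden ground; Art-Net is one common way to carry DMX channel values in UDP packets, and a cue like the one Stump describes could be fired whenever the engine reports an explosion or a muzzle flash. The sketch below sends a single ArtDmx frame; the node IP, universe and channel layout are invented for illustration, and the production’s actual lighting protocol is not specified in the article.

```python
# Sketch of DMX over an IP network, here via Art-Net, one common way to carry DMX
# channel values in UDP packets. The node address, universe and channel mapping are
# invented; the production's actual lighting protocol is not specified in the article.
import socket

def send_artnet_dmx(channels, universe=0, sequence=1, node_ip="10.0.0.20"):
    """Send a list of 0-255 DMX channel values to an Art-Net node on UDP port 6454."""
    data = bytes(channels)
    if len(data) % 2:                        # ArtDmx payloads must be an even length
        data += b"\x00"
    packet = (
        b"Art-Net\x00"                       # protocol ID
        + (0x5000).to_bytes(2, "little")     # OpCode: ArtDmx
        + (14).to_bytes(2, "big")            # protocol version
        + bytes([sequence & 0xFF, 0])        # sequence, physical port
        + bytes([universe & 0xFF, (universe >> 8) & 0xFF])  # SubUni, Net
        + len(data).to_bytes(2, "big")       # channel count
        + data
    )
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.sendto(packet, (node_ip, 6454))

# e.g. when the engine reports a muzzle flash, spike channel 1 (a strobe's dimmer).
send_artnet_dmx([255, 0, 0, 0])
```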

The Commander previz

Stump has waited a long time for the day when these kinds of workflows would be taken seriously on-set. “I’ve been trying to push that rock up the hill for over 20 years,” he says. “Even as recently as three years ago, most camera people would simply shrug and say, ‘What do we need data for?’ But thanks to the explosion of virtual production and on-set effects driven by game engines, metadata went from zero to hero, becoming the most important thing on everybody’s radar. You can’t synchronize the LED wall and synchronize your movements in the virtual world without camera metadata. You’ve got to output data from the camera and from a tracking system to drive the engine.”
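
The exact wire format varies by tracking vendor and engine, but the shape of that data stream is simple. The sketch below is a generic illustration, not Unreal’s Live Link protocol or any specific tracker’s output: per-frame pose and lens values, timestamped and pushed over UDP to whatever is rendering the wall. The host, port, function and field names are assumptions.

```python
# Generic illustration only, not Unreal's Live Link or any vendor's tracking format:
# per-frame camera pose and lens metadata bundled as JSON and pushed over UDP to
# whatever is driving the wall. Host, port and field names are assumptions.
import json
import socket
import time

ENGINE_ADDR = ("10.0.0.50", 9000)   # hypothetical render-node host and port

def send_camera_frame(sock, frame, pose, lens):
    """Pack one frame of tracking and lens metadata and send it as a UDP datagram."""
    payload = {
        "frame": frame,
        "timestamp": time.time(),
        "pose": pose,                # position (m) and rotation (deg) from the tracker
        "lens": lens,                # focal length, focus, iris, anamorphic squeeze...
    }
    sock.sendto(json.dumps(payload).encode("utf-8"), ENGINE_ADDR)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_camera_frame(
    sock, 1001,
    pose={"position": [0.0, 1.6, -3.2], "rotation": [0.0, 12.5, 0.0]},
    lens={"focal_mm": 50.0, "focus_m": 4.2, "iris": 2.8, "squeeze": 2.0},
)
```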

Although the hurdles in Il Comandante’s virtual workflow are extreme and numerous, that also inspires him. “I’m really looking forward to working on this show the most,” Stump says. “It’s going to be such a juggling act to make everything work: getting the data from the anamorphic lenses, tracking the camera and the camera system, and doing it all across a submarine deck floating in a water tank. Plus, we have to balance between shooting it against an LED wall and shooting it against a bluescreen or greenscreen.”

It really is the best of both worlds, he concludes. “This workflow proves that an LED wall doesn’t necessarily have to be final pixels. Even if you’ve got what you want on the LED wall, you can still go back and noodle it in near-real time. With this workflow, if you’re sitting there with a director, and he’s watching it and saying, ‘I’m not really getting what I want right here or over there,’ then you can just hand it to your Nuke compositor sitting next to the brain bar, and he can start churning out the near-real-time AI recomposite.”

Pansensory’s Ollie Rankin: Can VR Change the World?
Ollie Rankin’s introduction to virtual filmmaking began long ago and far, far away, in a place most have only dreamed of going: the pathbreaking creative studios of Peter Jackson’s Weta Digital.

Ollie Rankin (glasses and beard) on the set of Downloaded

Rankin, a director of virtual reality who is also a political poet, novelist, musical comedian and dedicated humanist, has amassed his share of knock-out visual effects credits ever since, including The Lord of the Rings trilogy, a Matrix sequel, a Harry Potter film and Alice Through the Looking Glass. But VR and mixed reality production are where his heart is. Now CEO and creative director of the VR/mixed reality studio Pansensory in Vancouver, he’s creating content that reflects all the many facets of his “long and winding” story.

In his native New Zealand, Rankin first taught himself — pre-internet — how to program on an Atari. At university in the mid-1980s, he discovered artificial intelligence, then an emerging field of research. “Nobody in the wider world had heard of it yet,” he says, “but I got really, really engrossed in it. I knew then it had the potential to help figure out better ways for computers to do things for us and would also let us study our own intelligence through computerized simulations.”

Orthogonal

When Jackson revealed he was working on The Lord of the Rings trilogy, Rankin had an epiphany. “‘Well, $#*#,’ I thought. ‘I finally know what I want to be when I grow up!’” When he interviewed for the job at Weta, he didn’t even bring up his experience in AI. “Why would I? At that time, that didn’t have anything to do with making movies.”

Little did he know that someone on Peter Jackson’s team had already urged the director to build his own AI system to expedite crowd and battle scenes. “Once The Lord of the Rings got greenlit, this new system for giving pseudo-artificial intelligence to orcs and elves and pitting them against each other in these giant, simulated battles needed to be built,” he says.

Weta put Rankin on the case. “I started on the same day as principal photography and spent three years co-inventing the workflows that build these brains for orcs and elves and that choreograph the trilogy’s massive battle scenes.”

Orthogonal

There was also a little bit of virtual production happening way back then, which won’t surprise anyone who has worked with a visionary like Jackson. “This is long before CG became the hammer that makes everything else look like a nail,” Rankin says. “People hadn’t yet realized they could do all of the backgrounds in CG, so they were still building these gigantic miniature sets that Jackson could fly through with a couple of lipstick cameras. They were seriously big camera moves through these little, tiny miniatures.”

As motion capture technology caught on, the team used it to further choreograph the film’s imaginary characters. “We did a lot of motion capture for all of the different movement styles of these orcs and elves, as well as for other creatures,” he says. “When we realized you could attach motion capture trackers to a device that would act just like a camera and fly through the CG environments, we were essentially doing virtual production-style camera moves.”

Orthogonal

He’s been waiting to direct VR ever since he first encountered “SimStim” in a William Gibson novel and saw Tron on the big screen. “Even as I worked my way up in visual effects, I was working hand in hand with the directors and editors in my supervision roles, and that trained me as a director,” he says. In between film jobs, he pursued his own projects, which included fine art photography, developing an iPhone app, writing a novel and launching a musical comedy career. “But when the Oculus Kickstarter happened, I realized that VR, which I’d been pining for the whole time, was finally going to become a viable medium for storytelling, but it wasn’t quite ready yet.”

His app, now obsolete with the advent of hand controllers, let you use your smartphone to control what you saw while wearing a headset. “I learned a lot through that, and it set me up for my next job, which was volumetric performance capture, developing technology for basically filming people as three-dimensional holograms.” Those proof-of-concept short films illustrating the technology also officially launched the latest stage of his career.

Downloaded

His idea for an interactive VR film, Downloaded, shown as a teaser at a SIGGRAPH roundtable, was another hit, premiering the next year at the Venice Biennale and on the festival circuit. While the hybrid live-action and VR production used elements of virtual production and the Unity game engine, it was inspired directly by the virtual worlds in Tron and those Gibson stories Rankin was obsessed with as a kid.

“The premise is quite like Tron in that there’s a technology, developed by a crazy community of hackers, that lets you digitize a human consciousness and download it to a computer,” he says. “The experience starts with you building the machine and testing it out. But when it digitizes your consciousness and downloads it to a computer, it vaporizes your body in the process. Now, you’re trapped inside of a computer and are looking out through the screen.”

Downloaded

Players must then communicate with the story’s main protagonist to figure out how to return to their bodies and the real world. Rankin says this creates dozens of possible storylines and perspectives. “In VR, once you’ve established the mechanism that you can digitize a human consciousness, then there’s nothing stopping you from emailing yourself to somebody’s phone and then interacting with a whole different kind of story,” he says.

Around the same time, he produced his first live music event in VR that sent a digital twin of a London DJ — motion-captured in front of a live audience — into the raving metaverse. He has high hopes for this kind of VR content, which he thinks could help reduce carbon footprints worldwide. “It was very hard to convince investors that it was worth their money,” he says. “That is, until the pandemic hit. Suddenly, people were like, ‘Now I understand why people might want to go to a rave virtually.’” A project he created on the social VR platform Sansar for Glastonbury Music Festival’s VR zone, Lost Horizon, and for a music festival in Australia, soon followed.

Lost Horizon

That same year, Rankin was invited by Epic Games to become an Unreal virtual production fellow. “I already had a bit of familiarity with game engines from the VR stuff and from game development,” he says. “But I wasn’t at all familiar with the Unreal Engine.” His short film Orthogonal was the result of his six intense weeks learning how to use it. “Unreal’s visual quality out of the box far exceeds what you can do in most of the competing platforms.”

Pansensory’s team grows and contracts from project to project: 10 worked with him on Lost Horizon and 20 on VR Jam. “With both VR Jam and Lost Horizon, we are live entertainment first, social media second. But gamification is also a big part of what we’re starting to offer,” he adds, referring to easter eggs that unlock free T-shirts and hidden passageways.

What inspires him most about the rise in virtual filmmaking? “It’s really great that traditional filmmakers are finally starting to recognize the benefits of virtual production, and a lot of creative technologists are making the interfaces that mimic the way films have always been made. That eases the transition for everyone and flattens the learning curve.”

Lost Horizon

Two of his upcoming passion projects might also help change minds and effect the kind of transformation he’s always believed could happen with VR. One tells the immersive tale of young people whose lives are already being impacted by the climate crisis. “It uses a combination of volumetric performance capture and photogrammetry to transport you into the homes or towns or locales of these young people as they tell you their story,” he says. “We’re also going to make it available on people’s smartphones and web browsers because it’s such important content that we need to be able to reach everybody, even if they don’t have a VR headset.”

Another follows his sideline passion for political spoken-word poetry. The immersive hip-hop opera, called Rhymes with the Times, transports five to 10 people into surreal worlds that are art-directed to match the themes of each poem. While in the VR space, they collaborate to solve a puzzle or complete a task.

Lost Horizon

“For me, the power of VR storytelling is all about perspective. It’s the best way to put somebody into somebody else’s shoes and have them see the world through somebody else’s eyes. This is how to break down their biases and assumptions. It’s definitely a democratizing technology, and that’s definitely a force for good. But if that same technology is used to advertise meaningless consumer products or to manipulate people’s political views, it loses that value.”

As long as creators continue to push boundaries and share what they find, he is very optimistic. “It’s amazing how openly everyone in this space is sharing what they’re learning and learning off each other. It’s definitely accelerating our uptake of this stuff and the advancement of technology and technique.”


Beth Marchant writes about entertainment technology and craft for The Los Angeles Times, IndieWire and other industry outlets. Through her content consultancy HudsonLine Media + Marketing, she is a ghostwriter for Fortune 500 companies including FedEx and Verizon. Follow her on Twitter and Instagram @bethmarchant.
