By Karen Moltenbrey
For a while now, there has been reluctance in the visual effects industry to adopt cloud-based workflows, especially in terms of content security. But when COVID-19 forced studios into a remote work situation, there was little choice: take the leap and trust the security measures, or cease operation in the midst of deadlines. No choice, really, if a studio wanted to stay in business.
Some, particularly smaller facilities, had their work cut out for them, cobbling together a solid, secure infrastructure to get back to business. Larger VFX houses, like ILM and Stargate Studios, were already using private clouds that required very little adjustment to accommodate at-home artists.
“When it comes to working from home, everyone has had to try it, at least. From client studios to VFX vendors and post houses, they are realizing it can work and it can be secure,” says ILM’s Francois Chardavoine. “I’ve seen a lot of appetite at all levels to really revisit our approach to working this way. So yes, the shutdown has definitely been a catalyst to consider more of a cloud workflow.”
Here, ILM and Stargate discuss how their cloud setups and workflows enabled them to pivot to an all-remote workforce during the COVID-19 outbreak … and never miss a beat in the process.
Industrial Light & Magic
When studios around the globe had to shut down suddenly due to COVID-19, Industrial Light & Magic (ILM) was a few steps ahead of most facilities, since it had already been leveraging a private cloud infrastructure for various workflows. Because of this, the studio was able to move close to 1,000 employees to a work-from-home scenario literally overnight, followed by another 1,000 globally within the next two weeks.
Francois Chardavoine
ILM has not added workstations, per se, in the cloud for artists to use directly, nor has the facility leveraged the setup for cloud computing to render and generate content. Rather, the studio runs services in the cloud. “In particular, we have certain systems around media review and asset sharing. We leverage it mostly for external collaboration rather than for internal workflows,” says Chardavoine, VP of technology for Lucasfilm.
ILM is synonymous with VFX blockbusters. Not only does the studio reign over the billion-dollar Star Wars films, it also plays major roles in other cutting-edge visual effects films, including The Irishman, Terminator: Dark Fate, Avengers: Endgame, Jurassic World: Fallen Kingdom and many others, in a lineage dating back to the early days of the VFX blockbuster. Due to the sensitive nature of the content and the strict content security requirements imposed by clients, the MPAA and the Trusted Partner Network (TPN), the visual effects house needs full control over the ecosystem it manages, since it is responsible and liable for its security.
“We can absolutely guarantee to our clients that our workflows are secure,” says Chardavoine. “Even when we extend into a public cloud, we treat it as an extension of an internal cloud, where we manage the software infrastructure entirely.” And even though all of ILM’s processes are primarily internal, within its private cloud, the studio has the ability to extend them into the public cloud when necessary. “Practically speaking, though, we haven’t needed to do that very often,” Chardavoine adds.
The Irishman
In a private-cloud setup such as this, the studio purchases and owns the equipment, which resides in a location managed or owned by the studio. Thus, the company can leverage the equipment for other uses. For instance, ILM can run experimental machine-learning algorithms over the weekend if the infrastructure is available, enabling it to push R&D in a way that would perhaps be too costly on a public cloud network, where users pay according to usage.
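That economics argument is easy to picture in code. The sketch below is purely illustrative and is not ILM's tooling: it submits low-priority experiments only on weekends, and only when an owned farm is mostly idle, where the marginal cost of a run is near zero. The farmctl command, its flags and the idle threshold are hypothetical stand-ins for a real farm manager's interface.

```python
"""Illustrative sketch of opportunistic scheduling on an owned render farm:
run experimental ML jobs only when the farm sits idle (e.g., weekends),
where the marginal cost is near zero. 'farmctl' is a hypothetical CLI."""

import datetime
import subprocess

IDLE_THRESHOLD = 0.8   # hypothetical: fraction of hosts that must be free


def farm_idle_fraction() -> float:
    """Ask the (hypothetical) farm manager what fraction of hosts is unclaimed."""
    out = subprocess.run(["farmctl", "idle-fraction"],   # stand-in command
                         capture_output=True, text=True, check=True)
    return float(out.stdout.strip())


def maybe_submit_experiments(job_script: str) -> bool:
    """Submit low-priority ML experiments only on an idle weekend farm."""
    is_weekend = datetime.date.today().weekday() >= 5    # Sat=5, Sun=6
    if is_weekend and farm_idle_fraction() >= IDLE_THRESHOLD:
        # Low priority so any paying production work preempts the experiment.
        subprocess.run(["farmctl", "submit", "--low-priority", job_script],
                       check=True)
        return True
    return False
```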
ILM has been steadily moving to a private cloud for years. The company has been in business for 45 years, and its San Francisco location has been growing and evolving digitally over at least the past two decades, while its newer locations (Singapore, Vancouver, London and Sydney) were established from scratch. “Each time you add a new location, you have the opportunity to evaluate what you would do differently and improve on that. So every studio has had a slightly improved or newer version [of the cloud] than we had [at the one prior],” says Chardavoine.
From a security standpoint, Chardavoine refers to the ILM cloud as a “walled garden,” where users can collaborate behind those walls and access many things, but outside those walls, there is no penetration. “Essentially, everyone was at the office, which was inside the walls, where all the machines and the networking takes place,” he explains. “Once you push everyone outside those walls to work from home, you have to adapt your workflows in a way that remains as secure as possible while still allowing everybody to be productive and continue doing work.”
ILM Vancouver
Before the pandemic, ILM had data centers with rack-mounted machines serving each location (San Francisco and Vancouver share the one in San Francisco), either in the same building as the studio or in a nearby locale. Users either had computers under their desks at the studio or displays connected to workstations in those data centers. When the work-at-home notification came, the IT staff had a very small window to pivot.
ILM’s Singapore studio became an early test bed as the virus spread globally. “We wondered, what would we do if we had to turn on a dime and send everyone home? So, we ran a test there involving 30 people across all disciplines and told them to go work from home,” says Chardavoine. “We were fortunate to learn a lot from that experience, and it was much more successful than we had expected.”
Chardavoine attributes that success to changing as little as possible so as not to compromise all the secure workflows that already had been established. “We just wanted to have the employees sitting in a different location and looking at their screens in a different location and not change anything else,” he adds.
This was done using a common approach leveraging a virtual private network, or VPN, which creates an encrypted connection between the home user’s computer (a high-end machine is not required) and the machines at the ILM offices, which retained the same secure infrastructure that was always there.
Brave New World
“The employees are actually working on their machines that are still under their desks or in the data center at work,” Chardavoine says. “They are working on these remote machines and are streaming the pixels of the display back to where they are at home. It’s not really changing the workflow for anyone other than just having to connect from their home.”
Unlike the high-end hardware at the studio, at home, users only need a basic computer that can handle email, join a videoconferencing call, connect remotely and run antivirus and secure-connection software. The content-creation software, meanwhile, resides within the data center just as it had before the shutdown, as the artists are not working on local content.
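ILM hasn’t published the specifics of its remote-display stack, but the pattern Chardavoine describes, an encrypted tunnel to a workstation that never leaves the walled garden, with only display pixels streamed home, can be sketched in a few lines. The hostnames, ports and user below are hypothetical, and VNC over SSH is just one stand-in for whatever remote-display protocol a studio actually deploys.

```python
#!/usr/bin/env python3
"""Minimal thin-client sketch: forward a studio workstation's VNC display
through an encrypted SSH tunnel, so only pixels leave the studio network.
All hostnames, ports and usernames are hypothetical examples."""

import subprocess

GATEWAY = "vpn-gateway.example-studio.com"   # hypothetical bastion host
WORKSTATION = "ws-042.internal"              # artist's machine inside the walls
LOCAL_PORT = 5901                            # local end of the tunnel (display :1)
REMOTE_VNC_PORT = 5900                       # VNC server on the workstation
USER = "artist"


def open_tunnel() -> subprocess.Popen:
    """Open an SSH tunnel: localhost:5901 -> workstation:5900, via the gateway."""
    return subprocess.Popen([
        "ssh", "-N",                          # no remote command, tunnel only
        "-L", f"{LOCAL_PORT}:{WORKSTATION}:{REMOTE_VNC_PORT}",
        f"{USER}@{GATEWAY}",
    ])


if __name__ == "__main__":
    tunnel = open_tunnel()
    try:
        # A VNC viewer pointed at local display :1 (TCP port 5901) now shows
        # the studio desktop; project files and renders never leave the
        # data center -- only the encrypted pixel stream travels home.
        subprocess.run(["vncviewer", "localhost:1"], check=False)
    finally:
        tunnel.terminate()
```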
When the shutdown occurred, ILM artists were working on approximately 20 projects ranging from the full scope of blockbusters to smaller portions of movies and even theme-park ride films – and they took on about a dozen more projects during this time. Some of the projects delivered during the shutdown include Peacock’s new series Brave New World and features such as Jungle Cruise and Free Guy, to name a few.
“We found that the number of shots we were delivering each week did not decrease at all. There were some hiccups that very first week as everyone was getting used to things, but then everybody hit their stride,” Chardavoine notes.
ILM Vancouver
There are many benefits of working remotely for both the artist and the company, including the ability to hire more diverse talent and to provide a more convenient work schedule. An employee survey found that after several weeks, artists often felt more productive and happier than they would have been otherwise.
Alas, a remote scenario is not for everyone, especially in a creative industry where artists thrive on human interaction. A split workforce can also have an adverse effect on company culture. And bandwidth problems still arise.
Still, not every function can be done remotely at this time. “We’re never going to be able to get around doing quality control in the conditions in which the media is meant to be viewed, especially for theatrical work that you need to see in high resolution on a big screen with the right Atmos audio — to make sure the lip movement of a CG character is truly aligned and synched with the audio you’d be hearing at the theater,” Chardavoine explains.
Stargate Studios
Stargate Studios began a cloud workflow about two decades ago – years before it became a recognizable industry practice. Thirty years ago, Sam Nicholson, ASC, founded Stargate in Los Angeles, later opening a second site in Vancouver, quickly followed by multiple international studios. To facilitate work between the sites, he established a cross-facility transfer system — in essence, a private cloud network. Expansion continued as he built the studio into an international virtual production company with brick-and-mortar locations in seven locales around the world, in addition to four virtual sites.
Sam Nicholson
Prior to COVID-19, only about half the staff in the LA studio had been working on-site. Then the virus hit and forced his employees into a remote work situation. “The VFX industry has proven to be remarkably resilient in the face of this crisis; we all went home and logged in to our computers, and it was business as usual,” Nicholson says.
While some facilities turn to public cloud services, especially for overflow situations when burst rendering is needed, such an option is expensive. Instead, Stargate uses its own machines across its various sites, configured in a private cloud, to render locally. “All the locations are connected together and trading horsepower, artists and software; we float licenses across the facilities. We just send instructions out and then render locally,” he explains. “We can combine all the horsepower of five or six locations into essentially one private cloud. That said, jobs will overflow sometimes, and then we turn to Amazon and use AWS to get, say, another 1,000 machines online for a week or however long we need. That kind of business model is much more efficient, and it’s much more flexible.”
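Nicholson doesn’t detail how Stargate requests that overflow capacity, but the burst pattern he describes, rendering on the owned farm until it saturates and leasing public-cloud machines only for the spillover, might look roughly like the sketch below. The farm capacity, AMI ID, instance type and the 1,000-machine cap are illustrative assumptions, not Stargate’s actual configuration.

```python
"""Hedged sketch of burst rendering: render on the private farm until its
queue saturates, then lease extra machines from AWS EC2 for the overflow.
Capacity numbers, AMI ID and instance type are illustrative assumptions."""

import boto3

FARM_CAPACITY = 500                      # hypothetical private-farm slot count
BURST_AMI = "ami-0123456789abcdef0"      # hypothetical image with render software
BURST_TYPE = "c5.4xlarge"                # hypothetical instance type


def burst_if_saturated(queued_frames: int) -> int:
    """Request Spot instances to absorb frames the private farm can't take.

    Returns the number of machines leased (zero if the farm can cope).
    """
    overflow = queued_frames - FARM_CAPACITY
    if overflow <= 0:
        return 0                         # local farm handles the whole job

    # Crude mapping of one leased machine per overflow frame, capped at the
    # kind of 1,000-machine burst Nicholson mentions.
    count = min(overflow, 1000)
    ec2 = boto3.client("ec2")
    ec2.run_instances(
        ImageId=BURST_AMI,
        InstanceType=BURST_TYPE,
        MinCount=count,
        MaxCount=count,
        InstanceMarketOptions={"MarketType": "spot"},  # pay only while rendering
    )
    return count
```

The appeal of the model is in that last parameter: Spot capacity is billed only while the job runs, so the studio owns hardware sized for the steady state and rents the peaks.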
The private cloud supports not only cross-facility rendering but, as Nicholson points out, cross-facility software and cross-facility artist availability, since the manpower itself, along with the tools and hardware, is spread out. This eliminates the situation of having some artists in a location without any work while others struggle to finish jobs. It also provides Stargate with a presence in various local markets and enables cross-facility education, what Nicholson calls the “Stargate Exchange.”
Stargate used its ThruView system to play back a following train car in-camera on the set of Run.
While Stargate had been enjoying the benefits of this private cloud network for quite some time, when COVID-19 hit and forced a remote work situation, the studio didn’t miss a step. “I told everybody to take a machine (each artist has three on average), go home, and we’d VPN all the computers together,” says Nicholson. “So now, instead of having six basic locations — or even 10 — that employees are logging in from, we are online with 200 individuals logging into our primary facilities in North America, Latin America, Europe and Asia.”
The process to make that happen was practically seamless. As it turns out, Stargate had been VPN’ing artists into the network for some time, particularly matte painters, who typically have longer lead times and whose work involves much smaller files. “It’s so easy to remote in a matte painter and relatively easy to remote in 3D [artists]. It’s somewhat complex to remote 2D compositing, though,” says Nicholson. “But, if you start with matte painters, you get used to it, and then you get used to 3D. Eventually you say, ‘We can do this with all our 2D artists.’”
Shooting in-camera VFX for Run’s train scenes.
In fact, Nicholson believes it was a natural progression to remote in the staff; COVID-19 just made it happen sooner. Overnight, actually. “No one went into the facility. The power was left on and the lights are off, but all the machines stay active globally 24/7,” he says. “All the horsepower stays online, and people just remote in. It turns out to be, in some ways, more successful; I am able to communicate with my artists online from my home studio, and I can reach anyone any time of the day. It’s faster than walking down the hall and trying to find someone. And you can have three, four, five conversations going at once, which is a lot more difficult to do in person because you can get pigeon-holed into an office or a discussion. Working remotely, there are fewer distractions. Our artists, producers and staff can manage their time more effectively, which translates to being happier and more productive.”
HBO’s Run
Stargate uses industry-leading software, most of which is cloned across the locations, because the software configuration across the network has to be identical for the setup to work. “I can’t send you a file if you don’t have the software to run it,” Nicholson points out, adding that the machines at each location auto-reboot and update nightly.
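A minimal sketch of what that nightly discipline could look like, assuming a single version-pinned manifest shared by every site, appears below. The manifest path, package names and the studio-pkg installer are hypothetical stand-ins for whatever configuration-management tool a studio actually runs.

```python
"""Sketch of a nightly sync: pin every node at every site to one software
manifest so any location can open any file. The manifest path, package
versions and the 'studio-pkg' installer are hypothetical stand-ins."""

import json
import subprocess
from pathlib import Path

MANIFEST = Path("/studio/config/software_manifest.json")  # hypothetical path


def sync_node() -> None:
    """Install the pinned package versions, then reboot into the clean state."""
    manifest = json.loads(MANIFEST.read_text())
    for package, version in manifest["packages"].items():
        # e.g., {"nuke": "12.2v4", "maya": "2020.3"} -- versions must match
        # across every location, or a file sent between sites won't open
        # identically ("I can't send you a file if you don't have the
        # software to run it").
        subprocess.run(["studio-pkg", "install", f"{package}=={version}"],
                       check=True)
    # Nightly reboot (requires root) so every node starts the day identical.
    subprocess.run(["shutdown", "-r", "now"], check=True)


if __name__ == "__main__":
    sync_node()
```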
Although the company turns to Amazon’s AWS only as needed for burst rendering, it does rely on the public cloud routinely for data transfer and storage. In Los Angeles, Stargate keeps about 3PB of storage online to handle its extensive digital library of virtual locations, what it calls the Stargate Virtual Backlot.
“The concept of virtual production is something we’ve been working on for many years with virtualized locations,” says Nicholson. “We extensively cover a location with multiple cameras, wide shots, medium shots, close-ups, LIDAR and photogrammetry. This requires a tremendous amount of data, since you are shooting 360 with as many as eight cameras at once in 4K or 8K. You can spin up a lot of data quickly. For a typical shoot now, we come back with 200TB of data. We need to be able to store all that material and access it globally, and the best way to do that is the AWS cloud.”
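The article doesn’t describe Stargate’s ingest tooling, but pushing a shoot’s camera originals into object storage so every site can pull from one shared backlot could be sketched as below. The bucket name, prefix scheme and local paths are hypothetical.

```python
"""Hedged sketch: mirror a shoot's camera originals into S3 so every site
can pull from one shared 'virtual backlot'. The bucket name, prefix scheme
and local paths are hypothetical."""

import boto3
from pathlib import Path

BUCKET = "studio-virtual-backlot"        # hypothetical bucket name


def upload_shoot(local_dir: str, location_name: str) -> int:
    """Mirror a local shoot directory to S3 under the location's prefix.

    Returns the number of files uploaded. For a 200TB shoot this would run
    for a long time; boto3 handles multipart uploads for large files.
    """
    s3 = boto3.client("s3")
    uploaded = 0
    for path in Path(local_dir).rglob("*"):
        if path.is_file():
            key = f"{location_name}/{path.relative_to(local_dir).as_posix()}"
            s3.upload_file(str(path), BUCKET, key)
            uploaded += 1
    return uploaded


# e.g., upload_shoot("/mnt/shoot_day_01", "grand-central-360")
```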
As Nicholson points out, Stargate’s success lies in its ability to reinvent itself, which it has done a number of times: first stepping from film into the digital realm, and now stepping from digital into the virtual realm, where it is using the cloud and virtual sets and embracing cutting-edge technologies, including its new ThruView process, which eliminates the need for greenscreen.
Helping The Resident go zero gravity.
It all starts with a stable, consistent workflow, which Stargate found with television productions. When the virus hit, Stargate LA and Toronto were in the middle of delivering an HBO series called Run, with 10 episodes (3,000 ThruView shots) due, as well as hundreds of traditional 2D and 3D visual effects. According to Nicholson, HBO called and asked whether the work-at-home order would affect the delivery date, to which he responded, “No way, we’re going to hit all our deadlines.” And they did.
During the shutdown, Stargate Atlanta and Stargate Malta were working on other projects, including the Fox medical drama The Resident, which they were able to finish and deliver on time. Now, the delivery schedule has slowed somewhat, giving the company time to accelerate its R&D and tune portions of its pipeline to get ready for what Nicholson believes is going to be a tremendous upswing when production begins again in earnest.
Karen Moltenbrey is a veteran writer covering visual effects and post production.