Tag Archives: ILM

Indiana Jones

VFX Supervisors Talk De-Aging Indiana Jones and More

By Iain Blair

Directed by James Mangold, Indiana Jones and the Dial of Destiny is the long-awaited final chapter in the beloved saga, with Harrison Ford reprising his role as the whip-smart archaeologist one last time.

Andrew Whitehurst

Featuring huge set pieces showcasing spectacular VFX, including the opening train sequence, the tuk-tuk chase, the horseback chase through the New York ticker-tape parade, an underwater dive in Greece and the grand finale, the globe-trotting adventure was filmed on location in Morocco, Sicily, Scotland and England, in addition to stages at Pinewood Studios.

I spoke with the film’s overall VFX supervisor, Oscar winner Andrew Whitehurst, and ILM VFX supervisor Robert Weaver about the challenges of creating the VFX and working with Mangold (Ford v Ferrari, Walk the Line).

What were the big challenges of creating the VFX for this?
Andrew Whitehurst: The biggest challenge of any film like this, which is a journey and full of locations, is that almost every single scene is a challenge in itself. We’re either in a different place or era or time of day, or we’re following different characters, so it’s not a movie set in one place where you can refine and tweak the look of that place. We have a whole swathe of very different kinds of work and environments that we had to create. So it was the huge scope and scale across the film that was the most challenging aspect of doing all the VFX.

How many VFX shots are there, and who did what? Break it down for us.
Whitehurst: We did roughly 2,350 VFX shots. It’s a lot, but then there’s a lot going on. We had a lot of vendors, and they all did little bits here and there, but in terms of the major work, ILM did the opening act with the whole Nazi train sequence as well as the third act.

Rising Sun did all the New York work, including the streets, the airport, the rooftops and the subway sequences. Soho did the tuk-tuk chase sequence set in Morocco, and ILP (Important Looking Pirates) did the wreck dive set in Greece, along with a lot of sea plate extensions and the creepy-crawlies in the tomb. We also had an in-house team that did a lot of window extension work… putting stuff outside apartment windows, that sort of thing.

Similarly, a lot of our other vendors, including The Yard, MidasVFX, Capital T and Firestorm, also did that kind of work — changing backgrounds out the windows.

Of all the huge set pieces, what was the toughest to deal with and why?
Whitehurst: They were all equally challenging and all vast in scope and complexity. For instance, we couldn’t shoot the 1969 astronaut parade sequence in Manhattan, and so much has changed with the buildings, so we shot in Glasgow, which does a reasonable impersonation of 1969 New York. The art department did an amazing job. We had over 1,000 extras, cars and the horse, and we extended the streets up higher and out longer and added all the confetti with VFX.

The challenge on the dive sequence was that we shot some underwater stuff in the Mediterranean and the rest in a tank at Pinewood. We also shot some of it dry for wet because we needed that control over the performances.

Did you 3D-scan all the locations and sets?
Whitehurst: Yes, we scanned and photographed absolutely everything, including places we didn’t actually film at but thought would be useful for generating material for backgrounds. That was crucial. Clear Angle did it all.

DP Phedon Papamichael, ASC, told me that integrating all the VFX with your team was crucial, and he had you on set in the DIT tent?
Whitehurst: Yes, I was basically on set every day and in the tent with his DIT, Ben Appleton, for the whole shoot. And Robert was with us for some of the more specific stuff that ILM was doing.

Robert Weaver: I was there for the whole opening sequence with the train, for all the stage work. I was there for about a month in the DIT tent while also working with the various grips and The Blues Brothers trying to get bluescreen coverage where we needed it, dealing with changing camera angles and so on.

Robert Weaver

Robert, tell us about the complex process involved in de-aging Harrison Ford from his late 70s to his late 30s. Was delivering a convincingly youthful Indiana Jones the most challenging job for the team at ILM?
Weaver: I think it was, because it’s the whole opening part of the film, and if it didn’t work, you’d probably lose the audience. We use a whole bunch of components that go into doing a face swap, basically replacing one face with another. We have our proprietary face replacement technology, ILM FaceSwap, which uses every nuance and detail of Harrison’s performance on set. It all boils down to having a good repository of imagery to work from, either as reference or as material to process through machine learning.

But more importantly, it’s down to the artistic skills and abilities of individuals pulling key components to help create the final overall image we needed. It also involved building a complete CG asset of Harrison and then a younger version of that as well. So it’s a multi-faceted process that combines the latest technology with archival imagery, and it took a lot of R&D. We had to figure out new ways of exactly how to do it because it’s an ever-evolving process. In terms of the de-aging result, I think we’ve achieved the best that’s been done anywhere to date.

Indiana Jones

Whitehurst: One of the key things to understand is that FaceSwap is an umbrella term for a lot of different technologies, whether it’s using reference photography, machine learning or multi-camera shooting on the day and then being able to extract 3D geometry and then remapping that onto a different face.

So there’s a lot of different approaches and techniques that we can use, and we did use every single one of them. Each shot would end up using more of one technique than another; there was no one process or method that worked for everything. It was shot by shot, and it all comes down to performance, and Harrison’s driving it.
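That multi-technique pipeline is proprietary, but the last step of any face swap is a compositing problem: blending the generated or re-projected face back into the original photography without a visible seam. Below is a minimal, purely illustrative Python sketch of that final blend using OpenCV’s Poisson (seamless) cloning; the file names, the precomputed alignment and the mask are hypothetical stand-ins, not ILM’s FaceSwap tools.

```python
# Illustrative only: a bare-bones face-composite step using OpenCV's
# Poisson (seamless) cloning. All inputs are hypothetical; ILM's
# FaceSwap pipeline is proprietary and far more sophisticated.
import cv2
import numpy as np

plate = cv2.imread("plate_frame.png")       # original photography (older face)
synth = cv2.imread("synth_face_frame.png")  # generated younger face, pre-aligned to the plate
mask = cv2.imread("face_mask.png", cv2.IMREAD_GRAYSCALE)  # white where the new face goes

# seamlessClone anchors the paste at a point; use the mask's centroid
ys, xs = np.nonzero(mask)
center = (int(xs.mean()), int(ys.mean()))

# Poisson blending matches image gradients at the seam, so skin tone and
# lighting roll smoothly from the swapped face into the surrounding plate
result = cv2.seamlessClone(synth, plate, mask, center, cv2.NORMAL_CLONE)
cv2.imwrite("composite_frame.png", result)
```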

Weaver: And Harrison is so fit that he was able to do pretty much everything he was asked to do, which was invaluable for us. It’s pretty amazing for a guy who is now in his 80s.

Indiana Jones

You’ve both worked on a lot of huge projects. Where does this rate in terms of complexity and challenges?
Whitehurst: It’s the biggest movie I’ve ever done. It’s certainly the longest I’ve ever been on one project. I’ve been working on it for three years, from early prep all the way to the end.

I get to look back on the earliest conversations we had about what we might try and do with all the VFX, and then doing all the early previz for the opening sequence and how it might work. To see all that go through the shoot and then post, and layering in all the VFX and to see how it all evolved in the edit and how we tweaked stuff… it was very creatively satisfying.

Weaver: It’s the same for me. I’d say it’s top of the list of all the films I’ve ever worked on. It was incredibly challenging but so rewarding to work through it and see the results.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Black Panther: Wakanda Forever’s Oscar-Nominated Visual Effects

By Iain Blair

Black Panther: Wakanda Forever, the follow-up to 2018’s blockbuster Black Panther, was nominated for five Oscars this year, including one for Best Visual Effects. Once again directed by co-writer Ryan Coogler, the action-adventure-suspense-thriller Wakanda Forever features a cast of thousands – and behind the scenes, another army of VFX artists and technicians from a raft of companies including Digital Domain, Weta, ILM and Cinesite, all overseen by VFX supervisor Geoff Baumann.

Wakanda

Geoffrey Baumann

I recently spoke with Baumann — who was nominated for his work along with Craig Hammack, R. Christopher White and Dan Sudick — about creating all the VFX and the production pipeline. Additional VFX supervisor Michael Ralla and Weta FX VFX supervisor White joined the conversation.

What were the big challenges of creating so many VFX?
Geoff Baumann: The biggest one was the world-building of the underwater city of Talokan. That encompassed the world itself and defining who these people are, but the fact that it’s underwater created a whole other set of problems, both from a VFX standpoint and a practical one. The feedback loop of communication between the talent underwater and Ryan and everyone involved was also a huge challenge.

Of course, Wakanda itself was also a huge challenge because of its sheer scale. But even though it was so complex in terms of the CG work, we’d already established it in the first film, so it was a known entity.

Wakanda

Michael Ralla

I assume you worked very closely with DP Autumn Durald?
Baumann: Yes, very. We had to really embrace her shooting style and choice of lighting and lenses in order to make our CG work match the practical. We did a lot of previz to that end, and our previz supervisor at Digital Domain, Scott Meadows, sat down with Autumn so we could render shots the same way she would shoot them. That was also a big challenge, as we did some previz early on, before she was involved, and it turned out to feature angles she wouldn’t shoot.

She generally shoots and frames shots very low and always backlit, regardless of continuity. The camera’s usually tilted up so you see more ceilings and skies than ground, and there’s a lot of negative fill. So we embraced all that, and previz did as well. But I’d say that previz wasn’t as much of a bible on this as it usually is on other big, complex VFX films like this. There was definitely a lot of room for Ryan and Autumn to explore any ideas on the day of the shoot, and there were days when we didn’t shoot the previz at all. But I’d say it really helped them.

Michael Ralla: This film looks very different from anything else in the MCU. That approach was clear right from the start, and one we both really embraced. Otherwise it would have been impossible to finish the film.

Geoff, how did you and Michael work together as two VFX supervisors?
Baumann: This project was huge. We had over 2,400 VFX shots, and then our whole post schedule was extremely short. On top of that, we had to deal with losing Chadwick, and then injuries and COVID. In general, Marvel tends to hire additional VFX supervisors to help support the process and oversee the second-unit coverage, and they now stay on through all of post.

Eight years or so ago, the additional VFX supervisor would come on for the shoot and then leave, but we discovered that there was still so much VFX work to be done, with thousands of shots, that it just didn’t make sense to let that person go. So Michael came on in prep and helped plan all of the second-unit work. He was responsible for all that and all the specialty rigs and dealing with the crew. Then he oversaw all that material in post, along with the vendors, and made sure it was executed the way it was intended.

Michael, what were the key sequences you were in charge of?
Ralla: The first was the big Boston chase sequence in and around MIT. Then pretty much everything that happened on the bridge, even though there was a bit of a split from the main unit. Then there was the big opening sequence called “the mining mission,” which was also shot entirely on location outside Atlanta. Both of those sequences were fairly stunt-heavy, with more technical complexity because of all the action, and more prep was needed. Then there were various bits and pieces sprinkled throughout the shoot, and Geoff directed 30 days of underwater shooting as well.

Geoff and I go way back to Digital Domain, where we worked together years ago, and the chance to reteam with him and work with Autumn was a big part of why I got involved in this.

Geoff, how early on did you start working on the big underwater sequence and VFX?
Baumann: Ryan decided pretty early on that he wanted to shoot as much of it practically as possible – wet for wet — so there was a big drive to do that, and we shot every underwater sequence anamorphic in a big tank first. So even though we had to do a lot of extensions to create the underwater world, we had that real-life interaction of characters and water. Then we did dry for wet and were able to replicate a lot of the wet-for-wet movements and setups exactly. That allowed Ryan and Autumn to make any adjustments they wanted.

How was the work divided up between all the vendors?
Baumann: Digital Domain did about 450 shots, and Cinesite in London and Montreal did about 300 shots. Weta did nearly all the underwater sequences, including the journey to Talokan and the mining mission at the start of the film. Weta had fewer shots, around 200, but they were all very complex, and they were our first partner. ILM did the bulk — over 450 shots — and was responsible for Wakanda and the Golden City. Digital Domain was mainly responsible for the third act, the battle on the ship at the end and the desert battle, along with a bit of the underwater stuff.

We also had some shared shots in the third act and various other vendors — including Rise, who had the third most shots. In the end we had over 17 vendors doing additional VFX, including Storm, Perception, SDFX, Luma, Base, Barnstorm, Basilic Fly, Digikore, Mist and Studio8. I’ve worked on many big movies, but this has to be one of the most complex and challenging ever in terms of what we did.

Chris White, Weta FX’s VFX Supervisor

Can you break down the main VFX sequences you worked on?
White: Our focus was on the sequences that took place deep underwater. This included the city of Talokan, the mining mission at the beginning of the film, the cenote dives of the shaman and Nakia, and the deep-water shots in the third act battle.

What were the big challenges of creating so many underwater VFX?
White: Building the underwater city of Talokan was the biggest challenge, as water work inherently requires both technical and creative problem-solving. Our goal was to stay true to the look of deep-water environments while having the creative freedom to tell the story.

How long did the whole process take? How many people were involved?
White: Work on the film lasted over 24 months with a crew of close to 400. The early days of the project were directed toward research, building tools, previz and look development.

What tools did you use?
White: For our water rendering and simulation, we used proprietary in-house tools. Our spectral renderer, Manuka, provided realistic renders of our environments, characters and vehicles. Simulations for clothing, marine snow, bubbles and explosion effects were also created with proprietary tools.

Did you use any cutting-edge technology?
White: We developed new tools to digitally replicate real-world lens effects. These tools allowed us to give our CG shots the same look and feel as the shot footage.

What was the most difficult VFX sequence to do and why?
White: The most challenging sequence was the journey through the city of Talokan. In a short amount of time, we needed to get a sense of the culture, explore the city and meet the inhabitants, all while maintaining a realistic, deep-underwater look. Pre- and post-visualization tools allowed us to sketch out this journey quickly, ensuring we hit the right story beats and maintained the pace and intention of the sequence.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

ILM Expands StageCraft LED Virtual Production to Vancouver

Visual effects and virtual production studio Industrial Light & Magic (ILM) is expanding its StageCraft LED virtual production services to Vancouver. The stage, at over 20,000 square feet, is expected to house one of the largest StageCraft LED volumes in North America. This will be ILM’s fifth permanent volume, supporting the growing demand for its StageCraft services and facilities around the globe. In addition to Vancouver, ILM runs three stages in Los Angeles as well as one at Pinewood Studios in London.

“Our Canadian artists have contributed to digital set construction as well as virtual art department work on a number of StageCraft projects, so it’s really exciting that we will soon have a volume in our own backyard,” says Spencer Kent, executive in charge of ILM’s Vancouver studio. “I’m also proud that we are actively recruiting with an eye towards hiring people from underrepresented communities who we can train in the field of virtual production, strengthening our commitment to diversifying our industry. This effort will be bolstered when we launch our upcoming Jedi Training Academy in January.”

ILM won two Emmy Awards for its real-time VFX work for the first two seasons of Lucasfilm’s Disney+ series The Mandalorian. ILM is currently in production on three episodic series and a feature film on its existing StageCraft volumes in addition to daily bookings for smaller productions, including commercials, music videos and product marketing pieces.

ILM also builds bespoke StageCraft volumes for projects with unique requirements, such as George Clooney’s The Midnight Sky and Taika Waititi’s upcoming Thor: Love and Thunder; those volumes were built on stages in London and Sydney, respectively. In addition, Disney Television Studios and ILM recently opened an additional StageCraft LED volume on the Disney lot in Burbank, built for episodic television production.

THE MIDNIGHT SKY (2020)
Cr: Philippe Antonello/NETFLIX

Says Rob Bredow, SVP/CCO of ILM, “Productions can leverage our proprietary Helios cinema engine or Unreal Engine within our StageCraft toolset and have the flexibility to create photoreal content with real-time flexibility for a wide range of shows.”

ILM intends for the Vancouver StageCraft volume to be operational in the spring of 2022. The company is actively recruiting for numerous virtual production roles, including talent specializing in virtual art department, digital environments, technical operations and other real-time visual effects positions leveraging the StageCraft toolset.

ILM Boosts Animation Team, Rehires Rob Coleman, Randal Shore

Industrial Light & Magic (ILM) has rehired Rob Coleman, who joins as key creative for feature animation at ILM’s Sydney studio. ILM also announced the return of Randal Shore, who will be rejoining the company’s Vancouver studio as an EP in feature animation.

Both Coleman and Shore are joining at a crucial chapter of innovation at ILM, overseen by Janet Lewin, who now helms the studio as both SVP and GM alongside Rob Bredow, ILM’s SVP and CCO. These additions come on the heels of the expansion of ILM’s virtual production and StageCraft technology and the accelerated growth of the company’s studios worldwide.

“Rob Coleman has a long history at Industrial Light & Magic, having worked alongside George Lucas on the prequel trilogy and Dave Filoni on The Clone Wars,” says Bredow. “With a 35-year tenure in the industry, he is a terrific mentor and powerhouse animation supervisor, with endless enthusiasm for the craft. Rob’s many years of creative supervision in animation make him the ideal fit for our upcoming projects, rejoining us at the perfect time as we work on multiple animated features in addition to a large number of visual effects shows.”

Coleman had a long and celebrated history at ILM as an animation supervisor before focusing on work within Lucasfilm Animation. He was twice nominated for an Academy Award for his work on Attack of the Clones and The Phantom Menace and received two BAFTA Award nominations for his work on Men in Black and The Phantom Menace. Prior to this, Coleman worked as an animator on projects such as The Mask, The Indian in the Cupboard and Dragonheart, to name a few. He moved to Lucasfilm Animation in 2005, providing development leadership on Star Wars: The Clone Wars. Most recently, he worked as an animation director at Dr. D Studios on Happy Feet Two and as head of animation at Animal Logic, supporting The Lego Movie, The Lego Batman Movie and Peter Rabbit.

During his tenure at ILM, Shore was the executive in charge of the studio’s Vancouver location, playing an instrumental role in both launching the studio and managing its growth over the years, as well as providing executive leadership on No Time to Die, Jungle Cruise, Black Widow and The Mandalorian.

Prior to joining ILM, Shore was at two other Canadian visual effects houses: The Moving Picture Company (MPC) and Prime Focus. At MPC, Shore was head of production, overseeing films such as Life of Pi, which was awarded the Oscar for Best Achievement in Visual Effects in 2013. During his tenure at Prime Focus as executive producer, he oversaw the company’s expansion from 45 artists to over 150 and worked on such films as The Tree of Life and Tron: Legacy.

“Every project Randal oversees benefits from his collaborative stewardship, and we’re thrilled that his journey has brought him back home to ILM,” says Lewin. “Randal’s wide breadth of executive leadership and animation experience makes him the perfect person to step into this new role. He will be developing and growing our feature animation strategy while partnering with our clients to guide their projects through to completion.”

Most recently, Shore was head of production for Tangent Animation, where he oversaw the upcoming Maya and the Three for Netflix, bringing him back to his creative roots of producing animation. With their return to ILM, Coleman and Shore will both play a key role in the company’s continued push into animation.

Main Image: (L-R) Rob Coleman and Randal Shore

ILM Expands Virtual Production, StageCraft Offerings

Industrial Light & Magic (ILM) has expanded its virtual production and StageCraft LED volume services. This, according to the company, is tied to an initiative to increase diversity in the industry by combining ILM’s growth in this area with a global trainee program geared for underrepresented VFX talent.

ILM’s existing StageCraft volume at Manhattan Beach Studios was used for the Lucasfilm Disney+ series The Mandalorian and will soon be joined by a second permanent StageCraft volume at the studio, servicing a variety of clients in the greater Los Angeles area. In addition, ILM is building a third permanent StageCraft volume at Pinewood Studios in London and a fourth large-scale custom volume at Fox Studios Australia to be used for Marvel’s feature Thor: Love and Thunder, directed by Taika Waititi.

ILM will also continue to provide “pop-up” custom volumes for clients as the company recently did for the Netflix production The Midnight Sky, directed by George Clooney.

An end-to-end virtual production solution, ILM StageCraft provides a continuous pipeline from initial exploration through scouting and art direction, traditional and technical previsualization, lighting and realtime production filming with the StageCraft LED volumes. In addition to The Mandalorian, an upcoming feature film also took advantage of the full complement of ILM StageCraft virtual production services.

Other projects such as Avengers: Endgame, Aquaman, Jurassic World: Fallen Kingdom, Battle at Big Rock, Rogue One: A Star Wars Story, Kong: Skull Island, Solo: A Star Wars Story, Ready Player One and Rango have used aspects of the toolset as well.

The new stages offer vast improvements over the original LED volume developed for the first season of The Mandalorian in 2018. Physically, the new stages are larger, use substantially more LED panels than ILM’s original stage and offer both higher resolution and smooth wall-to-ceiling transitions — resulting in better lighting on set as well as many more in-camera finals. ILM’s proprietary solutions for achieving fidelity on the LED walls at scale allow for higher color fidelity, higher scene complexity and greater control and reliability.

“With StageCraft, we have built an end-to-end virtual production service for key creatives. Directors, production designers, cinematographers, producers and visual effects supervisors can creatively collaborate, each bringing their collective expertise to the virtual aspects of production just as they do with traditional production,” says Janet Lewin, ILM’s SVP and GM.

Janet Lewin

“Over the past five years, we have made substantial investments in both our rendering technology and our virtual production toolset,” adds Rob Bredow, ILM’s CCO. “When combined with Industrial Light & Magic’s visual effects talent, motion capture experience, facial capture via Medusa, Anyma and Flux, and the production technology developed by ILM’s newly integrated Technoprops team, we believe we have a unique offering for the industry.”

Alongside the new stages, ILM is rolling out a global talent development initiative through the company’s long-standing Jedi Academy training program. The program, which is part of the company’s larger Global Diversity & Inclusion efforts, offers paid internships and apprenticeships on productions with seasoned ILM supervisors and producers who serve as mentors. The program is intended to fill roles across the virtual production and VFX pipeline with those from traditionally underrepresented backgrounds; ILM has posted expressions of interest for jobs across the spectrum, from virtual art department teams and production management to engineering and artist roles. The goal with this initiative is to attract diverse junior talent and create a pipeline for them to become future visual effects artists, technicians and producers who will be “ILM trained” and uniquely qualified to work in this new, innovative way of filmmaking.

“There is a widespread lack of diversity in the industry, and we are excited to leverage our global expansion in this game-changing workflow to hire and train new talent, providing viable, exciting and rewarding jobs across many of our locations,” notes ILM VP of operations Jessica Teach, who oversees the company’s Diversity and Inclusion initiatives. “We believe this program can have a multiplier effect, attracting even more diverse talent to the industry and creating a pipeline for visual effects careers. We know that bringing more diversity into the industry is a critical part of strengthening and expanding our storytelling potential.”

ILM expects to have the new stages up and running for production in London in February of 2021 and in Los Angeles in March, with a mix of projects from features to commercials in line to take advantage of them.

ILM and Stargate on VFX and Private Clouds

By Karen Moltenbrey

For a while now, there has been reluctance when it comes to a cloud-based visual effects workflow, especially in terms of content security. But when COVID-19 forced studios into a remote work situation, there was little choice – take the leap and trust the security measures or cease operation in the midst of deadlines. No choice, really, if a studio wants to stay in business.

Some, particularly smaller facilities, had their work cut out for them, cobbling together a solid, secure infrastructure to get back to business. Larger VFX houses, like ILM and Stargate Studios, were already using a private cloud that required very little adjustment to accommodate at-home artists.

“When it comes to working from home, everyone has had to try it, at least. From client studios to VFX vendors and post houses, they are realizing it can work and it can be secure,” says ILM’s Francois Chardavoine. “I’ve seen a lot of appetite at all levels to really revisit our approach to working this way. So yes, the shutdown has definitely been a catalyst to consider more of a cloud workflow.”

Here, ILM and Stargate discuss how their cloud setups and workflows enabled them to pivot and extend their setups to an all-remote workforce during the COVID-19 outbreak … and never miss a beat in the process.

Industrial Light & Magic

When studios around the globe had to shut down suddenly due to COVID-19, Industrial Light & Magic (ILM) was a few steps ahead of most facilities since it had already been leveraging a private cloud infrastructure for various workflows. Because of this, the studio was able to move close to 1,000 employees to a work-from-home scenario literally overnight, followed by another 1,000 globally within two weeks after that.

Francois Chardavoine

ILM has not added workstations, per se, in the cloud for artists to use directly, nor has the facility leveraged the setup for cloud computing to render and generate content. Rather, the studio runs services in the cloud. “In particular, we have certain systems around media review and asset sharing. We leverage it mostly for external collaboration rather than for internal workflows,” says Chardavoine, VP of technology for Lucasfilm.

ILM is synonymous with VFX blockbusters. Not only does the studio reign over the billion-dollar Star Wars films, it also plays major roles in other cutting-edge visual effects films, including The Irishman, Terminator: Dark Fate, Avengers: Endgame, Jurassic World: Fallen Kingdom and many others dating back to the early days of VFX blockbusters. Due to the sensitive nature of the content and the strict content security requirements imposed by clients and the MPAA and TPN, the visual effects house needs full control over the ecosystem that it manages since it is responsible and liable for its security.

“We can absolutely guarantee to our clients that our workflows are secure,” says Chardavoine. “Even when we extend into a public cloud, we treat it as an extension of an internal cloud, where we manage the software infrastructure entirely.” And even though all of ILM’s processes are primarily internal, within its private cloud, the studio has the ability to extend them into the public cloud when necessary. “Practically speaking, though, we haven’t needed to do that very often,” Chardavoine adds.

The Irishman

In a private-cloud setup such as this, the studio purchases and owns the equipment, and it’s in a location managed or owned by the studio. Thus, the company can leverage the equipment for other uses. For instance, ILM can run experimental machine-learning algorithms over the weekend, if the infrastructure is available, enabling them to push R&D in a way that would perhaps be too costly on a public cloud network, where users pay according to usage.

ILM has been steadily moving to a private cloud for years. The company has been in business for 45 years, and its San Francisco location has been growing and evolving digitally over at least the past two decades, while its newer locations (Singapore, Vancouver, London and Sydney) were established from scratch. “Each time you add a new location, you have the opportunity to evaluate what you would do differently and improve on that. So every studio has had a slightly improved or newer version [of the cloud] than we had [at the one prior],” says Chardavoine.

From a security standpoint, Chardavoine refers to the ILM cloud as a “walled garden,” where users can collaborate behind those walls and access many things, but outside those walls, there is no penetration. “Essentially, everyone was at the office, which was inside the walls, where all the machines and the networking takes place,” he explains. “Once you push everyone outside those walls to work from home, you have to adapt your workflows in a way that remains as secure as possible while still allowing everybody to be productive and continue doing work.”

ILM Vancouver

Before the pandemic, ILM had data centers with rack-mounted machines at each location (San Francisco and Vancouver connect to the same one in San Francisco), which are either in the same building as the studio or in a nearby locale. Users either had computers under their desks at the studio or displays connected to workstations in those data centers. When the work-at-home notification occurred, the IT staff had a very small window to pivot.

ILM’s Singapore studio became an early test bed as the virus spread globally. “We wondered, what would we do if we had to turn on a dime and send everyone home? So, we ran a test there involving 30 people across all disciplines and told them to go work from home,” says Chardavoine. “We were fortunate to learn a lot from that experience, and it was much more successful than we had expected.”

Chardavoine attributes that success to changing as little as possible so as not to compromise all the secure workflows that already had been established. “We just wanted to have the employees sitting in a different location and looking at their screens in a different location and not change anything else,” he adds.

This was done using a common approach leveraging a virtual private network, or VPN, which creates an encrypted connection between the home user’s computer (a high-end machine is not required) and those at the ILM offices, which retained the same secure infrastructure that was always there.

Brave New World

“The employees are actually working on their machines that are still under their desks or in the data center at work,” Chardavoine says. “They are working on these remote machines and are streaming the pixels of the display back to where they are at home. It’s not really changing the workflow for anyone other than just having to connect from their home.”

Unlike the high-end hardware at the studio, at home, users only need a basic computer that can handle email, join a videoconferencing call, connect remotely and run antivirus and secure-connection software. The content-creation software, meanwhile, resides within the data center just as it had before the shutdown, as the artists are not working on local content.

When the shutdown occurred, ILM artists were working on approximately 20 projects ranging from the full scope of blockbusters to smaller portions of movies and even theme-park ride films – and they took on about a dozen more projects during this time. Some of the projects delivered during the shutdown include Peacock’s new series Brave New World and features such as Jungle Cruise and Free Guy, to name a few.

“We found that the number of shots we were delivering each week did not decrease at all. There were some hiccups that very first week as everyone was getting used to things, but then everybody hit their stride,” Chardavoine notes.

ILM Vancouver

There are many benefits of working remotely for both the artist and the company, including the ability to hire more diverse talent and to provide a more convenient work schedule. An employee survey found that after several weeks, artists often felt more productive and happier than they would have been otherwise.

Alas, a remote scenario is not for everyone, especially in a creative industry where artists thrive on human interaction. Also, it can have an adverse effect on company culture with a split workforce. And bandwidth problems still arise.

Still, not every function can be done remotely at this time. “We’re never going to be able to get around doing quality control in the conditions in which the media is meant to be viewed, especially for theatrical work that you need to see in high resolution on a big screen with the right Atmos audio — to make sure the lip movement of a CG character is truly aligned and synched with the audio you’d be hearing at the theater,” Chardavoine explains.

Stargate Studios

Stargate Studios began a cloud workflow about two decades ago – years before it became a recognizable industry practice. Thirty years ago, Sam Nicholson, ASC, founded Stargate in Los Angeles, later opening a second site in Vancouver, quickly followed by multiple international studios. To facilitate work between the sites, he established a cross-facility transfer system — in essence, a private cloud network. Expansion continued as he built the studio into an international virtual production company with “brick and mortar” locations in seven locales around the world, in addition to four virtual sites.

Sam Nicholson

Prior to COVID-19, only about half the staff in the LA studio had been working on location. Then the virus hit and forced his employees into a remote work situation. “The VFX industry has proven to be remarkably resilient in the face of this crisis; we all went home and logged in to our computers, and it was business as usual,” Nicholson says.

While some facilities turn to public cloud services, especially for overflow situations when burst rendering is needed, such an option is expensive. Instead, Stargate uses its own machines across its various sites, configured in a private cloud, to render locally. “All the locations are connected together and trading horsepower, artists and software; we float licenses across the facilities. We just send instructions out and then render locally,” he explains. “We can combine all the horsepower of five or six locations into essentially one private cloud. That said, jobs will overflow sometimes, and then we turn to Amazon and use AWS to get, say, another 1,000 machines online for a week or however long we need. That kind of business model is much more efficient, and it’s much more flexible.”
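As a rough illustration of the overflow model Nicholson describes, here is a toy Python sketch that fills idle capacity across connected facilities first and bursts the remainder to rented cloud machines. The site names, slot counts and print statements are hypothetical stand-ins for a real farm scheduler and AWS provisioning, not Stargate’s actual tooling.

```python
# Toy sketch of "render locally first, burst to the cloud on overflow."
# Site names, capacities and the print calls stand in for a real farm
# scheduler and AWS provisioning; none of this is Stargate's actual code.
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    free_slots: int  # idle render nodes at this location

def dispatch(frames, sites):
    queue = list(frames)
    # Fill idle capacity across the connected facilities first
    for site in sites:
        local, queue = queue[:site.free_slots], queue[site.free_slots:]
        for frame in local:
            print(f"frame {frame:4d} -> {site.name}")
    # Whatever is left overflows to public-cloud burst instances
    for frame in queue:
        print(f"frame {frame:4d} -> AWS burst instance")

dispatch(range(1, 1001), [Site("LA", 400), Site("Toronto", 250), Site("Malta", 150)])
```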

The private cloud not only supports cross-facility rendering but also, as Nicholson points out, cross-facility software and artist availability, as the manpower itself and the tools and hardware are spread out. This eliminates the situation of having some artists in a location without any work while others struggle to finish jobs. It also gives Stargate a presence in various local markets and enables cross-facility education, what Nicholson calls the “Stargate Exchange.”

Stargate used its ThruView system to play back a following train car in-camera on the set of Run.

While Stargate had been enjoying the benefits of this private cloud network for quite some time, when COVID-19 hit and forced a remote work situation, the studio didn’t miss a step. “I told everybody to take a machine (each artist has three on average), go home, and we’d VPN all the computers together,” says Nicholson. “So now, instead of having six basic locations, or even 10, that employees are logging in from, we are online with 200 individuals logging into our primary facilities in North America, Latin America, Europe and Asia.”

The process to make that happen was practically seamless. As it turns out, Stargate had been VPN’ing artists into the network for some time, particularly matte painters, who typically have longer lead times and whose work involves much smaller files. “It’s so easy to remote in a matte painter and relatively easy to remote in 3D [artists]. It’s somewhat complex to remote 2D compositing, though,” says Nicholson. “But, if you start with matte painters, you get used to it, and then you get used to 3D. Eventually you say, ‘We can do this with all our 2D artists.’”

Shooting in-camera VFX for Run’s train scenes.

In fact, Nicholson believes it was a natural progression to remote in the staff; COVID-19 just made it happen sooner. Overnight, actually. “No one went into the facility. The power was left on and the lights are off, but all the machines stay active globally 24/7,” he says. “All the horsepower stays online, and people just remote in. It turns out to be, in some ways, more successful; I am able to communicate with my artists online from my home studio, and I can reach anyone any time of the day. It’s faster than walking down the hall and trying to find someone. And you can have three, four, five conversations going at once, which is a lot more difficult to do in person because you can get pigeon-holed into an office or a discussion. Working remotely, there are fewer distractions. Our artists, producers and staff can manage their time more effectively, which translates to being happier and more productive.”

HBO’s Run

Stargate uses industry-leading software, most of which is cloned across the locations, because the software configuration across the network has to be identical for the setup to work. “I can’t send you a file if you don’t have the software to run it,” Nicholson points out, adding that the machines at each location auto-reboot and update nightly.

Although the company uses Amazon AWS only as needed for burst rendering, it is employing that public cloud for data transfer and storage. In Los Angeles, Stargate keeps about 3PB of storage online to handle its extensive digital library of virtual locations, what it calls the Stargate Virtual Backlot.

“The concept of virtual production is something we’ve been working on for many years with virtualized locations,” says Nicholson. “We extensively cover a location with multiple cameras, wide shots, medium shots, close-ups, LIDAR and photogrammetry. This requires a tremendous amount of data, since you are shooting 360 with as many as eight cameras at once in 4K or 8K. You can spin up a lot of data quickly. For a typical shoot now, we come back with 200TB of data. We need to be able to store all that material and access it globally, and the best way to do that is the AWS cloud.”
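For a sense of what parking a capture in object storage looks like, here is a minimal boto3 sketch that uploads a capture directory to S3 under a per-shoot prefix. The bucket name, paths and naming scheme are hypothetical; Stargate’s actual Virtual Backlot tooling is not public.

```python
# Minimal sketch: push a location-capture directory to S3 with boto3.
# Bucket name, paths and prefix scheme are hypothetical examples.
import boto3
from pathlib import Path

s3 = boto3.client("s3")
BUCKET = "virtual-backlot-example"

def upload_capture(local_dir, shoot_id):
    """Upload every file under local_dir to s3://BUCKET/shoot_id/...
    upload_file transparently switches to multipart for large files."""
    root = Path(local_dir)
    for path in root.rglob("*"):
        if path.is_file():
            key = f"{shoot_id}/{path.relative_to(root).as_posix()}"
            s3.upload_file(str(path), BUCKET, key)

upload_capture("/mnt/capture/stage_a", "location42_20200601")
```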

As Nicholson points out, Stargate’s success lies in its ability to reinvent itself, which it has done a number of times: first stepping from film into the digital realm, and now stepping from digital into the virtual realm, using the cloud and virtual sets and embracing cutting-edge technologies, including its new ThruView process, which eliminates the need for greenscreen.

Helping The Resident go zero gravity.

It all starts with a stable, consistent workflow, which Stargate found with television productions. When the virus hit, Stargate LA and Toronto were in the middle of delivery on a series with HBO called Run and needed to deliver 10 episodes (3,000 ThruView shots) as well as hundreds of traditional 2D and 3D visual effects. According to Nicholson, HBO had called and asked if the work-at-home order would affect the delivery date, to which he responded, “No way, we’re going to hit all our deadlines.” And they did.

During the shutdown, Stargate Atlanta and Stargate Malta were working on other projects, including Fox’s medical drama The Resident, which they were able to finish and deliver on time. Now, the delivery schedule has slowed somewhat, giving the company time to accelerate its R&D and tune portions of its pipeline to get ready for what Nicholson believes is going to be a tremendous upswing when production begins again in earnest.


Karen Moltenbrey is a veteran writer covering visual effects and post production.

Quick Chat: Compositor Jen Howard on her move from films to spots

By Randi Altman

Industry veteran Jen Howard started her career as a model maker before transitioning to a career as a compositor. After spending the last 20 years at ILM working on features — including Avatar, Pirates of the Caribbean: At World’s End, Transformers, Hulk and Jurassic World — she recently made the move to Carbon Chicago to work on commercials.

While Howard’s official title is Nuke compositor, she has been credited on films as digital artist, lead digital artist, sequence lead, compositing lead and sequence supervisor. We recently reached out to her to talk about her transition, her past and present. Enjoy!

While you specialize in Nuke, your official title is compositor. What does that title entail?
Regardless of what software package one uses, being a compositor entails marrying together many pieces of separately shot footage so that they appear to be part of a single image sequence captured at one time.

For realistic-style productions, these pieces of photography can include live-action plates, rendered creatures, rendered simulations (like smoke or water), actors shot against greenscreen, miniatures, explosions or other practical elements shot on a stage. For more stylistic productions that list might also include hand-drawn, stop motion or rendered animations.

Sounds fun as well as challenging.
Yes, compositing presents both technical and aesthetic challenges, and this is what I love about it. Each shot is both a math problem and an art problem.

Technically, you need to be able to process the image data in the gentlest way possible while achieving a seamless blend of all your elements. No matte lines, no layering mistakes, solid tracking, proper defocus and depth hazing. Whether or not you’ve done this correctly is easy to see by looking at the final image — there is largely a right and a wrong result. The tracked-in element is either sliding, or it isn’t. However, whether you’ve made the right aesthetic decisions is a trickier question.
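The “right result” of layering has a precise mathematical form: the premultiplied-alpha “over” operation that compositing packages such as Nuke implement in their merge nodes. The numpy sketch below is a minimal illustration of that math, with made-up RGBA arrays (not Howard’s actual node graphs); get the alpha handling wrong and you see exactly the matte lines and layering mistakes described above.

```python
# Minimal numpy sketch of the premultiplied-alpha "over" operation,
# the math behind layering elements in a compositing package.
# The RGBA arrays here are made-up examples.
import numpy as np

def over(fg, bg):
    """Composite premultiplied RGBA foreground over background:
    out = fg + (1 - fg_alpha) * bg."""
    alpha = fg[..., 3:4]  # keep the channel axis for broadcasting
    return fg + (1.0 - alpha) * bg

fg = np.zeros((4, 4, 4))
fg[...] = [0.2, 0.1, 0.0, 0.5]  # translucent element, premultiplied
bg = np.ones((4, 4, 4))         # solid white plate
print(over(fg, bg)[0, 0])       # -> [0.7 0.6 0.5 1.0]
```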

The less quantifiable goal for all the artists on a shot is to manifest the director’s vision … to take the image in their head and put it on the screen. This requires a lot of verbal discussion about visuals, which is tricky. Sometimes there is production art, but often there isn’t. So what does it mean when the director says, “Make it more mysterious”? Or what if they don’t even know what they want? What if they do, but the people between the director and the artists can’t communicate that vision downstream clearly?

When you build an image from scratch, almost everything can be in play — composition, contrast, saturation, depth of field, the direction and falloff of lighting, the placement of elements to frame the action and direct the eye. It is a compositor’s job to interpret the verbal input they’ve received and know what changes to make to each of these parameters to deliver the visual look and feel the director is after and to tell their story.

What would surprise people the most about what falls under that title?
I think people are still surprised at how many aspects of an effects shot are in a compositor’s control, even today when folks are pretty tech-savvy. Between the person doing the lighting and rendering and the compositor, you can create any look. And they’re surprised at the amount of “hand work” it entails, as they imagine the process to be more automated than it is.

How long have you been working in visual effects?
During college, I became a production assistant for master model maker Greg Jein, and he taught me that craft. Interesting fact — the first lesson was how to get your fingers apart after you’ve glued them together. I worked building models until about 1997, then crossed over to the digital side. So that’s about 30 years, and it’s a good thing I’m sitting down as I say that.

Kong

How has the industry changed in the time you’ve been working? What’s been good? What’s been bad?
When I was a model maker, most of that work was happening in the LA area. The VFX houses with their own model shops and stages and the stand-alone model shops were there. There was also ILM in the Bay Area. These places drew on local talent. They had a regular pool of local freelancers who knew each other, and a lot of them fell into the field by accident.

I worked with welders, machinists and sci-fi geeks good at bashing model kits who ended up working at these places because someone there knew them, and the company needed their skill set. Then all of a sudden, they were in show business. There was a family feel to most shops, and it was always fun. Some shops were union, so the schedules for projects at those places mostly fit the scope of work, and late nights were rare. The digital world was the same for a long time.

Model shops mostly went away, and as everyone knows, most digital feature effects are now done overseas, with some tasks like roto and matchmoving entirely farmed out to separate smaller companies. Crews are from all over the globe, and I’d hazard a guess that those folks got into the industry on purpose because now it is a thing.

What we’ve gained with this new paradigm is a more diverse pool of new talent who can find their way into the industry pretty much no matter where they’re from. That makes me happy because I feel strongly that everyone who has a love for this kind of work should get a shot at trying it. They bring fresh vision and new ideas to the industry and an appetite for pushing the technology further.

What’s lost is the shorthand and efficiency you get from a crew that’s worked together for a long time. They’re older and have made a lot of the mistakes already and can cut to the chase quickly. They make great mentors for the younger artists when tapped for that job, but I don’t feel that there’s been the amount of knowledge transfer there could have been — in either direction. Sometimes an “us versus them” dynamic emerges, which is really unfortunate.

Another change is the increasingly compressed schedule of feature production, which creates long hours and weekend work. This is hard on everyone, both physically and emotionally. The stress can be intense and translates into work injuries and relationship tension and is extremely hard on families with children. Studios have been pushing for these shorter schedules and cheaper prices. VFX work has been moved to countries that offer tax breaks or a generally cheaper labor pool. So quality now takes a back seat two ways: There isn’t enough time, and sometimes there isn’t enough experience.

You recently made the move to Chicago and spot work after years at ILM working on features. Can you talk about the differences in workflows?
The powerful role of advertising agencies in commercial work really surprised me. In film, the director is king, and they’re there all the way through the project, making every creative decision. In advertising, it seems the director shoots and moves on, and the agency takes up the direction of the creative vision in post production.

The shorter timeline for spot work translates into less time for 3D artists to iterate and finesse their renders, which are time-intensive to run, and so the flexibility and faster turnaround of comp means more comp work on renders, sooner. In features, 3D artists ideally have the time to get their render to a place that they’re mostly happy with before comp steps in, and the comp touch can be pretty light. (Of course, feature timelines are becoming more compressed, so that’s not always true now.)

Did a particular film inspire you along this path?
Two words: Star Wars. (Not unusual, I know.) Also, when I was older, Japanese anime. Star Blazers (Yamato), specifically.

Growing up, I watched my mom struggle to make enough money to support us. She had to look for opportunity everywhere, taking whatever job was available. Mostly she didn’t particularly enjoy her jobs, and I noticed the price she paid for that – spending so many hours with people she didn’t enjoy, doing work that didn’t resonate for her. So it became very important for me to find work that I loved. It was a very conscious goal.

You mentioned school earlier. Was that film school?
Yes, I went to CalArts in Valencia, California, just outside of LA. I studied animation and motion graphics, but I discovered pretty quickly that I had no talent for animation. However, I became fascinated with the school’s optical printer and motion control camera, and I played a lot with those. The optical printer is the photochemical way of compositing that was used before digital compositing was developed. Using those analog machines helped me understand digital compositing down the road.

Porsche’s The Heist

Can you name some recent projects you’ve worked on?
My last ILM project was the new Star Wars ride that opened recently at Disneyland, called Rise of the Resistance. Other recent projects include Solo: A Star Wars Story, Transformers: The Last Knight, Kong: Skull Island and Bumblebee.

While at Carbon, I worked on a spot for Porsche called The Heist and a Corona campaign.

What projects are you most proud of?
For model making, I’m proud of the work I did on Judge Dredd, which came out in 1995. I got to spend several months just detailing out a miniature city with little greebles — making up futuristic-looking antennae and spires to give the city more scale.

Batman

On the digital side I’m really proud of the look we developed for Rango, ILM’s one and only animated feature, directed by Gore Verbinski. We brought a lot of realistic cinematic zing to that world using some practical elements in combination with rendered layers, and we built comp into the process deliberately so we could dial to our hearts’ content.

I’m also extremely proud of the first three Pirates movies, in which we did something of the opposite — brought a fantasy world to reality. The pirate characters are extreme in their design, and it was especially rewarding to see them come to life.

Where do you find inspiration now?
Chicago is amazing. I’m a fan of architecture, and I have to say, this city knocks my socks off in that department. It is such a pleasure to live somewhere where so much thought has gone into the built environment. The Art Institute is constantly inspirational, and so is my backyard, which is full of bunnies and squirrels and my wife and our two kids.

What do you do to destress from it all, especially these days?
Well, we don’t really leave the house, so right now I mostly hide in the bathroom.

Any tips for folks just starting out?
– Do whatever you’re doing now to the best of your ability, even if it isn’t the job you ultimately want or even the field you want to be in. Relationships are key, and it can be surprising how someone you worked with 10 years ago can pop up suddenly in a position to help you out later on.

– Also, don’t be scared of software. Your most important asset is your ability to know what an image needs. You can learn any software.

– Start saving for retirement now.

As for me, I’m glad I didn’t know anything and that there was no internet or social media of significance until after I finished school. It meant I had to look inward to figure out what felt right, and that really worked for me. I wouldn’t want to spoil that.

ILM’s virtual production platform used on The Mandalorian

To bring The Mandalorian to life, Industrial Light & Magic (ILM) and Epic Games — along with production technology partners Fuse, Lux Machina, Profile Studios, Nvidia and ARRI — have introduced a new way to shoot VFX-heavy projects in collaboration with Jon Favreau’s Golem Creations.

The new virtual production workflow allows filmmakers to capture a significant amount of complex visual effects shots in-camera using realtime game engine technology (Epic’s Unreal Engine) and LED screens to represent dynamic photoreal digital landscapes and sets with creative flexibility previously unimaginable.

Also part of the news, ILM has made its new end-to-end virtual production solution, ILM StageCraft, available for use by filmmakers, agencies and showrunners worldwide.

Over 50 percent of The Mandalorian Season 1 was filmed using this new methodology, eliminating the need for location shoots entirely. Instead, the actors performed within an immersive, massive virtual set: a 20-foot-high, 270-degree semicircular LED video wall and ceiling surrounding a 75-foot-diameter performance space, where practical set pieces were combined with digital extensions on the screens.

Digital 3D environments created by ILM played back interactively on the LED walls, edited in realtime during the shoot, which allowed for pixel-accurate tracking and perspective-correct 3D imagery rendered at high resolution via systems powered by Nvidia GPUs.

L-R: Jon Favreau and Richard Bluff

The environments were lit and rendered from the perspective of the camera to provide parallax in real time, as if the camera were really capturing the physical environment with accurate interactive light on the actors and practical sets, giving showrunner Favreau; executive producer/director Dave Filoni; visual effects supervisor Richard Bluff; cinematographers Greig Fraser and Barry “Baz” Idoine and the episodic directors the ability to make concrete creative choices for visual effects-driven work during photography and achieve realtime in-camera composites on set.
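The core of that perspective-correct rendering is simple to state: what the wall displays at any point is whatever the virtual scene contains along the ray from the tracked camera through that point. Here is a deliberately simplified toy sketch of that parallax relationship, assuming a flat wall and a single virtual point; the real system renders full scenes through engines such as Unreal and ILM’s Helios.

```python
# Toy sketch of camera-driven parallax on an LED wall: the pixel lit at a
# wall point corresponds to the virtual scene along the ray from the
# tracked camera through that point. Geometry is deliberately simplified
# to a flat wall at z = 0 and one virtual point behind it.
import numpy as np

def wall_projection(camera_pos, scene_point, wall_z=0.0):
    """Intersect the camera -> scene_point ray with the wall plane z = wall_z,
    returning the wall-space (x, y) where the point must be drawn."""
    cam = np.asarray(camera_pos, dtype=float)
    pt = np.asarray(scene_point, dtype=float)
    t = (wall_z - cam[2]) / (pt[2] - cam[2])  # ray parameter at the wall
    return (cam + t * (pt - cam))[:2]

mountain = [0.0, 3.0, -50.0]                       # virtual set piece "behind" the wall
print(wall_projection([0.0, 1.7, 4.0], mountain))  # camera at position A
print(wall_projection([1.5, 1.7, 4.0], mountain))  # camera moves right: the drawn
                                                   # point shifts, i.e. parallax
```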

The technology and workflow required to make in-camera compositing and effects practical for on-set use combined the ingenuity of all the partners involved.

“We’ve been experimenting with these technologies on my past projects and were finally able to bring a group together with different perspectives to synergize film and gaming advances and test the limits of realtime, in-camera rendering,” explains Favreau. “We are proud of what was achieved and feel that the system we built was the most efficient way to bring The Mandalorian to life.”

“Merging our efforts in the space with what Jon Favreau has been working toward using virtual reality and game engine technology in his filmmaking finally gave us the chance to execute the vision,” says Rob Bredow, executive creative director and head of ILM. “StageCraft has grown out of the culmination of over a decade of innovation in the virtual production space at ILM. Seeing our digital sets fully integrated, in real time on stage, providing the kind of in-camera shots we’ve always dreamed of while also providing the majority of the lighting was really a dream come true.”

Bluff adds, “Working with Kim Libreri and his Unreal team, Golem Creations, and the ILM StageCraft team has opened new avenues to both the filmmakers and my fellow key creatives on The Mandalorian, allowing us to shoot principal photography on photoreal, virtual sets that are indistinguishable from their physical counterparts while incorporating physical set pieces and props as needed for interaction. It’s truly a game-changer.”

ILM StageCraft’s production tools combine traditional filmmaking equipment and methodologies with all of the advantages of a fully digital workflow. With ILM StageCraft, a production can acquire many in-camera finals, giving filmmakers immediate and complete creative control over work typically handed off and reinterpreted in post. That improves the quality of visual effects shots with perfectly integrated elements and reduces visual effects requirements in post, a major benefit considering today’s compressed schedules.

ILM’s Pablo Helman on The Irishman‘s visual effects

By Karen Moltenbrey

When a film stars Robert De Niro, Joe Pesci and Al Pacino, well, expectations are high. These are no ordinary actors, and Martin Scorsese is no ordinary director. These are movie legends. And their latest project, Netflix’s The Irishman, is no ordinary film. It features cutting-edge de-aging technology from visual effects studio Industrial Light & Magic (ILM) and earned the film’s VFX supervisor, Pablo Helman, an Oscar nomination.

The Irishman, adapted from the book “I Heard You Paint Houses,” tells the story of an elderly Frank “The Irishman” Sheeran (De Niro), whose life is nearing the end, as he looks back on his earlier years as a truck driver-turned-mob hitman for Russell Bufalino (Pesci) and family. While reminiscing, he recalls the role he played in the disappearance of his longtime friend, Jimmy Hoffa (Al Pacino), former president of the Teamsters, who famously disappeared in 1975 at the age of 62, and whose body has never been found.

The film contains 1,750 visual effects shots, most of which involve the de-aging of the three actors. In the film, the actors are depicted at various stages of their lives — mostly younger than their present age. Pacino is the least aged of the three actors, since he enters the story about a third of the way through — from the 1940s to his disappearance three decades later. He was 78 at the time of filming, and he plays Hoffa at various ages, from age 44 to 62. De Niro, who was 76 at the time of filming, plays Sheeran at certain points from age 20 to 80. Pesci plays Bufalino between age 53 and 83.

For the significantly older Sheeran, during his introspection, makeup was used. However, making the younger versions of all three actors was much more difficult. Current technology makes it possible to create believable younger digital doubles, but it typically requires actors to perform alone on a soundstage wearing facial markers and helmet cameras, or requires artists to enhance or create performances with CG animation. That simply would not do for this film. Neither the actors nor Scorsese wanted the tech to interfere with the acting process in any way. Recreating their performances was also off the table.

“They wanted a technology that was non-intrusive and one that would be completely separate from the performances. They didn’t want markers on their faces, they did not want to wear helmet cams and they did not want to wear the gray [markered] pajamas that we normally use,” says VFX supervisor Helman. “They also wanted to be on set with theatrical lighting, and there wasn’t going to be any kind of re-shoots of performances outside the set.”

In a nutshell, ILM needed a markerless approach that occurred on-set during filming. To this end, ILM spent two years developing Flux, a new camera system and software, whereby a three-camera rig would extract performance data from lighting and textures captured on set and translate that to 3D computer-generated versions of the actors’ younger selves.

The camera rig was developed in collaboration with The Irishman’s DP, Rodrigo Prieto, and camera maker ARRI. It included two high-resolution (3.8K) Alexa Mini witness cameras that were modified with infrared rings; the two cameras were attached to and synched up with the primary sensor camera (the director’s Red Helium 8K camera). The infrared light from the two cameras was necessary to help neutralize any shadows on the actors’ faces, since Flux does not handle shadows well, yet remained “unseen” by the production camera.

Flux, meanwhile, took that camera information and translated it into a deformable geometry mesh. “Flux takes that information from the three cameras and compares it to the lighting on set, deforms the geometry and changes the geometry and the shape of the actors on a frame-by-frame basis,” says Helman.
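
ILM has not published Flux's internals, so the following is a generic illustration only: two calibrated witness cameras can recover a 3D facial point by standard two-view triangulation (the direct linear transform), which a system could then use to drive per-frame mesh deformation. The projection matrices and pixel coordinates below are placeholders, not values from the actual rig.

```python
# Generic two-view (DLT) triangulation sketch; not ILM's Flux algorithm.
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from pixel observations x1, x2 seen by two cameras
    with 3x4 projection matrices P1, P2 (direct linear transform)."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)        # least-squares null vector of A
    X = vt[-1]
    return X[:3] / X[3]                # de-homogenize to (x, y, z)

# Placeholder rig: one camera at the origin, one shifted 0.2m to the right.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])
point = triangulate(P1, P2, x1=(0.1, 0.05), x2=(0.05, 0.05))  # -> [0.4 0.2 4.]
```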

In fact, ILM continued to develop the software as it was working on the film. “It’s kind of like running the Grand Prix while you’re building the Ferrari,” Helman adds. “Then, you get better and better, and faster and faster, and your software gets better, and you are solving problems and learning from the software. Yes, it took a long time to do, but we knew we had time to do it and make it work.”

Pablo Helman (right) on The Irishman set.

At the beginning of the project, prior to the filming, the actors were digitally scanned performing a range of facial movements using ILM’s Medusa system, as well as on a light stage, which captured texture info under different lighting conditions. All that data was then used to create a 3D contemporary digital double of each of the actors. The models were sculpted in Autodesk’s Maya and with proprietary tools running on ILM’s Zeno platform.

ILM applied the 3D models to the exact performance data of each actor captured on set with the special camera rig, so the physical performances were now digital. No keyframe animation was used. However, the characters were still contemporary to the actors’ ages.

As Helman explains, after the performance, the footage was returned to ILM, where an intense matchmove was done of the actors’ bodies and heads. “The first thing that got matchmoved was the three cameras that were documenting what the actor was doing in the performance, and then we matchmoved the lighting instruments that were lighting the actor because Flux needs that lighting information in order to work,” he says.

Helman likens Flux to a black box full of little drawers where various aspects are inserted, like the layout, the matchimation, the lighting information and so forth, and it combines all that information to come up with the geometry for the digital double.

The actual de-aging occurred in modeling, using a combination of libraries that were created for each actor and connected to and referenced by Flux. Later, modelers created the age variations, starting with the youngest version of each person. Variants were then generated gradually, using a slider to move through life’s timeline. This process was labor-intensive, as artists also had to erase the effects of time, such as wrinkles and age spots.
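
As a rough sketch of that slider idea (not ILM's pipeline): with a library of topology-matched head meshes sculpted at key ages, an arbitrary age can be produced as a piecewise-linear blend of the two nearest variants. The ages, vertex counts and arrays here are hypothetical stand-ins.

```python
# Hypothetical age-variant blending sketch; arrays are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(0)
base = rng.standard_normal((5000, 3))          # stand-in for a neutral head mesh
AGE_LIBRARY = {24: 0.98 * base, 45: base, 66: 1.02 * base, 80: 1.04 * base}

def mesh_at_age(age, library):
    """Blend the two age variants bracketing `age` with a linear weight."""
    ages = sorted(library)
    if age <= ages[0]:
        return library[ages[0]]
    if age >= ages[-1]:
        return library[ages[-1]]
    hi = next(a for a in ages if a >= age)     # nearest sculpted age above
    lo = ages[ages.index(hi) - 1]              # nearest sculpted age below
    w = (age - lo) / (hi - lo)
    return (1.0 - w) * library[lo] + w * library[hi]

verts = mesh_at_age(36, AGE_LIBRARY)           # e.g., a character in his mid-30s
```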

Because The Irishman is not an action movie, creating motion for decades-younger versions of the characters was not a major issue. However, a motion analyst was on set to work with the actors as they played the younger versions of their characters. Also, some visual effects work helped thin out the younger characters.

Helman points out that Scorsese stressed that he did not want to see a younger version of the actors playing roles from the past; he wanted to see younger versions of these particular characters. “He did not want to rewind the clock and see Robert De Niro as Jimmy Conway in 1990’s Goodfellas. He wanted to see De Niro as a 30-year-younger Frank Sheeran,” he explains.

When asked which actor posed the most difficulty to de-age, Helman explains that once you crack the code of capturing the performance and then retargeting the performance to a younger variation of the character, there’s little difference. Nevertheless, De Niro had the most screen time and the widest age range.

Performance capture began about 15 years ago, and Helman sees this achievement as a natural evolution of the technology. “Eventually those [facial] markers had to go away because for actors, that’s a very interesting way to work, if you really think about it. They have to try to ignore the markers and not be distracted by all the other intrusive stuff going on,” Helman says. “That time is now gone. If you let the actors do what they do, the performances will be so much better and the shots will look so much better because there is eye contact and context with another actor.”

While this technology is a quantum leap forward, there are still improvements to be made. The camera rig needs to get smaller and the software faster — and ILM is working on both aspects, Helman says. Nevertheless, the accomplishment made here is impressive and groundbreaking — the first markerless system that captures performance on set with theatrical lighting, thanks to more than 500 artists working around the world to make this happen. As a result, it opens up the door for more storytelling and acting options — not only for de-aging, but for other types of characters too.

Commenting on his Oscar nomination, Helman said, “It was an incredible, surreal experience to work with Scorsese and the actors, De Niro, Pacino and Pesci, on this movie. We are so grateful for the trust and support we got from the producers and from Netflix, and the talent and dedication of our team. We’re honored to be recognized by our colleagues with this nomination.”


Karen Moltenbrey is a veteran writer, covering visual effects and post production.

The Irishman editor Thelma Schoonmaker

By Iain Blair

Editor Thelma Schoonmaker is a three-time Academy Award winner who has worked alongside filmmaker Martin Scorsese for almost 50 years. Simply put, Schoonmaker has been Scorsese’s go-to editor and key collaborator over the course of some 25 films, winning Oscars for Raging Bull, The Aviator and The Departed. The 79-year-old also received a career achievement award from the American Cinema Editors (ACE).

Thelma Schoonmaker

Schoonmaker cut Scorsese’s first feature, 1967’s Who’s That Knocking at My Door, and since 1980’s Raging Bull has worked on all of his features, receiving a number of Oscar nominations along the way. There are too many to name, but some highlights include The King of Comedy, After Hours, The Color of Money, The Last Temptation of Christ, Goodfellas, Casino and Hugo.

Now Scorsese and Schoonmaker have once again turned their attention to the mob with The Irishman, which was nominated for 10 Academy Awards, including one for Schoonmaker’s editing work. Starring Robert De Niro, Al Pacino and Joe Pesci, it’s an epic saga that runs 3.5 hours and focuses on organized crime in post-war America. It’s told through the eyes of World War II veteran Frank Sheeran (De Niro). He’s a hustler and hitman who worked alongside some of the most notorious figures of the 20th century. Spanning decades, the film chronicles one of the greatest unsolved mysteries in American history, the disappearance of legendary union boss Jimmy Hoffa. It also offers a monumental journey through the hidden corridors of organized crime — its inner workings, rivalries and connections to mainstream politics.

But there’s a twist to this latest mob drama that Scorsese directed for Netflix from a screenplay by Steven Zaillian. Gone are the flashy wise guys and the glamour of Goodfellas and Casino. Instead, the film examines the mundane nature of mob killings and the sad price any survivors pay in the end.

Here, Schoonmaker — who in addition to her film editing works to promote the films and writings of her late husband, famed British director Michael Powell (The Red Shoes, Black Narcissus) — talks about cutting The Irishman, working with Scorsese and their long and storied collaboration.

The Irishman must have been very challenging to cut, just in terms of its 3.5-hour length?
Actually, it wasn’t very challenging to cut. It came together much more quickly than some of our other films because Scorsese and Steve Zaillian had created a very strong structure. I think some critics think I came up with this structure, but it was already there in the script. We didn’t have to restructure, which we do sometimes, and only dropped a few minor scenes.

Did you stay in New York cutting while he shot on location, or did you visit the set?
Almost everything in The Irishman was shot in or around New York. The production was moving all over the place, so I never got to the set. I couldn’t afford the time.

When I last interviewed Marty, he told me that editing and post are his favorite parts of filmmaking. When the two of you sit down to edit, is it like having two editors in the room rather than a director and his editor?
Marty’s favorite part of filmmaking is editing, and he directs the editing after he finishes shooting. I do an assembly based on what he tells me in dailies and what I feel, and then we do all the rest of the editing together.

Could you give us some sense of how that collaboration works?
We’ve worked together for almost 50 years, and it’s a wonderful collaboration. He taught me how to edit at first, but then gradually it has become more of a collaboration. The best thing is that we both work for what is best for the film — it never becomes an ego battle.

How long did it take to edit the film, and what were the main challenges?
We edited for a year, and the footage was incredibly rich. The only challenge was to make sure we chose the best of it and took advantage of the wonderful improvisations the actors gave us. It was a complete joy for Scorsese and me to edit this film. After we locked the film, we turned it over to ILM so they could do the “youthifying” of the actors. That took about seven months.

Could you talk about finding the overall structure and considerable use of flashbacks to tell the story?
Scorsese had such a strong concept for this film — and one of his most important ideas was to not explain too much. He respects the audience’s ability to figure things out themselves without pummeling them with facts. It was a bold choice and I was worried about it, frankly, at first. But he was absolutely right. He didn’t want the film to feel like a documentary. He wanted to use brushstrokes of history just to show how they affected the characters. The way the characters were developed in the film, particularly Frank Sheeran, the De Niro character, was what was most important.

Could you talk about the pacing, and how you and Marty kept its momentum going?
Scorsese was determined that The Irishman would have a slower pace than many films today. He gave the film a deceptive simplicity. Interestingly, our first audiences had no problem with this — they became gripped by the characters and kept saying they didn’t mind the length and loved the pace. Many of them said they wanted to see the film again right away.

There are several slo-mo sequences. Could you talk about why you used them and to what effect?
The Phantom-camera slow-motion wedding sequence (250fps) near the end of the film was done to give the feeling of a funeral instead of a wedding, because the De Niro character has just been forced to do the worst thing he will ever do in his life. Scorsese wanted to hold on De Niro’s face and evoke what he is feeling and to study the Italian-American faces of the mobsters surrounding him. Instead of the joy a wedding is supposed to bring, there is a deep feeling of grief.

What was the most difficult sequence to cut and why?
The montage where De Niro repeatedly throws guns into the river after he has killed someone took some time to get right. It was very normal at first — and then we started violating the structure and jump cutting and shortening until we got the right feeling. It was fun.

There’s been a lot of talk about the digital de-aging process. How did it impact the edit?
Pablo Helman at ILM came up with the new de-aging process, and it works incredibly well. He would send shots and we would evaluate them and sometimes ask for changes — usually to be sure that we kept the amazing performances of De Niro, Pacino and Pesci intact. Sometimes we would put back in a few wrinkles if it meant we could keep the subtlety of De Niro’s acting, for example. Scorsese was adamant that he didn’t want to have younger actors play the three main parts in the beginning of the film. So he really wanted this “youthifying” process to work — and it does!

There’s a lot of graphic violence. How do you feel about that in the film?
Scorsese made the violence very quick in The Irishman and shot it in a deceptively simple way. There aren’t any complicated camera moves and flashy editing. Sometimes the violence takes place after a simple pan, when you least expect it because of the blandness of the setting. He wanted to show the banality of violence in the mob — that it is a job, and if you do it well, you get rewarded. There’s no morality involved.

Last time we talked, you were using the Lightworks editing system. Do you still use Lightworks, and if so, can you talk about the system’s advantages for you?
I use Lightworks because the editing surface is still the fastest, most efficient and most intuitive to use. Maintaining sync works differently than in other NLE systems: you don’t correct sync with sync lock. If you go out of sync, Lightworks gives you a red icon showing the number of frames you are out, and you choose where to correct it. Since editors place sound and picture on the timeline themselves, adjusting sync exactly where you want to adjust it is much more efficient.
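
For readers unfamiliar with that model, here is a toy illustration, emphatically not Lightworks code: sound and picture are independent timeline items, and the system only reports their drift in frames, leaving the editor to decide where to resolve it.

```python
# Toy sketch of sync-as-reported-drift; all names and fields are hypothetical.
from dataclasses import dataclass

@dataclass
class Clip:
    track: str
    timeline_start: int   # frame where the clip sits on the timeline
    source_start: int     # frame offset into the source media

def sync_offset(picture: Clip, sound: Clip) -> int:
    """Drift in frames between linked clips; positive means the sound is late."""
    return ((sound.timeline_start - sound.source_start)
            - (picture.timeline_start - picture.source_start))

pic = Clip("V1", timeline_start=1000, source_start=0)
aud = Clip("A1", timeline_start=1003, source_start=0)
drift = sync_offset(pic, aud)   # 3 -> the "red icon" would read 3 frames out
```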

You’ve been Marty’s editor since his very first film — a 50-year collaboration. What’s the secret?
I think Scorsese felt when he first met me that I would do what was right for his films — that there wouldn’t be ego battles. We work together extremely well. That’s all there is to it. There couldn’t be a better job.

Do you ever have strong disagreements about the editing?
If we do have disagreements, which is very rare, they are never strong. He is very open to experimentation. Sometimes we will screen two ways and see what the audience says. But that is very rare.

What’s next?
A movie about the Osage Nation in Oklahoma, based on the book “Killers of the Flower Moon” by David Grann.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

VFX vet Rob Bredow takes helm at ILM, Gretchen Libby upped to VP

Industrial Light & Magic (ILM) has named Rob Bredow as SVP, executive creative director and head of ILM, which is a division of Lucasfilm. Bredow will be in charge of all four of ILM’s studios worldwide — in San Francisco, London, Singapore and Vancouver — and will report to Lucasfilm GM Lynwen Brennan.

In addition, it was announced that Gretchen Libby has been promoted to VP, marketing and production. She will report to Bredow. More on Libby in a bit.

Bredow joined ILM as a VFX supervisor in 2014 and shortly thereafter was named VP of new media and head of Lucasfilm’s Advanced Development Group. He helped launch a new division, ILMxLAB, in 2015, combining the talents of Lucasfilm, ILM, and Skywalker Sound to develop, create, and release story-based immersive entertainment.

In 2016, Bredow was promoted to CTO of Lucasfilm, overseeing technical operations and partnerships as well as the company’s technology roadmap. Currently, Bredow is working as the visual effects supervisor and co-producer on Solo: A Star Wars Story, directed by Ron Howard, which releases on May 25, 2018.

Prior to joining ILM, Bredow was the CTO and visual effects supervisor at Sony Pictures Imageworks. He has worked on films such as Independence Day, Godzilla, Stuart Little, Cast Away, Surf’s Up, Cloudy With a Chance of Meatballs and many others.

He is a member of the Academy of Motion Picture Arts and Sciences (Visual Effects Branch) and the AMPAS Scientific and Technical Council and, in 2010, was nominated for a Visual Effects Society Award for Outstanding Effects Animation in an Animated Feature Motion Picture.

Gretchen Libby

Libby started at ILM in 1997 as a production manager. A year later, she was promoted to associate visual effects producer for The Perfect Storm, and two years after that to visual effects producer on Star Wars: Attack of the Clones. In her previous role, Libby had focused on the company’s global expansion, which included opening studios in Singapore, Vancouver and London, and was the key marketing point of contact for ILM’s clients. Libby’s focus will remain on client marketing, overseeing all global production and strategic relationships. Prior to ILM, Libby worked in visual effects film production at Pacific Data Images in Palo Alto, California, and in visual effects commercial production in New York.

Libby is a member of the Producers Guild of America and formerly served on the board of directors of the Visual Effects Society of which she remains a member. She is also a member of Women in Film and has served as a producer on 29 feature films, eight of which received Academy Award nominations for visual effects.

Oscar-winner Jeff White is now CD at ILM Vancouver

Oscar-winning visual effects supervisor Jeff White has been named creative director of Industrial Light & Magic’s Vancouver studio. A 16-year ILM veteran, White will work directly with ILM Vancouver executive in charge Randal Shore.

Recently, the Academy of Motion Picture Arts and Sciences honored White and three colleagues (Jason Smith, Rachel Rose and Mike Jutan) with a Technical Achievement Award for the design, architecture and engineering of ILM’s procedural rigging system, BlockParty. He is also nominated for an Academy Award for Visual Effects for his contribution to Kong: Skull Island.

White joined Industrial Light & Magic in 2002 as a creature technical director, working on a variety of films, including the Academy Award-winning Pirates of the Caribbean: Dead Man’s Chest, as well as War of the Worlds and Star Wars: Episode III: Revenge of the Sith.

In 2012, White served as the ILM VFX supervisor on Marvel’s The Avengers, directed by Joss Whedon, and earned both Oscar and BAFTA nominations for his visual effects work. He also received the Hollywood Film Award for visual effects for the work. White was also a VFX supervisor on Duncan Jones’ 2016 sci-fi offering, Warcraft, based on the well-known video game World of Warcraft by Blizzard Entertainment.

Says White, “Having worked with many of the artists here in Vancouver on a number of films, including Kong: Skull Island, I know firsthand the amazing artistic and technical talent we have to offer and I couldn’t be more excited to share what I know and collaborate with them on all manner of projects.”

Initially conceived as a satellite office when it opened in 2012, ILM’s Vancouver studio became a permanent fixture in the company’s operation in 2014. In 2017, the studio nearly doubled in size, adding a second building adjacent to its original location in the Gastown district. The studio has spearheaded ILM’s work on such films as Valerian and the City of a Thousand Planets, Only the Brave and most recently, Ryan Coogler’s Black Panther and Ava DuVernay’s A Wrinkle in Time.

Sci-Tech Award winners named

The 2018 Sci-Tech Awards (Academy of Motion Picture Arts and Sciences) have been bestowed upon 34 individuals and one company representing 10 scientific and technical achievements. Each recipient will be honored at the annual Scientific and Technical Awards Presentation on February 10 at the Beverly Wilshire in Beverly Hills.

“This year we are happy to honor a very international group of technologists for their innovative and outstanding accomplishments,” says Ray Feeney, Academy Award recipient and chair of the Scientific and Technical Awards Committee. “These individuals have significantly contributed to the ongoing evolution of motion pictures and their efforts continue to empower the creativity of our industry.”

Technical Achievement Award Winners (Academy Certificates)

Honorees: Jason Smith and Jeff White for the original design, and to Rachel Rose and Mike Jutan for the architecture and engineering of the BlockParty procedural rigging system at Industrial Light & Magic.

BlockParty streamlines the rigging process through a comprehensive connection framework, a unique graphical user interface and volumetric rig transfer. This has enabled ILM to build richly detailed and unique creatures while greatly improving artist productivity.

Honorees: Joe Mancewicz, Matt Derksen and Hans Rijpkema for the design, architecture and implementation of the Rhythm & Hues Construction Kit rigging system.

This toolset provides a new approach to character rigging that features topological independence, continuously editable rigs and deformation workflows with shape-preserving surface relaxation, enabling 15 years of improvements to production efficiency and animation quality.

Honorees: Alex Powell for the design and engineering and to Jason Reisig for the interaction design, and to Martin Watt and Alex Wells for the high-performance execution engine of the Premo character animation system at DreamWorks Animation.

Premo enables animators to pose full-resolution characters in representative shot context, significantly increasing their productivity.

Honorees: Rob Jensen for the foundational design and continued development and to Thomas Hahn for the animation toolset and to George ElKoura, Adam Woodbury and Dirk Van Gelder for the high-performance execution engine of the Presto Animation System at Pixar Animation Studios.

Presto allows artists to work interactively in scene context with full-resolution geometric models and sophisticated rig controls, and has significantly increased the productivity of character animators at Pixar.

Scientific and Engineering Award Winners (Academy Plaques)

Honorees: John Coyle, Brad Hurndell, Vikas Sathaye and Shane Buckham for the concept, design, engineering and implementation of the Shotover K1 camera system.

This six-axis stabilized aerial camera mount, with its enhanced ability to frame shots while looking straight down, enables greater creativity while allowing pilots to fly more effectively and safely.

Honorees: Jeff Lait, Mark Tucker, Cristin Barghiel and John Lynch for their contributions to the design and architecture of Side Effects Software’s Houdini visual effects and animation system.

Houdini’s dynamics framework and workflow management tools have helped it become the industry standard for bringing natural phenomena, destruction and other digital effects to the screen.

Honorees: Bill Spitzak and Jonathan Egstad for the visionary design, development and stewardship of Foundry’s Nuke compositing system.

Built for production at Digital Domain, Nuke is used across the motion picture industry, enabling novel and sophisticated workflows at an unprecedented scale.

Honorees: Abigail Brady, Jon Wadelton and Jerry Huxtable for their significant contributions to the architecture and extensibility of Foundry’s Nuke compositing system.

Expanded as a commercial product at The Foundry, Nuke is a comprehensive, versatile and stable system that has established itself as the backbone of compositing and image processing pipelines across the motion picture industry.

Honorees: Leonard Chapman for the overall concept, design and development, to Stanislav Gorbatov for the electronic system design, and to David Gasparian and Souhail Issa for the mechanical design and integration of the Hydrascope telescoping camera crane systems.

With its fully waterproof construction, the Hydrascope has advanced crane technology and versatility by enabling precise long-travel multi-axis camera movement in, out of and through fresh or salt water.

Academy Award of Merit (Oscar statuette)

Honorees: Mark Elendt and Side Effects Software for the creation and development of the Houdini visual effects and animation system.

With more than twenty years of continual innovation, Houdini has delivered the power of procedural methods to visual effects artists, making it the industry standard for bringing natural phenomena, destruction and other digital effects to the screen.

Gordon E. Sawyer Award (Oscar statuette)

Honoree: Jonathan Erland, visual effects technologist

Presented to an individual in the motion picture industry whose technological contributions have brought credit to the industry.

All images courtesy of A.M.P.A.S.

The importance of on-set VFX supervision

By Karen Maierhofer

Some contend that having a visual effects supervisor present on set during production is a luxury; others deem it a necessity. However, few, if any, see it as unnecessary.

Today, more and more VFX supes can be found alongside directors and DPs during filming, advising and problem-solving, with the goal of saving valuable time and expense during production and, later, in post.

John Kilshaw

“A VFX supervisor is on set and in pre-production to help the director and production team achieve their creative goals. By having the supervisor on set, they gain the flexibility to cope with the unexpected and allow for creative changes in scope or creative direction,” says Zoic Studios creative director John Kilshaw, a sought-after VFX supervisor known for his collaborative creative approach.

Kilshaw, who has worked at a number of top VFX studios including ILM, Method and Double Negative, has an impressive resume of features, among them The Avengers, Pirates of the Caribbean: On Stranger Tides, Mission: Impossible – Ghost Protocol and various Harry Potter films. More recently, he was visual effects supervisor for the TV series Marvel’s The Defenders and Iron Fist.

Weta Digital’s Erik Winquist (Apes trilogy, Avatar, The Hobbit: An Unexpected Journey) believes the biggest contribution a VFX supervisor can make while on set comes during prep. “Involving the VFX supervisor as early as possible can only mean less surprises during principal photography. This is when the important conversations are taking place between the various heads of departments. ‘Does this particular effect need to be executed with computer graphics, or is there a way to get this in-camera? Do we need to build a set for this, or would it be better for the post process to be greenscreen? Can we have practical smoke and air mortars firing debris in this shot, or is that going to mess with the visual effects that have to be added behind it later?’”

War for the Planet of the Apes via Weta Digital

According to Winquist, who is VFX supervisor on Rampage (2018), currently in post production, having a VFX supe around can help clear up misconceptions in the mind of the director or other department heads: “No, putting that guy in a green suit doesn’t make him magically disappear from the shot. Yes, replacing that sky is probably relatively straightforward. No, modifying the teeth of that actor to look more like a vampire’s while he’s talking is actually pretty involved.”

Both Kilshaw and Winquist note that it is not uncommon to have a VFX supervisor on set whenever there are shots that include visual effects. In fact, Winquist has not heard of a major production that didn’t have a visual effects supervisor present for principal photography. “From the filmmaker’s point of view, I can’t imagine why you would not want to have your VFX supervisor there to advise,” he says. “Film is a collaborative medium. Building a solid team is how you put your vision up on the screen in the most cost-effective way possible.”

At Industrial Light & Magic, which has a long list of major VFX film credits, it is a requirement. “We always have a visual effects supervisor on set, and we insist on it. It is critical to our success on a project,” says Lindy De Quattro, VFX supervisor at ILM. “Frankly, it terrifies me to think about what could happen without one present.”

Lindy De Quattro

For some films, such as Evan Almighty, Pacific Rim, Mission: Impossible — Ghost Protocol and the upcoming Downsizing, De Quattro spent an extended period on set, while for many others she was only present for a week or two while big VFX scenes were shot. “No matter how much time you have put into planning, things rarely go entirely as planned. And someone has to be present to make last-minute adjustments and changes, and deal with new ideas that might arise on that day — it’s just part of the creative process,” she says.

For instance, while working on Pacific Rim, director Guillermo del Toro would stay up until the wee hours of the night making new boards for what would be shot the following day. The next morning, everyone would crowd around his hand-drawn sketches and notebooks as he said, “OK, this is what we are shooting.” The VFX team then had to be prepared and do everything in its power to help ensure that the director’s vision became reality on screen.

“I cannot imagine how they would have gone about setting up the shots if they didn’t have a VFX supervisor on set. Someone has to be there to be sure we are gathering the data needed to recreate the environment and the camera move in post, to be sure these things, and the greenscreens, are set up correctly so the post is successful,” De Quattro says. If you don’t know to put in greenscreen, you may be in a position where you cannot extract the foreground elements the way you need to, she warns. “So, suddenly, two days of an extraction and composite turns into three weeks of roto and hair replacement, and a bunch of other time-consuming and expensive work because it wasn’t set up properly in initial photography.”

Sometimes, a VFX supervisor ends up running the second unit, where the bulk of the VFX work is done, if the director is at a different location with the first unit. This was the case recently when De Quattro was in Norway for the Downsizing shoot. She ended up overseeing the plate unit and did location scouting with the DP each morning to find shots or elements that could be used in post. “It’s not that unusual for a VFX supervisor to operate as a second unit director and get a credit for that work,” she adds.

Kilshaw often finds himself discussing the best way of achieving the show’s creative goals with the director and producer while on set. Also, he makes sure that the producer is always informed of changes that will impact the budget. “It becomes very easy for people to say, ‘we can fix this in post.’ It is at this time when costs can start to spiral, and having a VFX supervisor on set to discuss options helps stop this from happening,” he adds. “At Zoic, we ensure that the VFX supervisor is also able to suggest alternative approaches that may help directors achieve what they need.”

Erik Winquist

According to Winquist, the tasks a VFX supe does on set depend on the size of the budget and crew. In a low-budget production, one person might be doing a myriad of different tasks themselves: creating previs and techvis, working with the cinematographer and key grip concerning greenscreen or bluescreen placement, placing tracking markers, collecting camera information for each setup or take, shooting reference photos of the set, helping with camera or lighting placement, gathering lighting measurements with gray and chrome reference spheres — basically any information that will help the person best execute the visual effects requirements of the shot. “And all the while being available to answer questions the director might have,” he says.
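
A data wrangler's log from such a shoot might be as simple as one structured record per take, so that post can rebuild each setup. The sketch below shows a plausible subset of fields, not any studio's actual schema.

```python
# Hypothetical per-take camera report; field names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class TakeRecord:
    slate: str                    # e.g., "104A-3"
    lens_focal_mm: float
    focus_distance_m: float
    t_stop: float
    camera_height_m: float
    tilt_deg: float
    filters: list = field(default_factory=list)
    hdri_captured: bool = False   # gray/chrome spheres + HDRI pano for lighting
    notes: str = ""

take = TakeRecord(slate="104A-3", lens_focal_mm=32.0, focus_distance_m=3.5,
                  t_stop=2.8, camera_height_m=1.45, tilt_deg=-4.0,
                  filters=["ND0.6"], hdri_captured=True,
                  notes="greenscreen camera left; tracking markers on far wall")
```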

If the production has a large budget, the role is more about spreading out and managing those tasks among an on-set visual effects team: data wranglers, surveyors, photographers, coordinators, PAs, perhaps a motion capture crew, “so that each aspect of it is done as thoroughly as possible,” says Winquist. “Your primary responsibility is being there for the director and staying in close communication with the ADs so that you or your team are able to get all the required data from the shoot. You only have one chance to do so.”

The benefits of on-set VFX supervision are not just for those working on big-budget features, however. As Winquist points out, the larger the budget, the more demanding the VFX work and the higher the shot count, therefore the more important it is to involve the VFX supervisor in the shoot. “But it could also be argued that a production with a shoestring budget also can’t afford to get it wrong or be wasteful during the shoot, and the best way to ensure that footage is captured in a way that will make for a cost-effective post process is to have the VFX supervisor there to help.”

Kilshaw concurs. “Regardless of whether it is a period drama or superhero show, whether you need to create a superpower or a digital version of 1900 New York, the advantages of visual effects and visual effects supervision on set are equally important.”

While De Quattro’s resume is overflowing with big-budget VFX films, she has also assisted on smaller projects where a VFX supervisor’s presence was also critical. She recalls a commercial shoot, one that prompted her to question the need for her presence. However, production hit a snag when a young actor was unable to physically accomplish a task during multiple takes, and she was able to step in and offer a suggestion, knowing it would require just a minor VFX fix. “It’s always something like that. Even if the shoot is simple and you think there is no need, inevitably someone will need you and the input of someone who understands the process and what can be done,” she says.

De Quattro’s husband is also a VFX supervisor who is presently working on a non-VFX-driven Netflix series. While he is not on set every day, he is called when there is an effects shoot scheduled.

Mission Impossible: Ghost Protocol

So, with so many benefits to be had, why would someone opt not to have a VFX supervisor on set? De Quattro assumes it is the cost. “What’s that saying, ‘penny wise and pound foolish?’ A producer thinks he or she is saving money by eliminating the line item of an on-set supervisor but doesn’t realize the invisible costs, including how much more expensive the work can be, and often is, on the back end,” she notes.

“On set, people always tell me their plans, and I find myself advising them not to bother building this or that — we are not going to need it, and the money saved could be better utilized elsewhere,” De Quattro says.

On Mission: Impossible, for example, the crew was filming a complicated underwater escape scene with Tom Cruise and finally got the perfect take, only for his emergency rig to become exposed. Rather than have the actor go back into the frigid water for another take, De Quattro assured the team that the rig could be removed in post within the original scope of the VFX work. While most people are aware that such a fix is possible, having someone with the authority and knowledge to confirm it on the spot was a relief, she says.

Despite their extensive knowledge of VFX, these supervisors all say they support the best tool for the job on set and, mostly, that is to capture the shot in-camera first. “In most instances, the best way to make something look real is to shoot it real, even if it’s ultimately just a small part of the final frame,” Winquist says. However, when factors conspire against that, whether it be weather, animals, extras, or something similar, “having a VFX supervisor there during the shoot will allow a director to make decisions with confidence.”

Main Image: Weta’s Erik Winquist on set for Planet of the Apes.

More speakers added for Italy’s upcoming View Conference

More than 50 speakers are confirmed for 2017’s View Conference, a digital media conference that takes place in Turin, Italy, from October 23-27. Those speakers include six visual effects Oscar winners, two Academy Sci-Tech award winners, animated feature film directors, virtual reality pioneers, computer graphics researchers, game developers, photographers, writers and studio executives.

“One of the special reasons to attend View is that our speakers like to stay for the entire week and attend talks given by the other speakers, so our attendees have many opportunities to interact with them,” says conference director Dr. Maria Elena Gutierrez. “View brings together the world’s best and brightest minds across multiple disciplines, in an intimate and collaborative place where creatives can incubate and celebrate.”

Newly confirmed speakers include:

Scott Stokdyk – This Academy Award winner (VFX supervisor, Valerian and the City of a Thousand Planets) will showcase VFX from the film – from concept, design and inspiration to final color timing.

Paul Debevec – This Academy Award winner (senior staff engineer, Google VR, ICT) will give attendees a glimpse inside the latest work from Google VR and ICT.

Martyn Culpitt – A VFX supervisor at Image Engine who worked on Logan, he will break down the film’s visual effects, highlighting the work behind Wolverine’s gripping final chapter.

Jan-Bart Van Beek – This studio art director at Guerrilla Games will take attendees through the journey that Guerrilla Games underwent to design the post-apocalyptic world of the game franchise, Horizon Zero Dawn.

David Rosenbaum – This chief creative officer at Cinesite Studios, along with Cinesite EP Warren Franklin, will present a talk titled “It’s All Just Funny Business: Looking for IP, Talent and Audiences.”

Elisabeth Morant – This product manager for Google’s Tilt Brush will discuss the company’s VR painting application in a talk called “Real Decisions, Virtual Space: Designing for VR.”

Donald Greenberg – This professor of computer graphics at Cornell University will discuss the “Next-gen of Virtual Reality.”

Steve Muench – He will present “The Labor of Loving Vincent: Animating Van Gogh to Solve a Mystery.”

Deborah Fowler – This professor of visual effects at Savannah College of Art and Design/SCAD will showcase “Procedural and Production Techniques using Houdini.”

Daniele Federico – This co-founder and developer at Toolchefs will present “Make us Alive. An In-Depth Look at Atoms Crowd Software.”

Jason Bickerstaff – This character artist from Pixar Animation Studios will present “Crossing The Dimensional Rift.”

Steve Beck – This VFX art director from ILM will discuss “The Future of Storytelling.”

Nancy Basi – She is executive director of the Film and Media Centre at the Vancouver Economic Commission.

For a complete listing of speakers, visit http://www.viewconference.it/speakers


Lucasfilm and ILM release open source MaterialX library

Lucasfilm and ILM have launched the first open source release of the MaterialX library for computer graphics. MaterialX is an open standard developed by Lucasfilm’s Advanced Development Group and ILM engineers to facilitate the transfer of rich materials and look-development content between applications and renderers.

Originated at Lucasfilm in 2012, MaterialX has been used by ILM on features including Star Wars: The Force Awakens and Rogue One: A Star Wars Story, as well as realtime immersive experiences such as Trials On Tatooine.

Workflows at computer graphics production studios require multiple software tools for different parts of the production pipeline, and shared and outsourced work requires companies to hand off fully look-developed models to other divisions or studios which may use different software packages and rendering systems.

MaterialX addresses the lack of a common, open standard for representing the data values and relationships required to transfer the complete look of a computer graphics model from one application or rendering platform to another, including shading networks, patterns and texturing, complex nested materials and geometric assignments. It provides a schema for describing material networks, shader parameters, texture and material assignments and color-space associations in a precise, application-independent and customizable way.
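
For a feel of what that schema looks like in practice, the short script below writes a minimal MaterialX-style document: a pattern nodegraph driving a surface shader parameter, wrapped in a material. This is an illustrative sketch; the element and attribute names follow a later (1.38-era) revision of the public spec, so they may differ from the original 2017 release described here.

```python
# Illustrative only: emit a tiny MaterialX-style .mtlx document showing how a
# shading network, its parameters and a material binding are expressed as
# application-independent XML (1.38-era element names; see caveat above).
import textwrap

MTLX_DOC = textwrap.dedent("""\
    <?xml version="1.0"?>
    <materialx version="1.38">
      <!-- A pattern network: an image lookup exposed as a color output -->
      <nodegraph name="NG_paint">
        <image name="base_tex" type="color3">
          <input name="file" type="filename" value="paint_basecolor.png"/>
        </image>
        <output name="base_out" type="color3" nodename="base_tex"/>
      </nodegraph>
      <!-- A surface shader whose parameter is driven by that network -->
      <standard_surface name="SR_paint" type="surfaceshader">
        <input name="base_color" type="color3" nodegraph="NG_paint" output="base_out"/>
        <input name="specular_roughness" type="float" value="0.35"/>
      </standard_surface>
      <!-- The material that binds the shader for assignment to geometry -->
      <surfacematerial name="M_paint" type="material">
        <input name="surfaceshader" type="surfaceshader" nodename="SR_paint"/>
      </surfacematerial>
    </materialx>
""")

with open("paint_material.mtlx", "w") as f:
    f.write(MTLX_DOC)
```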

MaterialX is an open source project released under a modified Apache license.