VFX Supe Kevin Baillie: Pinocchio’s Virtual Production Workflow

Filmmaker Robert Zemeckis, whose latest film is Pinocchio, has always pushed the limits of visual effects and animation in such films as Who Framed Roger Rabbit, the Back to the Future trilogy, The Polar Express and A Christmas Carol. 

Pinocchio, which combines live action, CGI and virtual production, is the latest retelling of the tale of a wooden puppet who embarks on an adventure to become a real boy. The Disney film stars Tom Hanks as Geppetto, the woodcarver who builds and treats Pinocchio as if he were his real son, alongside Joseph Gordon-Levitt as Jiminy Cricket and Cynthia Erivo as Blue Fairy.

“Walt Disney was really clever,” explains Zemeckis. “He always looked for stories to make movies of that were pretty much impossible to do as live-action movies. They could be done very wonderfully as animation because he was able to do animated stories about talking animals and puppets, fairies and dwarves and things that would be impossible to do in live action. But now, since digital cinema has emerged, the puppet could be very much three-dimensional. It occurred to me that you could do a very plausible version of Pinocchio as a live-action movie. All of the visual effects learning I’ve had over the years went into making this movie.”

Zemeckis’ below-the-line team included his longtime collaborators DP Don Burgess, ASC; visual effects supervisor Kevin Baillie; and visual effects producer Sandra Scott.

I spoke with Baillie — whose credits include Pirates of the Caribbean: At World’s End, Night at the Museum, Superman Returns and Harry Potter and the Goblet of Fire — about creating all the VFX, some technical firsts and the groundbreaking virtual production pipeline.

What were the big challenges of creating the visual effects?
The biggest was realizing that it’s a live-action film that has to seamlessly intercut with scenes that look like live action, but which are totally digital… and scenes that are partially live action, partially digital and so on. So while trying to work through all that — especially as the early set designs were coming in — it was obvious that we needed to use a lot of virtual production to visualize the entire film before we committed to building a single set. That would allow us to figure out the game plan for the whole film so we could create it as efficiently as possible.

Who did the VFX?
All the visual effects were done by MPC, and we brought in a team of animators early on because we wanted to work with the animation leads in prep. Using the rough sets we’d designed and built in Unreal Engine, MPC blocked in every scene as if it was a stage play.

Mold3D in Burbank worked with the practical art department to build the sets, MPC did the animations and Halon brought all that together and created our virtual “stage team.” They were responsible for working directly with Bob, who filmed every scene in the movie with a virtual camera device.

So he shot the film twice?
Yes, and this was before we built any live-action sets. He edited it all together into a 100-minute-long version that we could look at for reference and strategize around. So he made the movie before we made the movie. In fact, we kind of made it three times. The brilliance of virtual production using all the latest technology in Unreal, like real-time raytracing, gave us a beautiful version of the film before we shot anything. That really engaged Bob and our DP, who could start working on lighting design before he shot a frame. This allowed our production designers to assess their sets before they even built them, so it was a pretty incredible tool.

By the time we began live action, which was shot at Cardington Studios in Bedford, England, we could look at the beautiful sets in Unreal and use camera-tracking technology to visualize in real time what our set extensions were going to be. So if we had a partial set built, the early cut of the film helped us determine which section of the set we needed to build.
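
In broad strokes, that kind of real-time set-extension preview works by mirroring the tracked physical camera onto a virtual camera, so the digital set is rendered from the same viewpoint every frame. The sketch below illustrates the idea in plain Python/NumPy; the pose fields and angle conventions are assumptions made for the example, not the production’s actual tracking data or Unreal Engine’s API.

```python
# A minimal sketch, not production code: the physical camera's tracked pose
# drives a matching virtual camera so the digital set is rendered from the
# same viewpoint. Pose fields and angle conventions are assumed for illustration.

import math
import numpy as np

def pose_to_matrix(position, yaw_pitch_roll_deg):
    """Build a 4x4 camera-to-world transform from a tracked position (metres)
    and yaw/pitch/roll angles (degrees); a real-time engine would apply this
    to its virtual camera each frame."""
    yaw, pitch, roll = (math.radians(a) for a in yaw_pitch_roll_deg)

    # Rotations about the Z (yaw), Y (pitch) and X (roll) axes.
    rz = np.array([[math.cos(yaw), -math.sin(yaw), 0.0],
                   [math.sin(yaw),  math.cos(yaw), 0.0],
                   [0.0, 0.0, 1.0]])
    ry = np.array([[ math.cos(pitch), 0.0, math.sin(pitch)],
                   [0.0, 1.0, 0.0],
                   [-math.sin(pitch), 0.0, math.cos(pitch)]])
    rx = np.array([[1.0, 0.0, 0.0],
                   [0.0, math.cos(roll), -math.sin(roll)],
                   [0.0, math.sin(roll),  math.cos(roll)]])

    transform = np.eye(4)
    transform[:3, :3] = rz @ ry @ rx   # combined rotation, yaw-pitch-roll order
    transform[:3, 3] = position        # tracked camera position on the stage
    return transform

if __name__ == "__main__":
    # Example: a camera 1.7 m off the stage floor with a 30-degree pan.
    print(np.round(pose_to_matrix([0.0, 0.0, 1.7], (30.0, 0.0, 0.0)), 3))
```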

Then we did real-time composites of these beautiful sets onto a bluescreen so our DP could identify the best camera composition for the shot. This meant we were making those decisions in-camera and doing less tweaking later in the VFX process. We then shot the whole movie with a rough comp of every shot we were going to be doing, and then, after it was all edited, we upgraded all the animation. But it still wasn’t the final animation, which meant Bob could be really fluid in the edit. Once that was locked, we went in and did all the final animation.

The cool thing about the second version of the movie is that all that upgraded animation blocking allowed our live-action A-camera operator to go into the all-digital shots — which had no live-action components whatsoever — and use a camera-control device hooked into Unreal Engine. He worked with the stage team to do a final camera pass on all the animation blocking, so the final film has that human camera operator’s touch on everything, along with well-thought-out design and moves.
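
At its core, the rough on-set comp Baillie describes is a chroma key: pixels in the live plate that match the bluescreen color are swapped for the rendered virtual set. The NumPy sketch below shows that basic operation only; the key color, tolerance and tiny test images are invented for illustration and bear no resemblance to a production-grade keyer, which adds spill suppression, soft mattes and edge treatment.

```python
# A deliberately simple sketch of the core of an on-set "real-time comp":
# pixels in the live plate that are close to the bluescreen color are replaced
# by the rendered virtual set. Key color, tolerance and test images are invented.

import numpy as np

def quick_key(live_plate, virtual_set, key_color=(0, 0, 255), tolerance=80):
    """Composite a rendered background into the bluescreen areas of a live plate.
    Both images are HxWx3 uint8 arrays of the same size."""
    diff = live_plate.astype(np.int16) - np.array(key_color, dtype=np.int16)
    distance = np.linalg.norm(diff, axis=-1)      # per-pixel distance to key color
    matte = (distance < tolerance)[..., None]     # True where the bluescreen shows
    return np.where(matte, virtual_set, live_plate).astype(np.uint8)

if __name__ == "__main__":
    h, w = 4, 6
    plate = np.full((h, w, 3), (0, 0, 255), dtype=np.uint8)        # bluescreen frame
    plate[1:3, 1:3] = (200, 180, 160)                              # "foreground" patch
    background = np.full((h, w, 3), (40, 90, 30), dtype=np.uint8)  # rendered set
    comp = quick_key(plate, background)
    print(comp[:, :, 0])   # foreground pixels survive; bluescreen takes the set
```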

How long did the whole process take?
Over two years, and the VFX alone took about a year to complete. We had around 1,000 people on the project at any given time, and we had over 100 animators.

It looks like every shot had some VFX?
Yes, there were just five or six shots in the whole film without some sort of VFX. The amazing thing is that there are actually under 1,000 VFX shots in the film, because Bob likes long shots.

I heard you used some cutting-edge technology on-set?
We used a new camera-tracking system based on ultra-wideband technology, the same sort Apple’s AirTags use, which gave us a bit more freedom with the camera when we were trying to visualize all our set extensions in real time on-set. It’s called Racelogic, and it was developed for racing cars; racing is a hobby of mine, so I’d actually used it before. But this was the first time it’s ever been used for camera tracking, and it was an amazing tool for us.
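
Ultra-wideband tracking, broadly, measures radio time-of-flight ranges from a tag to fixed anchors and solves those ranges for a position. The sketch below shows a textbook least-squares version of that solve in Python/NumPy; the anchor layout and camera position are made up for the example, and this is not Racelogic’s actual solver or data format.

```python
# A generic sketch of how ultra-wideband tracking recovers a position: range
# measurements from a tag to several fixed anchors are solved for the tag's 3D
# location by least squares. Anchor layout and camera position are invented.

import numpy as np

def trilaterate(anchors, ranges):
    """Estimate a 3D position from distances to known anchor points by
    linearizing each sphere equation against the first anchor and solving
    the resulting system with least squares."""
    anchors = np.asarray(anchors, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    p0, r0 = anchors[0], ranges[0]
    # For each anchor i > 0:  2 (p_i - p_0) . x = r_0^2 - r_i^2 + |p_i|^2 - |p_0|^2
    a = 2.0 * (anchors[1:] - p0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(p0**2))
    solution, *_ = np.linalg.lstsq(a, b, rcond=None)
    return solution

if __name__ == "__main__":
    # Four anchors at known (and deliberately non-coplanar) stage positions, metres.
    anchors = [(0, 0, 3.0), (10, 0, 2.5), (10, 8, 3.5), (0, 8, 2.0)]
    true_pos = np.array([3.0, 2.0, 1.5])           # "camera" position to recover
    ranges = [np.linalg.norm(true_pos - np.array(a)) for a in anchors]
    print(np.round(trilaterate(anchors, ranges), 3))   # ~ [3.  2.  1.5]
```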

What was the most difficult VFX sequence to do and why?
All of them were quite difficult in one way or another because we wanted to pay our respects to the original film and get as much of its character into our film as possible. So we ended up hand-keyframing every single character throughout the film rather than relying on motion capture or anything like that. That was challenging because we really crafted the acting of every digital character, just like Bob did in Who Framed Roger Rabbit. He wanted people to have the same experience. And the scenes with the sea monster were very challenging because they involved a lot of water.

Fair to say this was truly cutting-edge and a first for this workflow?
Definitely, in terms of the way we did it. I’m not aware of any other production that’s used virtual production this extensively — from the very start to the very end. Other projects do heavy previz, but maybe the directors aren’t as hands-on, so it all changes at the end anyway. Even if they’re involved, it sort of dead-ends in the edit, and it all has to be done from scratch in the VFX process.

Our goal here was to make sure that any relevant element could flow all the way through to the final shot. We did that for two reasons. First, for efficiency. If people have worked really hard creating very cool stuff, we want to have that end up on the screen. But more importantly, this workflow (and virtual production in general) was all about giving the filmmakers – especially the director but also all the live-action department heads, like the DP – the chance to get as close to touching the final pixel as possible. It’s no longer a case of shooting the movie, doing the cut and then handing everything off to some black box at a VFX studio thousands of miles away, and when it comes back, you’re stuck with what you get. Now we can let the filmmakers interact with the VFX process along the way.

You’ve collaborated so often with Bob. How does this rank in terms of the level of difficulty?
(Laughs) Bob doesn’t do easy movies. There’s always a challenge, whether it’s the technology or the visual storytelling or the budget, and the VFX team is always on its toes. This was one of the biggest and most involved projects. Ultimately, as I said earlier, he had to make this film three times, which is a huge amount of work, but when we explained the whole process to him, he didn’t hesitate.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

