By Iain Blair
South African director Neill Blomkamp has always pushed the VFX envelope. After beginning his career as a visual effects artist in film and television, he charged onto the international scene with his 2009 directorial debut, District 9, a sci-fi thriller with thinly veiled social commentary on apartheid. Made for just $30 million, it became a global hit, earning four Oscar nominations.
Blomkamp followed that up with two more VFX-heavy films — Elysium and Chappie — and has since been developing and making experimental projects for his independent company Oats Studios, including Rakka, Firebase and Zygote.
His latest film, Demonic, is a horror-thriller with a high-tech twist. It stars Carly Pope (The Good Doctor, Elysium) as a young woman who learns that her estranged mother has fallen into a coma and reluctantly agrees to take part in a therapy that will allow her to tap into her mother’s still-active brain and communicate with her.
The Demonic team included director of photography Byron Kopman, editor Austyn Daines and additional editor Julian Clarke. Viktor Muller was the VFX supervisor, and Universal Production Partners (UPP) provided the visual effects work.
I spoke with Blomkamp about making the film, the cutting-edge VFX and the challenges involved.
You used volumetric capture to create the simulation of the mother’s mindscape. How did that work?
It’s a pretty new technology that I don’t think has been used in a film to this extent before. It’s basically 3D video, where you can turn your actors into geometry — you have about 260 4K cameras arranged in a grid or dome, so they can shoot the actors from all points of view. All of that is then turned into 3D data.
Then we sent those sequences to UPP, our VFX company, and they’d drop them into the environments they were building, which were also based on photogrammetry images of real locations. Once we had the vol actor and the environment together, we could start to light them and choose camera angles with virtual cameras.
It sounds like a very complicated pipeline.
It was, and one of the most complicated aspects was the mathematics involved. Imagine that the capture volume is a hemisphere 4 meters in diameter: the actor can only walk 4 meters, and only 3 of that is usable, because at the start their back is against all the cameras on one side, and for the last meter they're up against the camera lenses on the other side. So if an actor has to walk through a house, for instance, you have to figure out how many 3-meter distances are needed for the whole sequence. Dealing with all that while also directing the acting made it a very arduous and difficult process.
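The planning math Blomkamp describes reduces to simple ceiling division. As a minimal sketch (the function name and the 3-meter usable figure are taken from his description; nothing here is from the production's actual tools), you could budget takes for a walk like this:

```python
import math

def capture_takes(path_length_m: float, usable_m: float = 3.0) -> int:
    """How many volumetric takes are needed to cover a walk of
    path_length_m meters, if each take only yields usable_m meters
    of travel before the actor runs into the camera rig."""
    return math.ceil(path_length_m / usable_m)

# A 10-meter walk through a house would need 4 separate 3-meter takes,
# which then have to be stitched together in post.
print(capture_takes(10.0))  # 4
```

In practice each extra take also multiplies the downstream data and stitching work, which is part of why four days of capture turned into months of post.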
Did you feel you’d jumped in the deep end, in terms of this complex pipeline?
Yes, but I always felt it would work and, honestly, it was very satisfying when the imagery was coming back. My favorite shot in the whole film is when Carly approaches her childhood home; we do this God’s-eye-view of her walking and all of the perspective is crushed. I just love that.
Can you talk about how early you started planning using this technique and how you integrated it with the rest of the workflow?
The whole process was very unusual since the movie was built around wanting to use volumetric capture. Originally, I thought I was going to do it for my company, Oats Studios, and release it online, as it was more experimental. I was already talking to Metastage in LA about doing vol cap when we had the idea for this film, and I thought maybe it could be a feature instead of a short. Then I wrote the script to justify using this somewhat glitchy-looking technology.
So vol cap was embedded in it from the very start, but once we committed to making the film, we couldn’t go to LA because the borders were shut due to COVID. I am based in Vancouver, so we began talking to VCS — a company here headed up by Tobias Chen — about the same process: building a rig and doing all the computations.
We knew way ahead of time — even before screenwriting, let alone production — what we’d be dealing with. Then I wrote the screenplay to use the vol cap prototype look in the story — like the experimental VR lab bit — and we shot the live action. Then we had to build the rig and shoot all the vol cap over four days, and that turned into nine months of post just dealing with all the vol cap data and turning it into sequences.
Let’s talk about creating all the VFX with UPP and working with visual effects supervisor Viktor Muller.
I love them. They’re in Prague, and they took all the vol cap data that VCS captured in Vancouver, and Viktor’s incredible. He’s one of my favorite VFX guys, along with Chris Harvey, who was the VFX super on Chappie and the lynchpin who held the whole Oats VFX department together and hired all our artists.
Chris oversaw gathering all the photogrammetry of the on-set stuff, and then Viktor took over as the VFX super once all the assets went to his company — he owns and runs UPP. And we’d communicate almost daily on Zoom and look at all the updates together.
You used Unity?
Yes, the vol cap sequences are in Unity, a real-time engine, so you can watch them in VR and walk around them like a video game. So we were able to film them with virtual cameras in a way that was unlike normal virtual production, as it was actually truthful. What you saw in the camera was the final thing, so we just filmed it like a movie.
What about the scenes of the demon at the end? Did you use a thermal camera?
Yes, we shot it with a FLIR thermal camera, and it’s mostly real; we only used CG at the point where the demon comes out of the body on the ground. The burn itself was traditional photography, and it was pretty interesting to do. We built a 7-foot-tall creature suit, just like an ’80s horror film, and we shot clean plates for when the creature burns and dies, so we got this semi-translucent effect, which made it far more surreal. The flames were a mix of CG and real flames on set, along with a propane flame bar set in the pond it falls into. So we used that to light the scene and Carly as well.
Tell us about post. Was it remote because of COVID?
Yes, totally remote. Normally, you’d be close to your VFX team, and you’d definitely be around your editor, the sound team, the DI and so on. But I wasn’t there for any of it, and we couldn’t do anything in the usual way.
It was all remote, but it actually worked quite well to my surprise — maybe 85% of normal. We did it all with Zoom as well as cineSync for all of the VFX.
You had two editors — Austyn Daines and Julian Clarke. How did you all work together, and what were the main editing challenges?
Julian started the film but then had to go off to cut another movie, so Austyn came on. I worked with him more extensively and for a longer period of time. He’s based out of LA at Rock Paper Scissors, and I’d worked with him on a lot of Oats projects. I love him — but we’ve still never physically met!
He’d use his company server to upload secure files, then I’d go over them, get on Zoom and we’d talk it out. Technically, the most difficult editing challenge was just dealing with all the vol cap stuff. It was insane.
We had witness cameras in there to try and give us a POV, but it was totally mental, and what you’re looking at is so uncinematic and unemotional. Then artistically, the goal was to convey this awful sense of dread and foreboding the whole time, and a lot of that depends on your cutting pace and not rushing scenes. We spent a very long time on editing, but a lot of that was dealing with all the VFX.
What about the DI?
That was all done at UPP, again remotely. I’d look at uncompressed high-res on my calibrated color monitor in my home office, so it was like being in the room. The DP and I wanted it all to look as realistic as possible, so we avoided synthetic light wherever we could and kept everything natural. Nothing was overly colorized.
What’s next?
A big sci-fi film I’ve written. I’m pretty excited about it.
Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, the Chicago Tribune, the Los Angeles Times and The Boston Globe.