Battlesuit: Creating a pilot remotely, in realtime with Unreal Engine

By Randi Altman

Filmmaker Hasraf “HaZ” Dulull, who began his career as a visual effects artist and VFX supervisor, enjoys working on sci-fi projects. His reel is full of them, including his own films The Beyond and 2036: Origin Unknown. Even the Disney show he directed, Fast Layne, had an element of futurism thanks to a sophisticated talking car. He even offers a master class on science-fiction filmmaking.

Hasraf “HaZ” Dulull

So it’s not surprising that Dulull’s most recent project also falls within this genre. Dulull recently completed Battlesuit, a proof-of-concept pilot episode for a sci-fi animated series called The Theory, based on the graphic novel of the same name and in development at comic book publisher TPub Comics. The story centers on an astro-archaeologist who travels the universe looking for answers that could help save humanity. In this particular story, she discovers the remains of a mech robot whose last memory will reveal what happened to the planet’s civilization.

Dulull used Epic’s realtime Unreal Engine to produce the pilot cost-effectively and remotely — just before COVID-19 hit. The series is currently in development and being shopped around to networks with Dulull attached as director and executive producer. You can see the proof of concept here.

OK, let’s find out more from Dulull.

How early did you get involved in Battlesuit?
Neil Gibson, the creator of the graphic novel “The Theory,” reached out to me last October for a general coffee chat. He was a fan of my previous work and wanted some advice about moving TPub Comics’ IP into the world of film and TV. He gave me some graphic novels he thought I would like, and “The Theory” was one of them. One story within the book stood out for me — Battlesuit. We caught up once again, and Neil mentioned that they were looking at creating proofs of concept for some of the IP being developed for TV and asked if I was interested in directing one. I requested Battlesuit.

Was it always meant to be animated?
Originally, it was planned as a live-action proof of concept, but due to budget and schedule constraints we knew there was no way we could do the vision justice. I really wanted to stay true to the graphic novel’s story, so I went away to rethink how I would pull it off. That was around the time Netflix was putting out a lot of animated projects, like Love, Death & Robots and Castlevania, so there was a huge rise in that market. I went back to them with a pitch to do it as a pilot episode for an animated series.

Naturally, their reaction was that animation is waaay too expensive — you’d need an animation house and tons of time, etc. But I had already come up with a way of executing it using a realtime animation approach. I did a quick test scene using existing free assets to get my point across about what this would look like, but also to really know for myself whether it could be done — this was the test scene I did. Their response was, whoa, if you can do that on the budget and at a 12-minute pilot episode duration, then go for it. It also helped that I put the test scene online and got great reactions to it. That gave TPub Comics, who financed Battlesuit, the confidence to move ahead.

What was your team size, and how long did the remote production take?
The team on the actual animation was three, including myself. As the director, I handled all the camera, layout, lighting and shot creation, which was great, as doing realtime animation gave me so much freedom and control. There was also Ronen Eytan, the technical Unreal Engine artist, who put together a cool animation pipeline using a live link with his iPad to capture the actors’ facial performances. Lastly, there was Andrea Tedeschi, a CG artist/generalist responsible for assets and environments. He has collaborated with me on all my projects going back to 2015, when I made my short film Sync, and he has since worked on my features and other projects with me.

Outside of the animation team, I brought on music composer and sound designer Edward Patrick White, who had just finished delivering the score for the latest Xbox title, Gears Tactics. Our voice actors included Nigel Barber (Mission: Impossible – Rogue Nation, Spectre and my feature film The Beyond) and Kosha Engler, who has done performance work for video games such as Terminator: Resistance and Star Wars: Battlefront.

How did you find your way to Unreal Engine specifically?
I had been using Unreal Engine since September of last year, doing previz for my live-action feature film Lunar, which was in soft prep while casting (due to COVID-19, production on that project is on hold), and I realized that the quality of the previz I was creating was very high, with cinematic lighting. I thought that with a bit of love and raytracing, this could end up being an animated film … but Lunar remains a 100% live-action movie.

There are other realtime engines out there, like Unity, which is great, but I had already been using Unreal Engine, so it made sense to push further with it. I also got some great support from the team at Epic in London to assist me in pushing the angle I was going for.

The big thing with Unreal Engine is the “what you see is what you get” approach to creating scenes and shots, and as a filmmaker who is very hands-on (a control freak, really!), I found it a pure joy to be able to create shots and then hand them over to Andrea and Ronen to build on further.

But the other big point I want to make is that we removed all the pipeline steps you usually get on CGI animation projects (rendering, compositing, etc.) because everything was being rendered in realtime. So all I was doing was exporting ProRes 444 QuickTimes (Rec 709 color space), and in some cases EXR frames, directly out of Unreal Engine and into editorial; that was it.
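For readers curious how that direct-to-editorial export can be driven, the Unreal Editor exposes Python scripting for batch-rendering a Level Sequence to disk. Below is a minimal sketch modeled on Epic’s published sequencer scripting examples; the sequence path and output folder are hypothetical placeholders, and property names can vary between engine versions.

    # Minimal sketch: batch-rendering a Level Sequence from the Unreal
    # Editor's Python console (the Python Editor Script Plugin must be
    # enabled). The asset path and output directory are hypothetical.
    import unreal

    def on_render_finished(success):
        # Fires when the render stops, successfully or not.
        print("Render finished:", success)

    capture = unreal.AutomatedLevelSequenceCapture()
    capture.level_sequence_asset = unreal.SoftObjectPath(
        "/Game/Cinematics/SEQ_Warzone_010")                # hypothetical sequence
    capture.settings.output_directory = unreal.DirectoryPath(
        "D:/Battlesuit/Renders")                           # hypothetical folder
    capture.settings.output_format = "{sequence}.{frame}"  # frame-numbered output
    capture.settings.game_mode_override = None             # no gameplay logic while rendering

    # Kick off the render with a completion callback.
    unreal.SequencerTools.render_movie(
        capture, unreal.OnRenderMovieStopped(on_render_finished))

From there, the rendered movies or frames drop straight into editorial, which is the shortcut Dulull describes.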

Any challenges or lessons learned from working in realtime?
The big challenge is adjusting the way you think about what shots and scenes are in a realtime environment. Traditionally in CGI, each 3D file is a scene or a shot, but in Unreal Engine you have one big world called the Level, which lives in the main project. Inside each Level are the cinematic sequences you create, which use the assets that live in the content part of your project. Once you get your head around that, it’s so much fun, and you realize it’s actually much faster to work this way.
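To make that structure concrete: in Unreal, each cinematic is a Level Sequence asset sitting alongside the rest of the project’s content, and sequences can be created and configured programmatically. Here is a minimal sketch using the editor’s Python scripting; the asset name, path, frame rate and playback range are all hypothetical.

    # Minimal sketch: creating a new cinematic (Level Sequence) asset in
    # the project's content via the editor's Python API. Names are hypothetical.
    import unreal

    asset_tools = unreal.AssetToolsHelpers.get_asset_tools()
    sequence = asset_tools.create_asset(
        asset_name="SEQ_Memory_020",       # hypothetical shot name
        package_path="/Game/Cinematics",   # lives in the project's content
        asset_class=unreal.LevelSequence,
        factory=unreal.LevelSequenceFactoryNew())

    # Each sequence carries its own frame rate and playback range,
    # independent of the Level it plays in.
    sequence.set_display_rate(unreal.FrameRate(numerator=24, denominator=1))
    sequence.set_playback_start(0)
    sequence.set_playback_end(240)  # a 10-second shot at 24fps
    unreal.EditorAssetLibrary.save_loaded_asset(sequence)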

The other challenge was that everything was coming out of Unreal Engine with no compositing at all to cheat and fix things. This ensured that our assets all worked well and that we were being smart with shot construction. One thing to note is that all the shots were created entirely on a laptop — the Razer Studio. Andrea and Ronen used desktop PCs for their work and then sent me their packaged Unreal Engine files and assets via Dropbox, which I then migrated into the project.

HaZ working on the Razer

The Razer laptop comes with an Nvidia Quadro RTX 5000, and it was literally like having a beast of a desktop machine in my laptop. This was super-helpful because back in early January I was travelling to various CG conferences giving talks and keynotes, and this allowed me to keep working away on the project in a variety of hotel rooms.

Raytracing took the project’s visual look to another level, as we were getting reflections, shadows and lighting of such cinematic quality … all in realtime. It was kind of mind-blowing at times to be scrubbing back and forth in a sequence in Unreal Engine with explosions going off, spaceships flying and robots firing weapons as I moved my camera around — again, all in realtime.

What other tools did you use in this workflow?
For the war zone sequence, I wanted a visceral, gritty tone to the camera moves. I also knew it would take a lot of keyframe animation to achieve that, so I used a virtual camera solution called DragonFly from Glassbox Technology. Phillipa Carrol, whom I knew from The Foundry, reached out to me after seeing my early tests online and gave me a license to use, along with some great support from her team. I was able to shoot the action scenes using my iPad as a virtual production camera while the war zone action played in realtime.

Virtual camera

The exported shots from Unreal Engine were brought into Blackmagic DaVinci Resolve 16 for editorial and color grading.
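As a side note for anyone wanting to automate that handoff, Resolve also exposes a scripting API (in the Studio edition) that can pull renders into a project. Below is a minimal, hypothetical sketch of importing an Unreal render folder into the media pool; the module setup and folder path are assumptions, not part of Dulull’s described workflow.

    # Minimal sketch: importing rendered shots into DaVinci Resolve's media
    # pool via its scripting API (Resolve Studio; the DaVinciResolveScript
    # module must be on the Python path). The render folder is hypothetical.
    import DaVinciResolveScript as dvr

    resolve = dvr.scriptapp("Resolve")
    media_storage = resolve.GetMediaStorage()

    # Adds every clip in the folder to the current project's media pool bin.
    media_storage.AddItemListToMediaPool(["D:/Battlesuit/Renders"])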

Do you think this is the future of filmmaking, especially in the world of COVID-19? How do you see it helping to get production working again?
I think virtual production in general is going to play a big part in content being made for film and TV. It’s going to be used more and more as realtime CG rendering becomes so photoreal (I have seen the recent Unreal Engine 5 demo, and wow!) that you can play it back on LED screens and capture actors all in-camera.

From my end, it has allowed me to develop and create big, bold ideas for animated series content without a big studio or huge teams — with the entire production done remotely. Even the additional voice recording we needed during editorial was done remotely, with me directing Nigel Barber via iMessage on the iPhone. He would then email me the WAV files and, boom, we had our character voiced in the edit.

Realtime technology also removes that common argument for having everyone under one roof for speed and efficiency of communication, because thanks to Zoom or Skype screen sharing, I can direct artists as they make the changes instantly in Unreal Engine — without them needing to upload versions for me to review, annotate and send back. So those Zoom/Skype dailies sessions are actually production sessions, because by the end of the call all the changes have been implemented.

What’s next for you?
Battlesuit opened the door for me as a filmmaker to tell stories using animation, and it broke down the barriers and obstacles I had run into before when trying to get animated projects off the ground.

I recently signed on to direct an animated feature film based on a video game IP with producers in Hollywood. I can’t say much about it yet, but it’s using the same approach I took with Battlesuit (all in Unreal Engine). The details will be announced later this year.

You can watch the episode and the making-of here:

