By Randi Altman
Director Hasraf “HaZ” Dulull is no stranger to making movies. Before jumping into writing and directing short sci-fi films, he was a visual effects supervisor and producer. His short film resume includes Project Kronos, I.R.I.S. and Sync. Recently, his first feature film, The Beyond, was released by Gravitas Ventures.
When I first met HaZ a few years back, we were both at an Adobe event — on a canal boat in Amsterdam during IBC. We started talking about visual effects, the industry and his drive to make movies.
This Brit is friendly, intelligent and incredibly hands-on in all aspects of what he does. His latest is The Beyond, which he describes as “a cerebral science-fiction feature film that blends the realism of documentary with the fantastical, ‘big idea’ nature of the science-fiction films of today.” The Beyond tells the story of a ground-breaking mission that sent astronauts — modified with advanced robotics — through a newly discovered wormhole known as the Void. When the mission returns unexpectedly, the space agency races to discover what the astronauts encountered on their first-of-its-kind interstellar space journey.
HaZ was so hands-on that he provided some of the film’s visual effects and edited the film himself. If you like what you see in the trailer, the film is available for purchase or rent on most digital platforms.
When I reached out to HaZ to talk about The Beyond, he was in Vancouver working on an eight-part TV series for Disney called Fast Layne. “I directed episodes 1 and 2, and am currently directing episodes 7 and 8,” he says. “The beauty of starting and ending the series is it allowed me to set the show’s style and tone.”
It seems he can’t sit still! Let’s find out more about how he works and The Beyond…
Can you talk about prepro? How much of that included visual effects prepro?
Most people who know me will say I’m obsessed with prep. I had about six months of hardcore prep on this, from doing little storyboards, known as HaZ-Grams, right through to previs of the key sequences.
But even during the script-writing stage (six months before actual prep), I was coming up with visuals to support the ideas I was writing in the script. Sometimes I would knock up a test VFX scene just to see how complex an idea would be to create. Prep worked hand in hand with the script development and the budgeting of the film. The film was self-financed, with additional financing coming in later (during post production), so I wanted to ensure everything was mapped out technically; there were no “fix it in post” scenarios in this film — I wouldn’t allow it.
During location scouting, I would have my iPhone with me and shoot a bunch of footage and still imagery, so when I went back home I could write those locations into the script to make them work with the scenarios depicted in the film.
As part of prep we actually shot a test scene to really see if this mockumentary format would work to tell a grounded sci-fi story. This was also used to attract crew and cast to the project, as well as to get distributors primed early on.
Many shots from that test actually made it into the final movie — I wasn’t kidding about not wasting any budget or material on this production! So prep pretty much helped shape the script too, as I knew I wasn’t in the financial position to write stuff and then go and build it. I had to reverse engineer it in a way. In the film we have tons of locations, such as the Space Centre with actual real rockets. We also had a team in Iceland shooting alien landscapes, and we even shot some scenes in Malaysia to give the film a global feel. With each of those opportunities, the script was tweaked to make full use of the locations we had.
You shot with Blackmagic cameras. Was that your choice? The DP’s? Have you shot with these before?
From the start, I knew we were going to shoot on Blackmagic cameras. This was mainly down to the fact that my DP Adam Batchelor — who had shot Sync with me, along with the proof-of-concept tests we did for this film — was a Blackmagic advocate and knew the cameras inside out. More importantly, he was able to get cinematic imagery out of those cameras.
Blackmagic was very supportive of the film, as it has been of my career since my short films, so the company came on as one of the executive producers. No one had ever shot a full feature film using just Blackmagic cameras, and we also used a Resolve pipeline all the way through to delivery, so The Beyond is the perfect case study for it.
Can you talk about that workflow? Any hiccups?
I think the only hiccups were the fact we were using a beta version of Resolve 14, so there were the expected crashes, etc. That would usually be seen as risky on a feature film, but luckily we didn’t have a distributor in place with a release date, so the risk was minimal.
The good thing was I would generate an error log report from Resolve and send it over to Blackmagic, who would then instantly send out a new patch. So we were looked after rather than being left on our own to scream at the monitor.
We stuck with a ProRes 4444 QuickTime workflow for all material, from camera footage to VFX renders, and enabled proxies on the fly within Resolve. This was great, as it meant I was working with the highest-resolution imagery within Resolve, and it was fairly fast too. Things started to slow down when I had multiple layers of VFX and composites/groups, which I then had to render out as a new clip and bring back in.
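(For the technically curious, here is a minimal sketch of one way renders can be conformed to ProRes 4444 QuickTimes with ffmpeg before being imported into Resolve. This is an illustration rather than the production’s exact setup, and the folder paths are hypothetical.)

```python
# Minimal sketch: batch-transcode renders to ProRes 4444 QuickTimes with
# ffmpeg before import into Resolve. Folder paths are hypothetical.
import subprocess
from pathlib import Path

SRC = Path("vfx_renders")   # hypothetical folder of incoming renders
DST = Path("prores_4444")   # hypothetical output folder
DST.mkdir(exist_ok=True)

for clip in sorted(SRC.glob("*.mov")):
    subprocess.run([
        "ffmpeg", "-y",
        "-i", str(clip),
        "-c:v", "prores_ks",         # FFmpeg's ProRes encoder
        "-profile:v", "4444",        # ProRes 4444 profile
        "-pix_fmt", "yuva444p10le",  # 10-bit 4:4:4 plus alpha
        "-c:a", "copy",              # pass any audio through untouched
        str(DST / clip.name),
    ], check=True)
```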
How did you and the DP develop the look you wanted? Any scenes stick out that you guys worked on?
I was very fortunate to get Max Horton, who had worked on films like Gravity, to come onboard to grade this film at the Dolby Vision lab in London’s Soho. We also did an HDR version of the film, which I think is the first indie film to have an HDR treatment done to it.
We had three to four days of grading with Max, and I was in the room with him the whole time. That short schedule was possible because I had already done a first-pass temp grade myself while editing the film in the beta version of Resolve 14. This made the handover as simple as exporting my Resolve project file to Max, who would load it up, relink the material and work from there.
Max kept everything photographic, like a documentary, but with a slight cinematic flair. The big challenge was matching the material from the various Blackmagic cameras (the Ursa Mini Pro, the Production Camera and the Pocket Cinema Camera) with the DJI Osmo, drone footage and stock footage.
How many VFX shots were there? Who did them?
There were around 750 visual effects shots. I designed all the VFX scenes and handled a huge portion of the compositing myself, including invisible effects shots, all the space scenes, alien planet scenes, memory scenes and tons more. This would not have been possible without the support of my VFX team, who worked on their assigned sequences and shots and also generated tons of CGI assets for me to use in creating my shots in comp.
My VFX team members included my long-time collaborator John Sellings, who was the VFX supervisor for all the Human 2.0 sequences. Filmmore, in Amsterdam and Brussels, handled Human 2.0 scenes in the transcode bay with in-house VFX supervisor Hans Van Helden. London’s Squint VFX handled the Human 2.0 scenes in wake-up lab. Charles Wilcocks was the Human 2.0 CG supervisor who worked on the shape and look of the Human 2.0.
Hussin Khan looked after the Malaysian team, which provided rotoscoping support and basic comps. Dan Newlands was our on-set tracking supervisor. He ensured all data was captured correctly and supervised anything tracking related in the Human 2.0 scenes.
Another long-time collaborator was Andrea Tedeschi, who handled the CG and comps for the spacecraft carrier at the end of the film, as well as rendering out the CG astronaut passes. Rhys Griffith handled the rigging for the Human 2.0 characters in Maya, and also looked after the CG passes for the alpha Human 2.0 scenes using Blender. Aleksandr Uusmees provided all the particles and simulation rendered out of Houdini as CG passes/elements, which I then used to create the wormhole effects, alien spheres and other shots that needed those elements.
JM Blay designed and created the standalone motion graphics sequences to visualize the Human 2.0 medical procedure, as well as mission trajectory graphics. He also created several “kit-bash” graphics assets for me to use, including UI graphics, from his After Effects files.
Territory Studio created the awesome end titles and credits sequence, which you can read more about on their site.
As a VFX pro yourself, do you find that you are harder to please because it’s your wheelhouse?
Oh boy. Ask any of the VFX guys on the team and they will say I am a beast to work with because I am hands-on, and also I know how long things take. But on the flip side that had its advantages, as they knew they were not going to get revision after revision, because with each brief I also presented a proposed methodology, and made sure we locked down on that first before proceeding with the shots.
Was this your biggest directing job to date? Can you talk about any surprises?
It wasn’t my biggest directing job to date, as during post production of The Beyond my second sci-fi film, Origin Unknown (starring Katee Sackhoff of Battlestar Galactica and The Flash), was greenlit, and that had its own set of challenges. We can talk more about that when the film is released theatrically and on VOD later this year via Kew Media.
This was, however, my biggest producing job to date; there were so many logistics and resources to manage whilst directing too. The cool thing about the way we made this film was that most of the crew had worked on my short films, as had some of the key cast, so we embraced the guerrilla nature of the production and focused on maximizing our resources to the fullest within the time and budget constraints.
What did you learn on this film that will help on your next?
The other hat I was wearing was the producer hat, and one thing I had to embrace was the sheer amount of paperwork! I may have taken the same filmmaking approach as I did on my short films — guerrilla, and thinking outside the box technically and creatively — but making a commercial feature film, I had to learn to deal with things like clearances, E&O (errors and omissions) insurance, chain of title, script reports and a whole bunch of other paperwork required before a distributor will pick up your film.
Thankfully my co-producer Paula Crickard, who is currently wrapping post on Terry Gilliam’s The Man Who Killed Don Quixote, came in during the post stage of the film and helped.
The other thing I learned was the whole sales angle: getting a reputable distributor on board to sell the film in all worldwide territories, and how to navigate that process with rights, IP and more contracts. The advice I got from other filmmakers is that the right distributor plays a big part in how your film will be released. To me it was important that the distributor was into the film and not just the trailer, and that their marketing and sales strategy made sense. The Beyond was never designed to be a theatrical film, so I wanted someone with a big reach in the VOD world through their brand, especially since The Beyond doesn’t have big-name actors in it.
What was the most challenging scene or scenes? Why and how did you overcome those challenges?
The Human 2.0 scenes were the most challenging because they had to look photoreal to fit the documentary narrative. We did first try to do it all in-camera using a practical suit, but it wasn’t achieving the look we wanted, the actors felt uncomfortable in it, and doing it properly practically would have cost a fortune. So we went with a fully digital solution for the Human 2.0 bodies: the actors wore tight grey suits with tracking markers, and we restricted our camera moves for simplicity so that object tracking would work as accurately as possible. We also shot reference footage from multiple angles to help with matchmoving. Having an on-set tracking supervisor helped massively and allowed us to make this happen within the budget, while looking and feeling real.
Our biggest issue came with the very tiny movements our actress made when breathing in close-up shots. Because our Human 2.0 was a human consciousness in a synthetic shell, breathing didn’t make sense, so we tried to compensate by freezing the image or stabilizing it, which proved nearly impossible for the very close-up shots.
In the end, I had to think outside the box, so I wrote a few lines into the script that explained that the Human 2.0 was breathing to make it psychologically more acceptable to other humans. Those two lines saved us weeks and possibly months of time.
With a VFX movie you would expect us to use some form of greenscreen or bluescreen, but we didn’t. In fact, the only stage used was for the “white room” astronaut scene, which was shot over at Asylum FX in London. We had an actor wearing an astronaut suit in a bright photography room, using brightly exposed lighting to give a surreal feeling, and we used VFX to augment it.
As a writer and a director, how was it seeing your vision through from start to finish?
It didn’t really hit me until I watched the press screening of it at the Dolby Vision office in Soho. It had the fully mixed sound and the completed grade. I remember looking across at my DP and other team members thinking, “Whoa! It looks and feels like a feature film, and we did that in a year!”
You edited the film yourself?
Yes, I was the editor on the film! I shoot for the edit. I started off using Adobe Premiere CC for the early offline and then quickly moved over to Resolve 14, where I did the majority of the editing. It was great because I was doing a lot of online editorial tasks like stabilizing, basic VFX, pan and scans, as well as establishing temp looks while editing. So in a way there was no offline and online editorial, as it was all part of one workflow. We did all our deliverables out of Resolve 14, too.
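(A closing note for the technically inclined: recent versions of Resolve ship with a Python scripting API that can automate exactly this kind of single-application finish. The sketch below assumes that modern API, not the Resolve 14 beta used on the film, and simply queues and starts a deliverable render for the currently open project; the output path and file name are hypothetical.)

```python
# Minimal sketch using the Python scripting API bundled with recent
# versions of DaVinci Resolve: queue and start a deliverable render
# for the currently open project. The target directory is hypothetical.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()

# Point the render at a delivery folder and name the output file.
project.SetRenderSettings({
    "TargetDir": "/deliverables/the_beyond",
    "CustomName": "TheBeyond_master",
})

project.AddRenderJob()     # queue the current timeline with these settings
project.StartRendering()   # kick off the render queue
```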