
The-Artery embraces a VR workflow for Mercedes spots

The-Artery founder and director Vico Sharabani recently brought together an elite group of creative artists and skilled technologists to create a cross-continental VR production pipeline for Mercedes-Benz’s Masters tournament brand campaign called “What Makes Us.”

Emmy-nominated cinematographer Paul Cameron (Westworld) and VFX supervisor Rob Moggach co-directed the project, which features a series of six intense broadcast commercials — including two fully CGI spots that were “shot” in a completely virtual world.

The agency and The-Artery team, including Vico Sharabani (third from the right).

This pair of 30-second commercials, First and Can’t, are the first to be created using a novel, realtime collaborative VR software application called Nu Design with Atom View technology. While in Los Angeles, Cameron worked within a virtual world, choosing camera bodies and lenses inside the space that allowed him to “shoot” for POV and angles that would have taken weeks to complete in the real world.

The software enabled him to grab and move the camera while all artistic camera direction was recorded virtually and used for final renders. This allowed both Sharabani, who was in NYC, and Moggach, who was in Toronto, to interact live and in realtime as if they were standing together on a physical set.
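The idea of a recorded virtual take can be sketched in a few lines: each frame, the camera's transform and lens choice are stored as keyframes that a renderer replays for the final frames. This is a minimal illustration of the concept only; the class and field names below are hypothetical and are not the Nu Design or Atom View API.

```python
# Illustrative sketch of recording a virtual camera take for later rendering.
# All names here are invented for illustration, not an actual VR package API.
from dataclasses import dataclass


@dataclass
class CameraKey:
    frame: int
    position: tuple        # (x, y, z) in scene units
    rotation: tuple        # (pitch, yaw, roll) in degrees
    focal_length_mm: float  # the "lens" chosen inside the virtual space


class TakeRecorder:
    """Accumulates per-frame camera keys as the director moves the camera."""

    def __init__(self):
        self.keys = []

    def record(self, frame, position, rotation, focal_length_mm):
        self.keys.append(CameraKey(frame, position, rotation, focal_length_mm))

    def export(self):
        # Hand the ordered keyframes to the renderer's camera for final frames.
        return sorted(self.keys, key=lambda k: k.frame)


take = TakeRecorder()
take.record(1, (0.0, 1.6, -5.0), (0.0, 0.0, 0.0), 35.0)
take.record(2, (0.1, 1.6, -4.8), (0.0, 2.0, 0.0), 35.0)
keys = take.export()
```

Because the artistic direction is captured as data rather than rendered pixels, collaborators in other cities can replay or refine the same take.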

We reached out to Sharabani, Cameron and Moggach for details on VR workflow, and how they see the technology impacting production and creativity.

How did you come to know about Nurulize and the Nu Design Atom View technology?
Vico Sharabani: Scott Metzger, co-founder of Nurulize, is a long-time friend, colleague and collaborator. We have all been supporting each other’s careers and initiatives, so as soon as the alpha version of Nu Design was operational, we jumped on the opportunity of deploying it in real production.

How does the ability to shoot in VR change the production paradigm moving forward?
Rob Moggach: From scout to pre-light to shoot, through to dailies and editorial, it allows us to collaborate on digital productions in a traditional filmmaking process with established roles and procedures that are known to work.

Instead of locking animated productions into a rigid board, previs, animation workflow, a director can make decisions on editorial and find unexpected moments in the capture that wouldn’t necessarily be boarded and animated otherwise. Being able to do all of this without geographical restriction and still feel like you’re together in the same room is remarkable.

What types of projects are ideal for this new production pipeline?
Sharabani: The really beautiful thing for The-Artery, as a first-time user of this technology, is to prove that this workflow can be used by companies like ours on every project, and not only on films by Steven Spielberg and James Cameron. The obvious ideal fits are fully CGI productions; previs of big CGI environments that need to be considered in photography; virtual previs of scouted locations in remote or dangerous areas; blocking of digital sets on pre-existing greenscreen or partially built stages; and multiple remote creative teams that need to share a vision and input.

What are the specific benefits?
Moggach: With a virtual pipeline, we are able to…
1) Work much faster than traditional previs to quickly capture multiple camera setups.
2) Visualize environments and CGI with a camera in-hand to find shots you didn’t know were there on screen.
3) Interact closely regardless of location and truly feel together in the same place.
4) Use known filmmaking processes, allowing us to capitalize on established wisdom and experience.

What impact will it have on creativity?
Paul Cameron: For me, the VR workflow had a great impact on the overall creative approach for both commercials. It enabled me to go into the environment and literally grab a camera, move around the car, be in the middle of the car, pull the camera over the car. Basically, it allowed me to put the camera in places I always wanted to put the camera, where it would otherwise take hours to get cranes or scaffolding into position.

The other fascinating thing is that you are able to scale the set up and down. For instance, I was able to scale the car down to 25% of its normal size and make a very drastic camera move over the car, handheld with a VR camera, and with the combination of slowing it down and smoothing it out a bit, we were able to design camera moves that were very organic and very natural.
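The scale trick Cameron describes boils down to simple math: a move recorded around a quarter-scale set maps back to the full-size world by multiplying positions by four, and a pass of smoothing takes the handheld jitter out. The sketch below illustrates that idea with a moving-average filter; the function names are illustrative, not part of any actual VR tool.

```python
# Sketch of the scale-and-smooth idea: record a handheld camera path around a
# set scaled to 25%, rescale it to full-size units, then smooth the jitter.
# Function names are invented for illustration, not a real VR package API.

def rescale_path(path, set_scale):
    """Map positions recorded in a scaled-down set back to full-size units."""
    factor = 1.0 / set_scale
    return [(x * factor, y * factor, z * factor) for (x, y, z) in path]


def smooth_path(path, window=3):
    """Moving-average smoothing over a list of (x, y, z) positions."""
    half = window // 2
    smoothed = []
    for i in range(len(path)):
        lo, hi = max(0, i - half), min(len(path), i + half + 1)
        n = hi - lo
        smoothed.append(
            tuple(sum(p[axis] for p in path[lo:hi]) / n for axis in range(3))
        )
    return smoothed


# A handheld move captured around a set scaled to 25% of real size:
recorded = [(0.0, 0.4, 0.0), (0.1, 0.5, 0.2), (0.3, 0.45, 0.5)]
full_size = smooth_path(rescale_path(recorded, set_scale=0.25))
```

Slowing the replay down on top of this is what makes a quick handheld gesture read as a long, organic crane-style move at full scale.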

I think it also allowed me to achieve a greater understanding of the set size and space, the geometry of the set and the relationship of the car to the set. In the past, it would be a process of going through a wireframe, waiting for the rendering — in this case, the car — and programming camera moves. It basically helps with conceptualization of camera moves and shot design in a new way for me.

Also being a director of photography, it is very empowering to be able to grab the camera literally with a controller and move through that space. Again, it just takes a matter of seconds to make very dramatic camera moves, whereas even on set it could take upwards of an hour or two to move a technocrane and actually get a feel for that shot, so it is very empowering overall.

What does it now allow directors to achieve?
Cameron: One of the better features of the VR workflow is that you can actually just teleport yourself around the set while you are inside of it. So, basically, you picture yourself inside this set, and with a controller in each hand, you have the ability to teleport yourself to different perspectives. In this case, you see the automobile and the wireframe geometry of the set, so it gives you a very good idea of the perspectives from different angles, and you can move around really quickly.

The other thing that I found fascinating was that not only can you move around this set, in this case, I was able to fly… upwards of about 150 feet and look down on the set. This was, while you are immersed in the VR world, quite intoxicating. You are literally flying and hovering above the set, and it kind of feels like you are standing on a beam with no room to move forward or backward without falling.

Paul Cameron

So the ability to move around in an endless set perspective-wise and teleport yourself around and above the set looking down, was amazing. In the case of the Can’t commercial, I was able to teleport on the other side of the wind turbine and look back at the automobile.

Although we had the 3D CADs of sets in the past, and we were able to travel around and look at camera positions, somehow the immediacy and the power of being in the VR environment with the two controllers was quite powerful. I think for one of the sessions I had the glasses on for almost four hours straight. We recorded multiple camera moves, and everybody was quite shocked that I was in the environment for that long. But for me, it was like being on a set, almost like a pre-pre-light or something, where I was able to have my space as a director and move around and get to see my angles and design my shots.

What other tools did you use?
Sharabani: Houdini for CG, Redshift (with support of GridMarkets) for rendering, Nuke for compositing, Flame for finishing, Resolve for color grading and Premiere for editing.

