
S4 Studios and Original Syndicate: VP Workflows

By Alyssa Heater

The workflow for a virtual production is not one-size-fits-all; the scope of the project truly dictates how the technology can be used and applied. postPerspective met with two innovators from the in-camera VFX (ICVFX) world to learn more about workflows for various types of projects, helping to demystify the notion that this growing technology can only be used for big-budget blockbusters and series.

S4 Studios founder Geoffrey Kater provided insight into the workflow for music videos using the example of Daughtry’s latest video, Artificial, for which Kater developed the concept and art in addition to overseeing the shoot and delivery.

Original Syndicate principal Ben C. Roth provided an overview of his company’s virtual production work in the live events space, detailing the process of builds behind arena concerts spanning Aerosmith, Taylor Swift and Bad Bunny. He also discussed Original Syndicate’s work exploring the educational implications of this technology.

S4 Studios

I would love to learn how you got into the virtual production space and, ultimately, what led you to starting S4 Studios?
I studied transportation design at Art Center College of Design, then went straight into animation as a prop designer for several years. I started to learn VFX, then I founded S4 Studios in 1999, focusing on motion graphics and VFX for film, television, music videos, promos, trailers, and everything else. Currently, we mix VFX with live action for a complete virtual production pipeline.

It was around 2019 that I began playing around with Unreal Engine and thinking about it in terms of our pipeline. It was pretty new at that point and wasn’t quite ready for wide use within the film and TV industry. Only a few projects were really pulling it off – case in point: The Mandalorian. But because I was starting to see the results, I became excited about the prospect of real-time compositing, so you don’t have to have a compositing schedule in post if everything goes according to plan.

We already knew the VFX side; we just needed to understand the game engine side. We had an opportunity to build out a small virtual production stage with an LED company, and we collaborated with them for a year, working on multiple projects. We were in the middle of virtual production R&D when COVID happened, and all of a sudden, everyone started working virtually. Since nobody was coming into the office anymore, I scaled down our Wilshire facility and kept all my talent working virtually.

We needed a stage so that we could really start to build upon the research and work that we were doing in the virtual production space. I found a stage in Canoga Park with an LED wall, but there really wasn’t anyone pursuing virtual production full time. So I offered to help build out the technology as well as build a business around it. The stage owners agreed, and we worked out a deal where we’d go there on a daily basis and do R&D as well as work on any virtual productions that used the LED wall.

The clientele that we were attracting on our smaller wall couldn’t afford a $50,000-a-day virtual production stage but could handle $7,500 to $10,000 per day, which would translate into major cost savings in post. If we did it right, then they could be in and out and editing the next day. That’s huge for a small production, especially for commercials. I really started to position us as what I call a “virtual insert stage.” Virtual insert stages, historically, are small. They’re for shots of people talking in a living room or doing a smaller insert type of shot, not big, wide cinematics that a larger wall would be good for.

We started to craft our narrative around that, and it started to take off. If done right, this idea of being able to shoot multiple locations in a single day using a smaller set could get us way down the road and create a tremendous amount of value. Over the course of the year, we started doing music videos and commercials, then one of our music video clients came to us to help with the Daughtry Artificial music video. The director wanted to do something high-concept and high-end that would feel futuristic and apocalyptic. And we were prepared for that scenario, which we wouldn’t have been a year earlier because we didn’t have our workflow solidified.

The Daughtry video really helped accelerate the business because we were able to take the video to another level. And with that success, the owners of our Canoga Park stage decided to move to a bigger facility in Burbank, expand our model and build upon the success that we’d had. We are now building multiple stages that are bigger and offer more services.

Do you have an in-house virtual art department, or do you partner with another company?
Yes, S4 does it all in-house, from VFX to Unreal Engine work. For the Daughtry music video, I actually built the environment. My process is to sit down and model stuff out and set it all up using a few custom assets and marketplace assets. I was able to build out the environment, light it and get it approved by the director. Then we brought it into Unreal Engine, ingested it and set it up with nDisplay, which is the plugin that projects the content onto the wall properly. We also used the Vive Mars tracking system. Our crew was three people: me as the VP supervisor, our Unreal operator, and an audio/video engineer for additional video and music.

In addition to Unreal Engine development, we do visual effects and animation, and we’ve figured out the pipeline for that. When a client comes in and wants to do XYZ, we have a good understanding of how to create the environment, get them to weigh in on it and then prep it and get it up on the wall.

One of the biggest challenges that virtual production teams deal with is clients coming in thinking that the workflow is going to be like what they’re used to, and it’s not. On smaller productions, because budgets and timelines are tighter, they’re used to flying in and shooting immediately. We typically spend the prep day getting them comfortable with the virtual world, the movement of the camera and what the engine is displaying on the wall. DPs who have never shot in a virtual environment might think they are just going to point at the wall and it’s all going to work, but there are tweaks that we have to make so it feels right.

Second, there’s the integration of any foreground elements into the virtual world and the color balance between the two. Additionally, when I send out videos for review before we start shooting, it’s important to understand that what you are looking at on your computer is not what it’s going to look like on the wall. Clients must be prepared to come in and make adjustments, and we are here to help educate them on what adjustments can be made. For now, the most common thing with clients in the virtual production space is that there will always be a learning curve for them to see the challenges and benefits, but in the end, they are usually very happy with the results.

My only other thing to emphasize would be that virtual production is an extremely collaborative type of production. For the DP, the virtual production group is extremely important to the success of the production because the DP is relying on that team to deliver an image they can shoot. If they go out and shoot a park scene on-location, then it’s just them and the park. If we’re producing the park, then they are relying on us. I’ve had some really great relationships with DPs who have total respect for that. It has to be supercollaborative, or it’ll fail.

Tell us about the volume used for the Daughtry Artificial shoot — the cameras, lenses, and any key technology that you used?
On the LED wall side, we used 2.6mm Unilumin LED tiles. Everything is HDR, so the blacks are very dark and rich. An important thing to look for when you go to any stage is whether HDR is running on the walls because if not, you’ll never get the black levels to match your foreground. Our LED setup comprises one main wall that is about 30 feet wide by 13.25 feet tall. Additionally, we have two “fly-in” walls on steel deck so we can wheel them around.
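For a rough sense of scale, the wall’s pixel resolution falls out of its pixel pitch and physical size. Below is a back-of-the-envelope sketch in Python using the figures Kater quotes (2.6mm pitch, roughly 30 by 13.25 feet); real walls are assembled from fixed-size tiles, so actual resolution rounds to whole panels, and the exact numbers here are only illustrative.

    # Rough LED wall resolution from pixel pitch and physical size.
    # Figures (2.6mm pitch, ~30 ft x 13.25 ft) come from the interview;
    # rounding to whole tiles is ignored, so treat the output as approximate.

    FT_TO_MM = 304.8

    def wall_resolution(width_ft: float, height_ft: float, pitch_mm: float) -> tuple[int, int]:
        """Approximate pixel resolution of an LED wall."""
        width_px = round(width_ft * FT_TO_MM / pitch_mm)
        height_px = round(height_ft * FT_TO_MM / pitch_mm)
        return width_px, height_px

    if __name__ == "__main__":
        w, h = wall_resolution(30.0, 13.25, 2.6)
        print(f"~{w} x {h} px")  # roughly 3517 x 1553 pixels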

There was a set designer who outfitted the volume with props and put stone pillars in the corners of the volume to hide the LED wall seams. This allowed us not only to shoot straight on but also to pan around the band and catch the sides. Doing this on a smaller volume was a bit more challenging in regards to the types of shots, but because this was a music video and stylized, we were able to get away with a lot. The camera was an Alexa Mini on a Movi Pro. The key challenge with cameras on any LED wall is that you can get what looks almost like a rolling shutter or a banding effect because of the shutter mismatch with the refresh rate on the wall. Filmmakers often come in wanting to use their cameras and lenses, and then they encounter this banding effect, which really ruins the look, so they obviously get frustrated. That’s one of the things on which I provide guidance. Global shutter helps mitigate this problem.
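The banding Kater describes is, at its core, a timing mismatch between the camera’s exposure and the wall’s refresh. As a simplified illustration (genlock and on-set testing are what actually solve this), the sketch below checks whether an exposure spans a whole number of refresh cycles; a fractional remainder is where banding tends to appear.

    # Simplified shutter-vs-refresh check, for illustration only.
    # Rule of thumb: keep the exposure an integer multiple of the LED wall's
    # refresh period (and genlock camera and wall); this is not a substitute
    # for testing with the actual camera, lens and processor.

    def exposure_seconds(shutter_angle_deg: float, fps: float) -> float:
        return (shutter_angle_deg / 360.0) / fps

    def banding_risk(shutter_angle_deg: float, fps: float, wall_refresh_hz: float,
                     tolerance: float = 0.02) -> bool:
        """True if the exposure does not cover a whole number of refresh cycles."""
        cycles = exposure_seconds(shutter_angle_deg, fps) * wall_refresh_hz
        return abs(cycles - round(cycles)) > tolerance

    if __name__ == "__main__":
        # A 180-degree shutter at 24 fps is a 1/48s exposure.
        print(banding_risk(180, 24, 3840))  # False: exactly 80 refresh cycles
        print(banding_risk(172.8, 24, 60))  # True: 1.2 cycles, banding likely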

The tracking was the HTC Vive Mars CamTrack system. A tracker goes on the camera, and then sensors up in the grid track it. You have to be really careful not to obscure the sensors too much because if they lose sight of the tracker, then the tracking can get lost for a second. However, it worked really well, and we were pleased with the Mars performance. I also think with virtual production, the hand-held look always looks so amazing, especially on a music video. They’re just going totally nuts; there’s flashing lights and water on the stage for reflection and all kinds of stuff going on. We also had a fog machine. You must have a fog machine at every music video shoot.
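On the dropout behavior he mentions, one common software-side mitigation is simply to hold the last valid camera pose for the frames where the tracker is occluded. The sketch below is a generic illustration with hypothetical names; it is not the Vive Mars or Unreal Live Link API.

    # Hold-last-good-pose fallback for brief tracking dropouts (illustrative only).
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Pose:
        position: tuple[float, float, float]   # x, y, z in centimeters
        rotation: tuple[float, float, float]   # pitch, yaw, roll in degrees

    class PoseFilter:
        def __init__(self) -> None:
            self._last_good: Optional[Pose] = None

        def update(self, sample: Optional[Pose]) -> Optional[Pose]:
            """Return the incoming pose, or the last valid one if tracking dropped."""
            if sample is not None:
                self._last_good = sample
            return self._last_good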

We used Unreal Engine 5.1, Nvidia Quadro RTX 8000 cards synced across three PCs and Novastar 4K processors for the LED wall.

For this particular project, what happened after the shoot was over? What were the deliverables and who had to sign off on them? Did anything really have to happen in post?
We had planned for some set extensions. We shot the band on the LED wall, but we were distanced quite a bit from the wall. The LED wall served as the backdrop for the band, and we did a composite using After Effects to complete the look. We also did a full take with Chris [Daughtry] on bluescreen for alternate composites used in the video. We then did some additional effects, such as adding a cyborg face to Chris. To do this, we put dots on his face then tracked them in 3D and applied the render and composite, so pretty straightforward.

As far as signing off, that was a thing of beauty. At the shoot, the label and the producers were there, and they were all superexcited by what they were seeing live. I loved the efficiency of the process because the director and editor were editing the very next day with a close-to-final look before color correction. No long post-production composite cycle.

Can you share what’s on the horizon for S4 Studios?
Yes. We moved to a new location in Burbank, which allows us to be closer to other productions. The building is bigger, we have tons of parking and we have a 6,000-square-foot stage that can accommodate a large wall. We can now serve midlevel shows and small features that can stay on the stage for weeks, which we couldn’t do in our previous facility.

Original Syndicate

Tell us about Original Syndicate and the services you provide for virtual production.
We offer what we call “studio innovation” services – meaning that we provide technical expertise, 3D design, and integration services to tackle increasingly complex studio-related demands. As such, we often sit amongst AV integrators, host studios, and creative developers, focused on designing, building, commissioning (and sometimes operating) virtual production volumes and other integrated studio solutions.

At our core is a high-efficiency, real-time workflow, which informs nearly everything we do – from virtual production across live events and broadcast capabilities. After 26 years as an executive in the agency space, I increasingly saw technology creeping into our world, but at the time, we weren’t doing a great job of synthesizing it into our processes of designing and developing brand experiences.

In 2019, my partner, Steve Richards, and I saw a major opportunity to integrate our real-time, game engine-based workflow into our design development processes for architecture, live events, concerts and touring, and other types of experiential installations. It was important that we were able to realize efficiencies and increase speed to market by moving away from the typically linear industry process of design, render, present, take feedback, then repeat – redesign, re-render and so on as weeks go by and clients see diminishing value.

Our workflow generally starts by creating a detailed and highly accurate 3D space, then we’ll often bring that space into Unreal Engine for real-time collaboration with decision-makers. This pulls everyone into a common, hyperreal workspace where clients, designers, and other stakeholders can make informed decisions in the moment across broad areas of the experience, as well as into the finest of technical, engineering, and logistical details. Once that collaboration is completed, we’re done – there’s no other delay or gratuitous back and forth required.

We then use those files as to-scale reality-based designs for permitting, design development documents, construction documents, equipment specs and performance details, bills of materials, etc. This helps expedite the entire project timeline anywhere in the world by streamlining subsequent workflows, reducing onsite time, and managing all sizes of projects, whether a “green field” buildout, or an incremental integration and installation project.

In 2019, we piloted this process for a huge project in China for Bloomberg, which helped save so much time and development cost that we were able to translate over 20% of the original budget into increased material value for the client. We were even able to develop sophisticated signage systems that integrated data from the Bloomberg Terminal, which was incredible. They were just blown away. We were also able to use the pre-vis process to engage the executives. They put on VR goggles, we walked them through the virtual space to see every aspect of the design in real-time, then they provided their feedback for immediate augmentation.

From that success, we were eager to expand Original Syndicate’s services more robustly across live events and broadcast. When COVID happened, our live event business was obviously shelved, so we pushed heavily into remote broadcast solutions and virtual events. At that time, we had been working with Disguise and Marcus Bengtsson, who is now our CTO and head of virtual production, and were looking at ways Disguise could be used in China. We were discussing the next wave of virtual production as it became the sort of gold rush that everyone now talks about.

Equipment companies were building stages left and right, but fast-forward, and a lot of productions had bad experiences on them. They didn’t understand the technology, and without the experience of getting out of the post-production mindset and pushing that work into pre-production, they didn’t know where the critical pitfalls were going to be. We were fortunate to have started translating some of our event work into this virtual production-based workflow. We also had a history of working in design for live touring events for AC/DC, Taylor Swift, and Aerosmith, which gave us the credibility to have authoritative conversations around how to do things better.

Elaborate on your work in the live event space. How do you approach a live event from a virtual production workflow perspective?

Our head of design, Chris Nyfield, has just finished working on Romeo Santos’ set design and Aerosmith’s Peace Out Tour design, and is now working with New Kids on the Block, Travis Shirley, and Bad Bunny. By using the real-time workflow, we can pre-visualize stage and production designs in a common, interactive 3D space, where we can show the talent how the space will look and how it’s built and functions, with incredible real-world accuracy. We add virtual characters so they can see how they would look and move, and where lighting will hit.

We also built and launched NYC’s largest virtual production volume at Pier 59 Studios this past year and have started to move more steadily into building virtual production stages, helping clients get to a point of stability quicker and avoid unnecessary risks. We work closely with the equipment providers, architects, and power and HVAC parties to optimize their equipment and performance requirements based on each space.

Chris also led design on the Tupac experience Wake Me When I’m Free, which implemented the live events workflow to handle the enormous pressure on the build aspect of it. You have to take into consideration what is being built across practical elements, integrated technologies, dynamic artwork, and even the venue constraints — is it an existing venue, new construction, temporary? When you think about the value of using a real-time 3D workflow in a live event, it’s bringing all those things together in a real-world setting that helps anticipate and reveal the good, bad, and ugly early in the process – allowing for rapid collaboration and solution-making. While it’s not architectural software, everything from the LED walls, the light fixtures, 2x4s and steel is built as an actual element to scale in the virtual space. We can assess what it takes to build everything, its power requirements, how something is rigged, all the way through the front-end audience experience and really get an idea of what it’s going to take — the equipment list, the timeframe, and the cost.

We’re doing this not just as a design and visual exercise, but as a full 360-degree build-out assessment. And for a project like Aerosmith, we’re building massive flying LED walls and lighting grids, stage design, deck design, back of house, and everything in between. There are a lot of moving parts that must come together, but once we turn it on in the virtual space and we’re adjusting lighting and motion tracking and we’re running content through the LED, you can zoom in, zoom out, and start to see the full synchronization.

I’ve heard that many people have this misconception that virtual production can only be used on big-budget projects, when it can actually be really beneficial for smaller projects as well.
Yes. We’re trying to increase understanding and expertise in the space by engaging with the academic side of virtual production – the industry needs it. Virtual production is intimidating no matter how big or small the project is because it comprises a complex array of individual technologies that have to come together seamlessly and work flawlessly. It’s using existing technologies in new ways. It can become very confusing if you don’t already have a solid understanding of what you’re trying to achieve, which is essentially to shoot higher-quality material in a more efficient way.

I think Original Syndicate’s niche is that we operate in an end-to-end, white-glove way. No matter how far upstream you are creatively and conceptually, we can help. Our people have come in and helped correct issues on major productions because we have comprehensive relationships and expertise with the LED providers, the rendering technologies and the tracking providers, and we can help simplify the whole process. We focus on performance objectives, reducing variables and trying to help everyone understand what to do and what not to do. Then as you progress into production, we ensure that you have the right people, with the right expertise, in the right seats. We’re along for the entire process, through to when it gets handed off to the stage operating team and ultimately the post-production team. We come in and make sure people have a great experience.

What we’ve been doing for the past year and a half is working with universities, studios, and other groups to help them build out and optimize their stages. We have offices in Australia, in addition to LA and New York, and we are working with Screen Victoria and Deakin University to develop their curriculum and to augment their virtual production capabilities in Melbourne. Coming from the studio innovation space, we are helping to demystify people’s perception of virtual production, getting them to a point of stability, and putting a clear action plan in place for them to realize and act on their opportunities.

We are determined to galvanize virtual production implementation and make it more accessible for all kinds of productions. We all know that Marvel/Disney, Amazon, and Netflix are shooting in virtual production volumes, but not everyone has the funds or the need to shoot at that scale. They don’t need a 25,000-square-foot volume to shoot in for months on end for a new series; they just need the efficiency and flexibility to be anywhere, any place, at any time of day that a virtual production stage can offer.

What do you feel needs to happen for virtual production to be more widely adopted as a tool for features, television and beyond?
First, we have to help producers have positive experiences using VP volumes by better understanding what virtual production can and cannot do – and how to eliminate surprises when working in the medium. There is still a fear factor around virtual production that we hope to dispel by providing rock-solid systems architecture, stable operations, and high-quality output.

Secondly, virtual production technologies, as they are today, are still in a development phase. Much of the work done to date has been the result of really smart technical people understanding how to pull off incredible in-camera visual experiences using existing and customized tech. But as the virtual production discipline, technology, and processes mature, there’s no doubt in our minds that adoption will increase for all sorts of content creation needs across television, film, social, and corporate productions.

And one other consideration is the requisite shift from post-production mentality – as in “fix it in post” – to a “plan it in pre” mindset. Since virtual production intends to blend VFX into your productions in-camera in real-time, it’s important to lock everything down in pre-production because once you come into the environment and you’re about to shoot, everything’s got to be ready to roll.

For now this can be a significant and disconcerting adjustment. But once that shift is made, the experiences tend to be pretty thrilling. Once you realize the latitude you have to create worlds and see your VFX in real-time, it becomes liberating. You realize that you’re operating at such a different speed with different types of controls at your fingertips and you’re seeing production in a new light. When The Mandalorian and Game of Thrones started, they thought they would just use this process for special effects or specialized scenes. Then they realized that the control was so massive that they could do much more.

Can you share anything about what is on the horizon for Original Syndicate?
We are working on a variety of virtual production and broadcast opportunities around the world and in every time zone across the U.S. – and we’re eager to support the industry’s growth however we can. We’re working with universities in North America and Australia and see huge opportunities for increased academic growth and authority in virtual production. And we’re supporting industry associations in areas of color and lighting, standardization, and education.

Our goal continues to be to help to ensure that any new and updated studio builds are thought through regarding technology and clearly defined performance objectives. For the mid-market brands and agencies, we want to create more democratized access to virtual production with our own Original Syndicate stages, where we can help them accelerate innovation, realize new efficiencies, and allow more great work to get done with higher quality results.

