By Beth Marchant
Irving Harvey founder Samuel Gursky and Final Pixel CEO Michael McKenna know that the way their facilities store, manage and share assets across local, shared and cloud networks is changing before their eyes. For Gursky, whose small post facility will soon move to a new location, this means building robust and flexible databases to track assets. For Final Pixel, a new virtual production company, it’s a more complex equation involving photoreal LED projections built from distributed assets and running live on set.
Here, we look at how evolving shared storage workflows help keep their filmmaking operations running smoothly.
Irving Harvey
These days, it’s common for data wranglers, especially those native to digital workflows, to move easily from creating looks live on set to finishing in the grading suite. That’s the path Irving Harvey founder Samuel Gursky found himself on nine years ago. “I was doing mostly data management on set and was working closely with a lot of DPs, especially those using Red cameras,” he says. “I was setting looks and even owned some Red transcoding equipment, some of the first Sonnet Thunderbolt expansion chassis. I started to do a little bit of post color, eventually got a studio and started working with my business partner, Matt Greenberg.”
With a majority of independent film clients, the Tribeca-based post facility also grades, finishes and consults on a wide variety of shorter-form content, including commercials, music videos and fashion shoots. “Often it’s a mixture of grading and consulting, and sometimes it’s just advice,” says Gursky, who works from one of the facility’s three client-ready edit and color suites. “But in general, we find that what we do best is fill in those gaps in people’s workflows. These can be project-specific or the result of a system we’ve set up in house. We’ll set up systems for working safely and securely, but also at a reasonable pace on-site.”
Gursky says the heart of any workflow begins with a robust database to track and share information pertinent to the process, especially the network-attached storage (NAS) or data management infrastructure at any facility’s core. He routinely builds these workflow databases for all of the services Irving Harvey offers, but he also creates handy user interface tools and how-to guides that help his client filmmakers make efficient use of all that data.
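As a rough illustration of what such a tracking database can look like (a minimal sketch, not Irving Harvey's actual system), the core idea is a searchable record of every asset, where it lives and what state it is in. The schema, paths and field names below are hypothetical:

```python
# Minimal sketch of a workflow/asset-tracking database using SQLite.
# All table and field names are hypothetical illustrations.
import sqlite3

conn = sqlite3.connect("workflow_assets.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS assets (
        id INTEGER PRIMARY KEY,
        project TEXT NOT NULL,        -- film or commercial title
        reel TEXT,                    -- source reel / card identifier
        file_path TEXT NOT NULL,      -- location on the NAS
        codec TEXT,                   -- camera or mezzanine codec
        status TEXT DEFAULT 'online', -- online / nearline / archived
        checksum TEXT                 -- verification hash from ingest
    )
""")

# Register a clip as it is ingested from set.
conn.execute(
    "INSERT INTO assets (project, reel, file_path, codec, checksum) VALUES (?, ?, ?, ?, ?)",
    ("Feature_A", "A001", "/mnt/nas/Feature_A/A001/clip_0001.mov", "ProRes 4444", "9f2c"),
)
conn.commit()

# A quick deliverables report: everything for a project that is still online.
for row in conn.execute(
    "SELECT file_path, codec, status FROM assets WHERE project = ? AND status = 'online'",
    ("Feature_A",),
):
    print(row)
```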
“The biggest challenge that I personally have found is a unification of accessible deliverables,” he says. “The end goal is to coordinate all of these various moving pieces into a robust, efficient, full package of deliverables that saves resources but also gives us a higher level of accessibility to its components throughout the life of a film.”
As cloud-driven data sharing and collaboration become more prevalent, Gursky says more and more clients are testing the waters. “The most recent thing that’s been popping up a lot has been people in different locations wanting to stay in sync for the edit — usually three people who all want to work on it from different places.”
The cloud, he adds, “has been something we’ve always been interested in because a lot of what we do is so driven by the infrastructure that we have on-site. We’ve always been interested in trying to figure out how to have a secondary work site in a way that is fast and secure and that doesn’t compromise our ability to keep everything organized the way that we would on-site.”
The innovation that grew out of the pandemic has also helped move the needle, he says. “We’ve been researching it and are now helping people use available remote services just out of necessity,” he says. “It’s a real advantage to actually have object-based storage that people can work from. As long as you’re following a very short list of criteria, you can really make it work for you within a relatively low budget. It’s actually come further than I ever expected it would, and considering what it used to cost to rent rooms and all that went with it, it has solved a bunch of problems at once.”
Irving Harvey’s in-house setup, which lets multiple clients pull from high-quality online media at the highest resolution, is “still a ways away” from a more cloud-centric design. Most jobs, he says, are “very location-driven, and the data logistics only add another layer to that. It’s something that I think, with time, will get even more flexible. Right now, there’s a premium on fast storage, but offline storage is very reasonable with cloud options now.” The studio uses LucidLink to manage projects across the NAS. “It’s what [Hedge’s] Postlab [collaborative editing software] uses,” says Gursky. “They have that whole project management on top of this storage, but there’s the slower tier that pulls from cheaper web storage, like Wasabi, instead of Aspera.”
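The "slower tier" Gursky describes works because services like Wasabi expose an S3-compatible API, so offline media can be parked cheaply and pulled back when a remote editor needs it. The sketch below is only a hedged illustration of that idea, not Irving Harvey's setup; the bucket name, paths and credentials are hypothetical:

```python
# Hedged sketch: parking offline-resolution media on S3-compatible object
# storage (Wasabi) with boto3. Bucket, paths and credentials are hypothetical.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.wasabisys.com",   # Wasabi's S3-compatible endpoint
    aws_access_key_id="WASABI_ACCESS_KEY",
    aws_secret_access_key="WASABI_SECRET_KEY",
)

# Push an offline proxy up to the cheap tier...
s3.upload_file(
    "/mnt/nas/Feature_A/proxies/clip_0001_proxy.mov",
    "offline-media-bucket",                     # hypothetical bucket
    "Feature_A/proxies/clip_0001_proxy.mov",
)

# ...and pull it back down when a remote editor needs it locally.
s3.download_file(
    "offline-media-bucket",
    "Feature_A/proxies/clip_0001_proxy.mov",
    "/tmp/clip_0001_proxy.mov",
)
```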
For offline jobs, he says, the once common practice of having multiple clients come to Irving Harvey and work from its servers “is definitely getting replaced by the cloud. The finishing, though, is still mostly local, or we’re dealing with file transfers for people to work locally on their systems elsewhere. We’re using the cloud more, but not necessarily shifting to it as a whole just yet.”
Security is a growing concern for clients as more sensitive data moves online. Gursky has recently added a systems admin familiar with TrueNAS open-source storage, which they now run on all of the facility’s servers, to help the facility optimize every layer of connected data. “This admin is really cybersecurity-focused, and that’s helped us not just figure out how to prevent theft and back up in case of failure but create a plan for how we will protect and access the media moving forward,” Gursky says. “As we’ve grown and started to work on larger-scale entertainment projects, we have been gearing our setup more toward getting Trusted Partner Network-certified,” he adds. “As part of that, we’ve had to really reassess a lot of our physical and digital security.” It’s been a long process, he says, all part of preparing for a new facility they’ll move into by the end of the year.
The pandemic also forced Gursky and his team to become comfortable “changing our plans on a dime,” he says. “Through it all we’ve kept it all continuously flowing, and I’ve been really happy and proud of that. It’s a nice reminder of how things have changed in a positive way.”
Final Pixel
Data management during even the most straightforward of shoots and grading processes is tough enough. Add the LED volumes of virtual production to the equation, and you have a whole new set of potential pain points to worry about.
Michael McKenna, CEO of 1-year-old virtual production studio Final Pixel, remembers the exact moment he and his brother Chris McKenna, a creative agency director, decided to open their own virtual production studio. “We were both obsessed with The Mandalorian, and during a family Zoom call last year, I said, ‘Watch this LED wall virtual production. Isn’t it fascinating?’ We started chatting and thought, ‘This is crazy. We should look into this.’”
Chris McKenna and Monica Hinden, Michael McKenna’s business partners, have been running the LA-based creative agency Wee Beastie for 15 years. “Chris is a director, Monica an EP. They’d had shows canceled during the pandemic and thought that virtual productions were a way around that,” he adds. “It really grabbed me because of the way it completely changes the filmmaking process.
“For starters, being able to have live-action VFX, where everything is in-camera again and not all done in post, is an amazing advantage,” continues Michael McKenna. “Add to that the lighting benefits for a DP, who can now match lighting to a scene on the day, and the environmental benefits, since you don’t have to fly everybody all over creation anymore. You can shoot four locations in a single day! It’s a game changer. And it’s built on [Epic Games] Unreal Engine. As someone who grew up gaming, it just made sense to me. It immediately clicked for both of us how we could use this new technology to create a viable business model for clients.”
Since opening up shop, Final Pixel has ramped up with a slate of projects for clients, including ABC, BBC, Hulu, Discovery+, Disney and Marvel Studios. VFX supervisor Steve Hubbard, who was part of Life of Pi’s Oscar-winning VFX team, is also on staff. “We’ve managed to recreate environments, not only like the surface of Mars or a period location, but environments that our clients actually want to shoot on: photoreal, familiar sets like a living room or a great room in a manor,” says Michael McKenna. “The potential for our clients to save money, time and effort, and also improve their creative scope, is huge. So we set off on a journey into photorealism and asked, ‘How can we create more and more real-world places or places that feel real?’”
These are not easy shoots to do, by any means, Michael McKenna says. “They definitely require a kind of discipline and, in terms of workflow, a hierarchy to how things are done. What we discovered quite quickly, not really coming from a game-development background, were the benefits of using something like source control. Obviously, we knew about it and had heard about it, but we had never really used it.”
After their first demo, the team spent most of its time sending footage around in zip files to get it into the hands of artists. “We’ve got artists all over the world,” he says. “They’re working in different time zones. We quickly realized that this is a horrendous way to manage content for a global company.”
As a global studio, flexibility and time are the operative words. Reducing administrative tasks is crucial to staying on schedule and on budget. Perforce, a version control system, is the backbone of the studio’s project distribution procedures. “Virtual production studios spend a large chunk of their lives in video game engines like Unreal, where projects are full of complex file structures and moving parts,” says McKenna. “They are, after all, essentially video games. Working on these kinds of projects across multiple teams in different locations simultaneously isn’t possible without version control like Perforce, which catalogs and tracks every change to every file made by every person. We have total control over our projects and the ability to roll back any individual version or change in seconds.”
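In practice, that sync-edit-submit cycle is what version control buys the distributed teams McKenna describes. The sketch below shows a typical round trip using P4Python, Perforce's Python API; the server address, user and workspace names are hypothetical stand-ins, not Final Pixel's configuration:

```python
# Hedged sketch of a sync / edit / submit cycle with P4Python.
# Server, user and workspace names are hypothetical.
from P4 import P4, P4Exception

p4 = P4()
p4.port = "ssl:perforce.example.com:1666"   # hypothetical depot address
p4.user = "fp_artist"
p4.client = "fp_artist_unreal_ws"           # the artist's workspace

try:
    p4.connect()
    p4.run_sync()                            # bring the Unreal project up to date
    p4.run_edit("//depot/ProjectX/Content/Env/Mars_Surface.uasset")
    # ... artist reworks the asset in Unreal ...
    p4.run_submit("-d", "Rework Mars surface lighting")
except P4Exception:
    for err in p4.errors:
        print(err)
finally:
    p4.disconnect()
```

Because every change is cataloged per file and per user, rolling back a bad version is a single command rather than a hunt through zip files.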
The studio’s Perforce servers are hosted and managed by Assembla on AWS hardware. “This caters to our backup and security concerns,” Michael McKenna says, “and handles a lot of the important but time-consuming — and, some may say, mundane — maintenance tasks, like allocating disk space and spinning up new servers. For security and some functional reasons, we keep each project in a separate Perforce server,” meaning the data for each production is kept completely discrete. “Assembla makes it easy for us to do that via a web interface, which anyone can use from any location without any knowledge of Perforce administration, which is itself a complex area.”
On set, Michael McKenna says, extra measures are needed to keep Perforce running smoothly. “Internet connectivity is something of a wild card on virtual production sets and oftentimes out of our control,” he says. “With a cloud-based Perforce server, a threat to internet connectivity has the potential to cause disaster. As a fallback solution, we bring with us on set a QNAP NAS box with a fully functional mirrored Perforce server. This ‘local’ server essentially duplicates our cloud server, staying in sync with it constantly. In the event of internet connectivity loss or the unlikely failure of AWS/Assembla, our on-set workstations are configured to seamlessly switch to the local NAS server as a fail-safe without missing a beat.”
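The failover behavior McKenna describes can be pictured as a simple connection preference: try the cloud server first, drop to the on-set mirror if it does not answer. The sketch below is a hypothetical illustration of that logic, again with P4Python; the server addresses and user name are placeholders, not Final Pixel's actual configuration:

```python
# Hedged sketch of the on-set fallback: prefer the cloud-hosted Perforce
# server, fall back to the mirrored server on the local NAS if unreachable.
# Addresses and user name are hypothetical.
from P4 import P4, P4Exception

CLOUD_PORT = "ssl:cloud-perforce.example.com:1666"   # hypothetical cloud server
LOCAL_PORT = "ssl:192.168.1.50:1666"                 # hypothetical on-set NAS mirror

def connect_with_fallback() -> P4:
    for port in (CLOUD_PORT, LOCAL_PORT):
        p4 = P4()
        p4.port = port
        p4.user = "fp_onset"
        try:
            p4.connect()
            p4.run_info()        # cheap round trip to confirm the server answers
            print(f"Connected to Perforce at {port}")
            return p4
        except P4Exception:
            print(f"Could not reach {port}, trying next server...")
    raise RuntimeError("No Perforce server reachable on set")

p4 = connect_with_fallback()
p4.run_sync()
p4.disconnect()
```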
Every workstation used for editing images on set has a Perforce client installed on it. “Our on-set protocol means there’s a designated Perforce administrator on each production who’s responsible for installing and configuring each Perforce client and for carrying out any complex version control tasks,” he adds. Additional storage is provided by a Seagate Lyve Mobile Array, a ruggedized, high-capacity portable NAS drive with up to 90TB of storage. “We bring it on set as another safeguard against bandwidth limitations and transfer data by Ethernet and Thunderbolt. If we need to transport large amounts of footage off-site, we have the option to physically carry the data to a nearby data center, where it can be uploaded into the cloud and distributed.”
Amazon Web Services also hosts a Virtual Private Cloud (Amazon VPC) that Final Pixel uses for its main data storage and virtual workstations. “EC2 G4dn instances are powered by Nvidia hardware and provide comparable performance to local machines, so much so that users are often unaware they’re using remote virtual desktops at all,” he says. “And because these workstations are isolated in the cloud, with only encrypted pixel-streaming tunnels using the PCoIP protocol between the remote machines and our artists, they offer excellent security for our clients.”
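Provisioning one of those GPU-backed virtual workstations is a standard EC2 call. The snippet below is a hedged sketch using boto3, AWS's Python SDK; the AMI, key pair, subnet and security group IDs are hypothetical placeholders rather than Final Pixel's actual resources:

```python
# Hedged sketch: launching a G4dn GPU workstation inside a VPC with boto3.
# AMI, key pair, subnet and security group IDs are hypothetical.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",            # hypothetical workstation image
    InstanceType="g4dn.xlarge",                 # Nvidia T4 GPU instance class
    MinCount=1,
    MaxCount=1,
    KeyName="artist-workstation-key",           # hypothetical key pair
    SubnetId="subnet-0123456789abcdef0",        # private subnet inside the VPC
    SecurityGroupIds=["sg-0123456789abcdef0"],  # restricts access to the PCoIP tunnel
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "Role", "Value": "virtual-workstation"}],
    }],
)

print(response["Instances"][0]["InstanceId"])
```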
S3 gives them “low-cost, secure, performant storage that we can scale to meet [our] needs,” he says. “The same pool of storage is made available between our virtual workstations and, where necessary, securely to our physical workstations.” S3 also hosts the facility’s digital asset management system. “The DAM is a customized application that leverages the Autodesk ShotGrid API, a Python-based VFX pipeline tool. It catalogs and inventories all of our digital assets and organizes the metadata so artists and producers can easily search our back catalog of virtual assets. This is essential for the efficient functioning of the virtual art department, particularly as our back catalog of assets grows. It also saves us the time and money of producing redundant or duplicated assets.”
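The kind of catalog search McKenna describes maps directly onto the ShotGrid Python API. The sketch below is a hedged illustration of such a query, not Final Pixel's DAM code; the site URL, script credentials, field names and tag values are hypothetical:

```python
# Hedged sketch of a back-catalog query against a ShotGrid-backed DAM,
# using shotgun_api3 (the ShotGrid Python API). Site URL, credentials,
# field names and tag values are hypothetical.
import shotgun_api3

sg = shotgun_api3.Shotgun(
    "https://studio.shotgrid.autodesk.com",   # hypothetical site URL
    script_name="dam_catalog",
    api_key="SCRIPT_API_KEY",
)

# Find reusable environment assets tagged "photoreal" so the virtual art
# department can check the catalog before building something from scratch.
assets = sg.find(
    "Asset",
    filters=[
        ["sg_asset_type", "is", "Environment"],
        ["tags.Tag.name", "is", "photoreal"],
    ],
    fields=["code", "description", "sg_asset_type"],
)

for asset in assets:
    print(asset["code"], "-", asset.get("description"))
```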
The end goal, Michael McKenna says, is to leave the brain-numbing admin activity to the DAM so that Final Pixel’s artists can gather assets within their favorite software applications in a form they already understand. “We need to just let them do what they do best: create sumptuous 3D environments.”
Beth Marchant, a former staff and contributing editor of StudioDaily.com, writes about entertainment technology and craft for The Los Angeles Times and IndieWire. Follow her on Twitter @bethmarchant.