
Running Man and IDC: Post Storage

By Ben Mehlman

Whether you’re doing post on a comedy special for Netflix or just trying to text a friend, the last message you want to see pop up is one saying your storage is almost full. For a post facility in 2022, a solid storage setup is indispensable.

We sat down with post houses Running Man and IDC to discuss what their current and future storage solutions are.

Running Man

Troy Thompson

New York-based Running Man is a full-service post house known for its work for Netflix, HBO, FX, Amazon, Showtime, CBS and many others. Troy Thompson, who co-founded Running Man almost 14 years ago, takes pride in the company’s client-forward approach and carefully crafted facilities that help set it apart.

We reached out to him to talk storage and post.

Can you tell us about Running Man’s history?
We started in 2009 in the wake of the housing financial crisis when a company I was a partner at could no longer sustain itself. I’d been designing and building rooms since the mid ‘90s, so I wanted to take my knowledge and experience into a new space that was built from the ground up — one that focused on both documentary and scripted content. We outgrew our first space after five years and moved to our current home, which was double the size. Then we acquired the rest of the floor and doubled in size again.

Something that differentiates us from other companies is that we’ve always been a small company at heart and appreciate the hands-on attention that allows us to give. This feels extra important in this current environment where everything’s consolidating and getting bigger and bigger. We’re owner/operators, so we recognize that we spend more time here than at home. That’s why we put a lot of care into our rooms and environment. We want the space to feel as comfortable as possible.

Can you tell us about your setup?
All the rooms are connected to shared storage, have 55- to 65-inch client monitors and support both Adobe Premiere and Avid Media Composer workflows. During the pandemic we saw an opportunity to turn over the hardware in the edit rooms, replacing the 2013 Mac Pros with the new Mac Studios. They're phenomenal, despite some initial minor hiccups with the M1 architecture. Overall, it's been a huge performance upgrade. Beyond that, it's full setups, client couches and sit/stand desks. All in all, we try to give the editors whatever they need to have a comfortable and creative space.

How are you supporting remote work?
If the editor is working on a Mac, we mostly use Jump. With Avid we prefer to run Teradici, but that means we’d really need to be on a PC host. We’re anxious for in-person to return given that we have real estate that is meant to support in-person work and would like to see it used again. We’ve had a few jobs come back, and it feels a lot more creative and collaborative than what we’ve experienced on the remote side of things.

We’ve used Frame.io as a review tool, and that’s great. One of the other things we’re looking at is instead of virtualizing the desktop experience on an editor’s computer we want to give them access to the shared volume they’re working on. We’ve seen advances from Avid, Facilis and LucidLink and are exploring all of those as approaches to remote workflow.

What storage systems do you use to support all of that?
We’ve been with Facilis for a long time. They support both our editorial and finishing systems. When we first moved in, we wired everything with 10Gb Ethernet to futureproof as much as possible, which means all editorial clients have 10Gb bandwidth to the storage. Our color systems are connected through 32Gb fiber due to the demands of working with uncompressed data. So in total we support four color rooms and 13 offline workstations on about 300TB of shared storage. The entire system is composed of a hub server and three 24D chassis.
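A rough back-of-envelope calculation (my own illustrative numbers, not from the interview) shows why editorial clients are fine on 10Gb Ethernet while the color rooms need 32Gb fiber: a single uncompressed UHD image sequence eats most of a 10Gb link on its own.

```python
# Back-of-envelope sketch (illustrative numbers, not from the article):
# why uncompressed color work needs more than a 10GbE link.

def dpx_stream_gbps(width, height, fps, bytes_per_pixel=4):
    """Data rate of an uncompressed image sequence in gigabits per second.
    10-bit RGB DPX packs three channels into 4 bytes per pixel."""
    bytes_per_frame = width * height * bytes_per_pixel
    return bytes_per_frame * fps * 8 / 1e9

uhd = dpx_stream_gbps(3840, 2160, 24)          # one UHD 24fps stream
print(f"one UHD DPX stream: {uhd:.1f} Gb/s")   # ~6.4 Gb/s
print(f"two streams: {2 * uhd:.1f} Gb/s")      # already past a 10GbE link
```

Compressed offline codecs sit far below this, which is why 13 offline workstations can share 10GbE while each color room gets its own fiber path.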

And what are you using for color?
We had been using Digital Vision’s Nucoda up until the pandemic, and would probably still be using it if COVID hadn’t happened. It’s a great system, but it isn’t as flexible as DaVinci Resolve when it comes to building out in different sizes and configurations. So now all our color work is done on DaVinci Resolve and all our mastering is done on ColorFront Transkoder.

What made you decide to go with this particular storage system?
Performance. At the time, we weren’t seeing the same level of performance from any other solution. Then, when it came time to put the color systems on shared storage, it made sense to start looking there as well. With Facilis, we could put in a hub server and connect everything through direct fiber without getting into a massive infrastructure build that was beyond the scale of what we were looking to put in.

What do you do for archival?
We try to keep everything on Facilis for as long as we can, which means we usually hold onto a project for about a month before we have to decide to remove it. This gives most shows enough time to get everything delivered and QC’d. Then we back up our internal master outputs of everything we work on to LTO-7. Some things get backed up as TIFF or DPX sequences, and then others get backed up as ProRes 4444 XQ files.

When on-site, are you only doing online edits?
A few shows have come in and done offline. Obviously, we had moved everyone off-site to keep the business running and keep people employed, but certain things just weren’t possible to do unless we were in the office. So our core staff has been back in the office for about two years. The first year, all editorial work was remote, and we did finishing work on-prem. But now we’re starting to see some things return.

Part of the downside of being remote was related to efficiency; we found that we were about 30% less efficient than when we’re in the office. A lot of that has to do with the amount of time it takes somebody to get back to you. You’re not one room over and able to walk to someone to figure something out in real time.

Within the last year, we’ve had a few things come back, which has presented other challenges. When you have everybody remote, you attack it one way, and when you have everybody in office, you attack it another. But a hybrid model is not quite as easy to pull off in what I consider to be a successful way. This is another reason why we’d like to see things move back into the office a bit more. It would make things more efficient and creative overall.

How has this evolution of in-person and remote work affected your storage needs and other issues like security?
It’s super-challenging. It’s obviously great that people can work from home, but it’s incredibly difficult to maintain the same level of security. Recently, we had something happen when we were testing Facilis’ ability to remotely mount our partitions onto a remote client. The test run on the editor’s system went amazingly well; we were able to do nine real-time streams in Media Composer with an editor who had a great internet connection. But then we couldn’t even play a timeline with the assistant editor due to that person’s internet speed.

So you’re not only trying to satisfy a multitude of setups in terms of their own hardware and internet, but when you put these pieces out into the world it gets challenging to maintain an efficient collaborative workspace. This was fine when we were in the pandemic, and this new reality was based on people’s safety, but it’s challenging to maintain when you compare remote work to in-office.

Part of why Running Man was born was to control the stakes and control how the room looked. I started as an editor and hated going into a facility and being disappointed by even the mouse they had. I wanted to elevate all those components, make them my own and make it as comfortable as possible. But getting back to your original question, it’s challenging when you’re trying to overcome internet connection speeds of remote users and having to make sure everything is still secure.

Have inflation or supply chain problems caused any storage issues? Practical or financial?
For sure. There are a few things going on. We’ve been trying to accommodate people’s budget restrictions, which I think might be coming from different areas. One is fear of inflation, as costs have been rising across the board. So things cost more to bring into the office, and yet there’s a restrictive push downward on budgets. I also think some of that is the extra costs productions had to incur to keep going through the pandemic. So it’s tough when you couple being asked to lower prices while our vendors are asking us to increase our spending on hardware.

You have to be smart about where you’re spending your money. We used Signiant Media Shuttle through the pandemic to keep us operational, and it was indispensable, so that’s something we’re not going to back away from. On the hardware side, we tried to make smart choices about what hardware we were willing to put in our staff’s houses, and that was challenging. Part of what’s been great about getting back into the office is that computers are finally back in stock, so we can bring them in and get the rooms set up again with the latest hardware available.

Thankfully, we made a fairly substantial investment in Facilis right before the pandemic, so we had that in place and weren’t limited by storage at all. But it did affect us more when we first got in the office last year. And while we’re still faced with higher costs, the supply chain stuff is thankfully starting to be less of an issue.

IDC

Marcy Gilbert

Bicoastal post house IDC started in New York City over 30 years ago, opening a Los Angeles office in 2019. CEO Marcy Gilbert runs things from New York while getting help in LA from COO Rosanna Marino, director of production engineering Mike Tosti and director of emerging formats and mastering Ryan Gladden.

The four of them spoke with us about their setup, tools and storage workflows, which they are currently reevaluating due to the need for more capacity. Here they talk about their varied services, data-intensive workflows and why having the right speed and capacity is so important.

Can you give us some history on IDC? 
Marcy Gilbert: I opened IDC over 30 years ago. We started as a duplication and standards conversion company sending physical tapes all around the world in PAL, SECAM and NTSC. Now we’re a full-fledged cloud-based post company specializing in audio post, editing, audio transcriptions, localization and media processing.

Ryan Gladden

How are your facilities in New York and LA set up?
Ryan Gladden: In LA, we have two media processing rooms that are used for day-to-day asset creation, including edit and audio conforms, PSE fixes and standards transcoding work. We’re using Premiere, Resolve, Clipster and Transkoder.

Another part of that media processing, even though it doesn’t have a room, is a Telestream Vantage system that we leverage for a lot of automated workflows. This reads file formats and evaluates audio and picture without requiring an operator to do anything. It takes work that could take someone a week and does it in a day, allowing us to stay tiny but mighty. This also allows our people more time to focus on the art. We want an operator to be able to focus on doing a PSE fix that looks amazing and not have to waste time making proxies or worrying if they have the timecode in the right spot.

We also have recording booths and two sound mixing rooms. One is a standard Avid Pro Tools bay that does the recording and the other is a full Atmos mixing room, all Dolby-certified.


We have a DI theater with a full Resolve setup, a 4K Barco projector and an X310 for doing HDR work for home video and Dolby Vision. We also have a master QC bay with an X310 for HDR work with full Dolby Vision tools to do metadata analysis. This allows us to make sure everything is properly set up and staying within the correct color space based on the client’s specifications.

Rosanna Marino: And that’s why having all that working storage is very important (laughs).

Gladden: Incredibly important. We’re working with very large files — TIFF sequences, UHD, ProRes. And because of our automation, we’ll get multiple seasons of assets that we have to process. All of that has to be online at the same time because it’s all being worked on at the same time.

What storage systems are you using to support these workflows? 
Mike Tosti: Currently, we have an OpenDrives all-flash system called Atlas, but it only has 107TB. We recently put together a 200TB TrueNAS just for nearline to bring things on and off. We’re constantly moving projects back and forth to free up space on the OpenDrives to continue our workflow. We’re at the point where we will need more.


Marino: We’ve been doing a lot of research and have taken a lot of time to thoroughly work with what we felt were the top storage companies. This includes the manufacturers, not just the reseller. We want to know and understand their systems so we can make the best decision for us — a decision that we’re at the tail end of making, which is why we can’t be too specific about our choice just yet.

Tosti: We have an amazing data management department that migrates the data back and forth and lets us continue to work, but we need to bring in more storage soon.

Marino: We’re looking to upgrade to almost a petabyte worth of storage.

What other factors, besides storage, are guiding this upgrade?
Gladden: A big factor is what data management tools are included with it or what data management tools we can layer on top of it to automate the process even more. We want to do the same thing with our data management that we’ve done with our file workflows.

For example, we’ve been doing a lot of disaster recovery backups into the cloud, but how can we fold that in better with the storage itself for additional checks and balances? We’re looking at a lot of different tools that will allow us to have life cycle policies on assets that are similar to what you would have in a cloud volume, where a file is recognized as not having been touched for a period of time and then is moved to a lower tier of storage. Then, if it sits there untouched for like 60 more days, it’s time for it to go to the cloud. We want all that to happen automatically. We’ll still have access to all of it, but it’s now on a deeper shelf, which opens up space on our higher-speed storage for what we’re currently focusing on.
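The tiering logic Gladden describes can be sketched as a simple policy function. This is a hypothetical illustration, not IDC's actual tooling; the 30-day first threshold is my assumption (only the "60 more days" figure comes from the interview).

```python
# Hypothetical sketch of an idle-time tiering policy (illustrative only;
# the 30-day nearline threshold is assumed, the +60 days comes from the text).
import time

NEARLINE_AFTER_DAYS = 30                # assumed: untouched this long -> nearline
CLOUD_AFTER_DAYS = NEARLINE_AFTER_DAYS + 60  # untouched 60 more days -> cloud

def pick_tier(last_access_ts, now=None):
    """Return the storage tier a file should live on, based on idle time."""
    now = now if now is not None else time.time()
    idle_days = (now - last_access_ts) / 86400
    if idle_days >= CLOUD_AFTER_DAYS:
        return "cloud"       # deep shelf: still accessible, just slower
    if idle_days >= NEARLINE_AFTER_DAYS:
        return "nearline"    # e.g. the TrueNAS tier
    return "online"          # high-speed working storage
```

In a real deployment this decision would run as a scheduled scan (or be handled by the storage vendor's own data management layer) rather than per-file calls like this.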

Marino: We’re very proactive on having clean storage. A lot of companies are too busy and don’t think about it. Then they have no idea what to do with all this content they have online and nearline because they haven’t taken the time to create solid retention policies. We have all these policies implemented and in place; we’re just looking to grow the scale.

Gladden: To be more direct, the main factors we’re using are speed, expandability, support, data management, current infrastructure and space, as well as cost. For speed, we’re looking at what drive technologies are being used, what protocols are supported by the systems, what has native client support for higher-throughput systems, how much CPU power is provided with the systems so data going through SMB can be transferred quickly, and whether the systems are profiled for the types of applications we’re using.

For expandability, how much extra space is there for additional drives? Can third-party file systems be integrated so any data management tools that come with it can be leveraged on those drives? Are they using off-the-shelf hardware? In a worst-case scenario, if the vendor disappears, can we take that hardware and rebuild it into something else?

For support, we’d like a vendor that has local support in LA and to make sure we understand what SLAs are being provided for that service.

For data management, besides the vision we already explained, what are the data management tools provided with the system? How do they work? How strong and flexible are they? How do they integrate with the cloud? What disaster recovery options are there?

For current infrastructure and space, we want to make sure we’re not getting something super-power-hungry that’s going to drive up our power requirements and cooling. Also, how much rack space will it be taking up?

Finally, costs. What are the upfront costs? Yearly costs for support? Cost for additional services and tools that might be required? We then break that down into a terabyte cost over three years.
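The cost breakdown Gladden describes reduces to simple arithmetic. This sketch uses made-up placeholder figures, not IDC's real numbers, just to show the three-year per-terabyte calculation.

```python
# Illustrative three-year cost-per-terabyte breakdown.
# All dollar figures below are made-up placeholders, not IDC's numbers.

def cost_per_tb_3yr(upfront, yearly_support, yearly_extras, capacity_tb):
    """Total cost of ownership over three years, normalized per terabyte."""
    total = upfront + 3 * (yearly_support + yearly_extras)
    return total / capacity_tb

# e.g. a hypothetical ~1 PB system:
per_tb = cost_per_tb_3yr(upfront=400_000, yearly_support=30_000,
                         yearly_extras=10_000, capacity_tb=1000)
print(f"${per_tb:.0f} per TB over three years")
```

Normalizing to a per-terabyte figure is what makes systems of different sizes and pricing models directly comparable.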

What are the connection speeds you have in New York and LA?
Tosti: We have a 10Gb circuit in LA, a 2.5Gb circuit in New York and a 100Gb Mellanox switch for our storage backbone. Then, depending on the system — whether it be Mac, Linux or Windows — we have either a 40Gb, 50Gb or 100Gb host bus adapter. For our Resolve, we have a 100Gb Mellanox card to get the ultimate performance for clients sitting in the room.

How did you handle workflows during the height of the pandemic? 
Tosti: Our artists worked from home using a VPN client accessing the equipment in the facility through Jump Desktop or RDP, but we’ve since phased out RDP. We were all very conscious about security. No one wanted any content leaks, so we didn’t allow any content to be brought to anyone’s house; they could only access it through secure connections to the facility.

Gladden: A lot of what we learned during the pandemic has allowed the two facilities to work like they’re physically next to each other. For example, New York is able to find talent in LA; they can come in here, and then we can use something like Source-Connect to link our two Pro Tools systems. This would allow a director and mixer to be in New York while our people in LA make a recording as a backup. It would also allow everyone to talk to each other like they’re in the same facility.

Have inflation and problems around the supply chain affected your search for new storage solutions?
Tosti: It’s playing a big part. We’re trying to get a Mellanox upgrade for a future expansion. The order was placed in April, and we’re still waiting for it. They said November, but I think it has slipped into December. In another example, we tried to order another KVM system, and they said it’ll be six months until we get it. So supply chain is a huge issue.

Marino: It was a big factor in looking for new storage. How long will it take for the implementation of the equipment? Would they have everything, including the cables? You don’t want to find yourself missing that one piece that prevents you from finalizing after making this huge investment.

It’s interesting, we were talking with one vendor about an expansion of our routers, and they offered to swap out ours with the one we needed without us having to pay for it. So in that sense, because of the pandemic, everybody has also really come together and helped support each other and give honest answers. We know that shifting delivery dates are not the vendors’ fault. It’s just the nature of the beast right now.

The good thing is that we order what we need in advance of really needing it, so we have the runway time already built in.


Ben Mehlman is a writer/director. His script Whittier was featured on the 2021 Annual Black List after being selected for the 2020 Black List Feature Lab, where he was mentored by Beau Willimon and Jack Thorne. 

