By Oliver Peters
Post houses that specialize in national brand advertising face unique challenges when it comes to operations, facility layout and staffing. We caught up with two successful Midwestern editorial facilities to see how they’ve adapted to modern workflows.
We reached out to Utopic co-founder Jan Maitland and Drive Thru president/head of editorial/VFX supervisor Bob George, EP Beth Wilson and Flame artist Aaron Esterling to talk media workflows for spots. Let’s find out more.
Utopic
Utopic is located in a 12,500-square-foot facility in the heart of Chicago and just a short walk away from the city’s ad agencies. They offer a full complement of services — creative editorial through finishing (which covers the traditional online scenario), color grading, audio mixing, sound design and occasionally music. The facility uses two distinct server rooms and two storage systems.
Do you differentiate storage between creative editorial and finishing?
Jan Maitland: It has been separate, but I hope to consolidate everyone onto one system next year.
What hardware are you talking about?
We had our editors and artists build their own custom PCs years ago to transition them away from Apple’s macOS to Windows. Add-in cards and breakout boxes just worked better there than in the Apple ecosystem. Everyone was super-resistant, so I sat down with each person individually, and we built their machines together so they could understand what goes into them.
We use Ardis Technologies Dynamic Drive Pool storage for our creative editorial side. Ardis is a Dutch company that only builds mass storage solutions for broadcast and post. Our finishing team uses Facilis Technology TerraBlock. Unfortunately, that has reached end of life. That’s a shame because our Facilis has been an absolute trooper, running nonstop for almost a decade.
What post software are you running?
Tim [Kloehn, our co-founder] and I originally started on Avid Media Composer but shifted to Apple Final Cut Pro 7 when we founded Utopic. As a business, it freed us from all the infrastructure that Avid systems required. Many thought we were nuts to do that, but that approach allowed us to move to Adobe Premiere Pro, which was and is like a more robust version of FCP7. So we’re exclusively Premiere Pro for editorial. On the finishing side, it’s DaVinci Resolve and occasionally Nuke. Maxon Cinema 4D and Adobe After Effects are the usual suspects for 2D graphics and 3D animation.
What storage technology is of interest to you moving forward?
Ardis has a couple of new products that I’ve been briefed on. I’m also looking at Avid’s Edit On Demand solutions. They are offering some compelling concepts for remote workflows and remote storage. I’m super-interested in off-site storage, such as the cloud or anything else that lessens our engineering burden. These aren’t quite right for us yet, but we are looking.
How are you operating today? On-prem, hybrid, remote?
We are in a full-hybrid mode. After two years of remote work, everyone is into it now. Hybrid seems to satisfy everyone while also keeping them safe. At first, remote work was a tremendous learning curve under a lot of pressure. Originally, everyone took workstations home and used local storage.
Then we ended up going with Amulet Hotkey, which is a two-part solution. There’s a Teradici PCIe card for PC-over-IP in the workstation at the office, and a dedicated receiver box sits in each person’s home — that’s the Amulet Hotkey. This creates a peer-to-peer connection over an ISP from an individual’s home to their machine in the office. They’re able to use the shared storage at work, yet it keeps our clients’ assets safe behind our firewall. Editors, artists and producers have full access to everything on the system.
Have you found any current solutions that work for you to put all the storage in the cloud?
I’m really struggling with the costs and logistics. We would need around 300TB of storage to put everything in the cloud. As a business, I can’t make that happen today. I’ve considered things like a proxy workstation, where all assets get sent to one location, and an individual generates and uploads only proxies. It introduces maybe a three- or four-hour delay from arrival to upload. We have a 5G line at the office, which is fine for remote work. However, it would be challenged if I had a couple of dozen people hanging off of that, plus uploading 4K or 8K files constantly.
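To picture what that proxy pass might look like, here is a minimal sketch in Python, assuming ffmpeg is installed; the ingest and proxy paths are made up, and the upload step is left as a stub since the actual transport would depend on the office connection Maitland describes.

```python
# Hypothetical proxy pass: transcode camera originals to lightweight
# 1080p H.264 proxies, then hand them to an uploader. Paths, codec
# settings and the upload step are illustrative assumptions only.
import subprocess
from pathlib import Path

SOURCE_DIR = Path("/mnt/ingest/camera_originals")   # assumed drop-off location
PROXY_DIR = Path("/mnt/ingest/proxies")             # assumed proxy output location

def make_proxy(src: Path, dst: Path) -> None:
    """Transcode one clip to a 1080p H.264 proxy with ffmpeg."""
    dst.parent.mkdir(parents=True, exist_ok=True)
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(src),
         "-vf", "scale=-2:1080",          # keep aspect ratio, 1080 lines
         "-c:v", "libx264", "-crf", "23", "-preset", "fast",
         "-c:a", "aac", str(dst)],
        check=True,
    )

if __name__ == "__main__":
    for clip in sorted(SOURCE_DIR.rglob("*.mov")):
        proxy = PROXY_DIR / clip.relative_to(SOURCE_DIR).with_suffix(".mp4")
        make_proxy(clip, proxy)
        # Upload step intentionally left as a stub -- in practice only the
        # proxies (not the originals) would be pushed to remote editors.
        print(f"ready to upload: {proxy}")
```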
What about archiving assets and projects?
We’ve gone a little punk rock on that one in that we’ve stopped doing LTO. It was too burdensome early in the pandemic, and that really introduced a paradigm shift. LTO was unmanageable across 20 to 24 distinct “offices,” meaning our employees’ homes when they were sheltering in place and working from local storage. Plus, like many post houses, we weren’t charging for it. There’s this assumption that, hey, you did the job, so you’re going to hold my assets in perpetuity. One day I said, let’s not do that. Let’s ditch LTO. Of course, we’ll still unarchive old material as needed.
From that day forward, I decided that we own our work product, but we don’t own the original camera assets. Let’s store our work product and elements in a cloud solution, but not the assets that we don’t own. We beefed up our plan with Frame.io and use their archival storage, which relies on Amazon S3 Glacier. All finished assets and project files go into long-term cloud storage. Then all production assets that were provided to us go back to the client or agency in a timely fashion.
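Utopic’s archive runs through Frame.io’s archival tier, so the following is only a generic illustration of the underlying idea: pushing finished work product into an S3 Glacier storage class with boto3. The bucket name and paths are hypothetical.

```python
# Illustrative only: Utopic archives through Frame.io, but the underlying
# idea -- long-term cloud storage of finished work product -- can be shown
# with boto3 and an S3 Glacier storage class. Bucket and paths are made up.
import boto3
from pathlib import Path

s3 = boto3.client("s3")
ARCHIVE_BUCKET = "example-post-archive"          # hypothetical bucket name

def archive_project(project_dir: Path) -> None:
    """Upload finished masters and project files with a Glacier storage class."""
    for f in project_dir.rglob("*"):
        if f.is_file():
            key = f"{project_dir.name}/{f.relative_to(project_dir)}"
            s3.upload_file(
                str(f), ARCHIVE_BUCKET, key,
                ExtraArgs={"StorageClass": "GLACIER"},  # or DEEP_ARCHIVE for colder storage
            )

archive_project(Path("/mnt/finishing/finals/spot_1234"))  # hypothetical project path
```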
Any final thoughts about what you’d like to see out of the technology in the future?
Post is always looking for reliability. Storage is the mission-critical part of our workflow. If a computer goes down, I can swap it out quickly. That’s not the same with a storage system. It’s the beating heart of our workflow, so it gets an appropriate amount of attention.
I would love to see affordable NAND or SSD storage. Then we could get rid of spinning disks with their heat and wear and tear. Hard drives only do three things — read, write and die!
Drive Thru
Drive Thru is a full-service production and post company located in Minneapolis. Its bread-and-butter work comes from ad agencies working on national brands. On the post side of Drive Thru, there’s a team of Flame artists, online artists, creative offline editors, 2D/3D motion graphics artists and a Company 3 digital outpost.
Let’s find out more.
Bob George: We were one of the first IP-to-IP connections with Company 3. We use an Aspera server to transfer files between locations. Company 3 renders there, and the files come directly back to our centralized server.
What kind of storage solution are you using?
Aaron Esterling: Our office network backbone is 40Gb Ethernet, and all the computers connect to it at either 40GbE or 10GbE. We’re using storage servers from GB Labs called Space. Currently we have over 200TB of high-speed storage on-site and many times that amount backed up to LTO.
George: I go way back with the owner of GB Labs. I went to NAB to find out who could make me a network server that could do 40Gb InfiniBand for the Flames. Nobody else wanted to do it, but he did. We had the first one out there, and Technicolor in Australia had another. It served us really well, and we’re still working with GB Labs.
We’re also testing another proprietary GB Labs solution called Unify Hub to be used for remote access. The files that our editors use all exist in the office. When they work from home, the system will cache only the files they need for that project on their computer.
Esterling: It’s convenient because the file path remains the same at home as at the office, so there’s no need to relink or manually move files. The media gets cached on the editor’s local computer, and they can open the project and continue working right where they left off at the office.
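Unify Hub’s internals are proprietary, but the principle Esterling describes — caching only the media a project references, at the same path it has on the office server — can be sketched generically. The mount points and the idea of a per-project media manifest below are assumptions for illustration, not how the GB Labs product works internally.

```python
# Generic sketch of the path-preserving cache idea (not Unify Hub itself):
# copy only the media a project actually references from the office mount
# to the same relative path locally, so the project relinks without changes.
# Mount points and the manifest format are assumptions for illustration.
import shutil
from pathlib import Path

OFFICE_MOUNT = Path("/Volumes/SPACE")        # assumed office storage mount
LOCAL_MOUNT = Path("/Volumes/SPACE_CACHE")   # assumed local cache at the same logical path

def cache_project_media(manifest: Path) -> None:
    """Pull each referenced file into the local cache, preserving its relative path."""
    for line in manifest.read_text().splitlines():
        rel = Path(line.strip())
        src = OFFICE_MOUNT / rel
        dst = LOCAL_MOUNT / rel
        if src.is_file() and not dst.exists():
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)           # copy once; later opens hit the cache

cache_project_media(Path("/Volumes/SPACE/projects/spot_5678/media_manifest.txt"))
```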
George: We’ve also used Teradici during the pandemic. Everybody was working from home, and the Flame artists were controlling their Flame systems remotely. All the heavy data-lifting stayed in the office. The offline editors took their computers home at the time, but with Teradici for Mac, we can bring those computers back to the office and continue to support remote workflows.
How do you archive media?
Esterling: We’ve been using LTO for years — Archiware’s P5 archive software with a Quantum Scalar i3. The robot has 50 tape slots and two internal LTO-8 tape drives, and it can write around 15TB per tape. This allows us to run nightly backups of everything on the server to LTO. Each completed project is also archived in its entirety to LTO.
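Drive Thru drives this with Archiware P5 and the Scalar i3 robot, so the following is not P5’s interface. It is just a stripped-down sketch of the nightly-backup idea using standard tape tools (tar and mt), with the tape device and source path assumed.

```python
# Stripped-down illustration of a nightly tape backup. Drive Thru actually
# uses Archiware P5 with a Quantum Scalar i3; this is not P5's interface.
# Tape device and source path are assumptions for the sketch. Intended to
# be run once per night (for example, from a scheduler such as cron).
import subprocess
from datetime import date

TAPE_DEVICE = "/dev/nst0"            # non-rewinding LTO device (assumed)
SOURCE_DIR = "/mnt/space/projects"   # assumed server path to back up

def nightly_backup() -> None:
    """Rewind the tape, then stream the project tree to it with tar."""
    label = f"nightly-{date.today().isoformat()}"
    print(f"writing {label} from {SOURCE_DIR} to {TAPE_DEVICE}")
    subprocess.run(["mt", "-f", TAPE_DEVICE, "rewind"], check=True)
    subprocess.run(["tar", "-cvf", TAPE_DEVICE, SOURCE_DIR], check=True)

if __name__ == "__main__":
    nightly_backup()
```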
George: What’s amazing about that system and the way we’ve deployed it is that after a mishap years ago, we were able to completely restore the system and have everybody up and running within four hours. Nobody lost anything because of the frequency of our backups and the way we do it.
How does the cloud fit into your plans for assets and archival needs?
George: There’s no way we can get a fast enough connection to handle the amount of data we go through. We just take care of it here. If you had to bring back a 2TB project from the cloud and needed it today, that would not be possible.
What about review and approval?
Wilson: We are constantly evolving in how we post and distribute deliverables. We use several different posting platforms, but typically Interdubs or Frame.io. Clients are gravitating to Frame.io because of the ability to give direct feedback.
George: Internally, we also use ShotGrid. We can publish our Flame timeline to ShotGrid and track each shot individually with notes for the artist and circle things that need to be fixed. It’s also useful when we work with remote artists in the EU on some projects. They look at the ShotGrid web page, get the files, work on them and send them back.
We’re typically using uncompressed DPX files that are uploaded back to our server. ShotGrid is linked with the Flame timeline, so each new version of a shot is automatically available in Flame. It’s a neat tool for workgroup collaboration because it eliminates all the tedious steps and lets us work more efficiently.
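ShotGrid exposes a Python API (shotgun_api3), so a minimal sketch of the shot-and-notes side of this workflow might look like the following. The site URL, script credentials, project ID and shot name are hypothetical, and the actual Flame publishing George describes happens inside Flame rather than through a script like this.

```python
# Minimal sketch using ShotGrid's Python API (shotgun_api3). The site URL,
# script credentials, project ID and shot name are hypothetical; Drive Thru's
# Flame-to-ShotGrid publishing is handled inside Flame itself.
import shotgun_api3

sg = shotgun_api3.Shotgun(
    "https://example.shotgrid.autodesk.com",   # hypothetical site
    script_name="pipeline_script",             # hypothetical script user
    api_key="REPLACE_ME",
)

PROJECT = {"type": "Project", "id": 123}       # hypothetical project

# Look up one shot from the published timeline and read its existing notes.
shot = sg.find_one("Shot", [["project", "is", PROJECT], ["code", "is", "SPOT_010"]], ["code"])
notes = sg.find("Note", [["note_links", "is", shot]], ["subject", "content", "created_at"])
for n in notes:
    print(n["created_at"], n["subject"])

# Leave a fix note for the remote artist on that shot.
sg.create("Note", {
    "project": PROJECT,
    "note_links": [shot],
    "subject": "Paint fix",
    "content": "Remove boom shadow at frame 1042.",
})
```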
How have your systems and procedures evolved over the years?
George: Years ago, you’d walk a hard drive to a computer and plug it in — the so-called sneakernet. I wanted to get rid of that because it wasted so much time. We’ve now gotten it to a point where you can go to any room and just get down to work. Having established procedures has saved us a lot of time, thanks in part to a more advanced storage system than most companies our size would want to spend money on.
Wilson: Workflow protocols are very important to us as we start each job. We have strict procedures in place from the time a drive lands in our office to where it is stored to how the footage is prepped and backed up. The work that Aaron and Bob have done to establish the storage and backup protocols has helped us tremendously, especially in a time when you have hybrid workflows and artists working remotely.
George: Years ago, I did all the technical and Flame work. We added people like Aaron and others to take over my Flame seat and the technical responsibilities. They grow with the company, maintain our connections with vendors, keep our tools current and make sure we get what we need. We’re passing things down to the next group of people so they can keep the company moving in the same tradition as in the past.
Oliver Peters is an award-winning editor/colorist working in commercials, corporate communications, television shows and films.