Author Archives: Ashley Haley

Life in Tandem: Making an Unexpected Documentary

Though poignant and beautiful, this wasn’t the documentary the filmmakers originally set out to make. Here we talk with one of the directors, Mia Grimes, about how the film unfolded and the process of making it.

L-R: Chris Multop, Joe Litzinger and Mia Grimes

How did you come up with the idea for the short?
My co-director Joe Litzinger discovered a viral YouTube video of Marc Ornstein performing a canoe dancing routine to “Lady in Red” as well as a video of Stephen Colbert poking fun at it. Intrigued by the sport and the individual in the video, we did some research and reached out to Elaine Mravetz, a pivotal figure within the community. We were immediately struck by her warm and inviting demeanor.

Tragically, just days after our initial conversation, Elaine was killed in a car accident. With the blessing of both the freestyle community and Elaine’s family, we pivoted the documentary to follow her husband, Bob (also a canoeist), on his journey of recovery and grief.

The original concept was to take a Best in Show approach to a unique sport, but it evolved into a heartfelt emotional story about a community rallying around a member facing a tragic and unimaginable life change.

Did you guys fund it on your own?
My co-director funded the short through his production company, Interesting Human Media, using personal funds. While we attempted to raise additional money, the unexpected nature of the life event we were documenting meant we had to adapt and tell the story with the resources available to us while it was happening.

And we received a great many contributions of time, resources and work at reduced rates from friends and co-workers, embodying the essence of this project as a true labor of love and a community coming together for a common purpose.

What was the process of just getting it off the ground?
In early February 2022, cinematographer Jeff Smee and I made our way to film at Bob’s house in Cleveland. This initial three-day filming session with Bob was just the first of many. Over the course of the following year, we were invited to document a series of significant events marking Bob’s journey of recovery. These events offered a lens into his resilience and his gradual return to the activities that once brought him joy.

It was during a trip to Florida in February 2023 that we witnessed Bob return to the water in his canoe for the first time since the accident — a symbolic act of reclaiming his passion and a step forward in his healing process. This experience provided a natural and powerful conclusion to our film, capturing the essence of human perseverance and the support of a community rallying around one of its own.

Can you talk script?
Because we were following an event, we did not have a script or outline of any kind, as we were not sure how Bob’s recovery would progress. For pretty much the entire time we were filming, we truly had no idea how the documentary would end.

Was this your first time directing? How did you work with your co-director, Joe?
I started out in logistics and scheduling, but my role quickly expanded as I found myself involved in all aspects of the production process. This transition marked the beginning of a learning experience that extended far beyond my initial responsibilities. Joe, who served not only as my boss but also as my co-director, played a pivotal role in this evolution. In an industry where the hierarchical structure is often rigid, Joe’s decision to trust me with the direction of early scenes was indicative of his inclusive leadership style.

This opportunity allowed me to learn directly from Joe and the cinematographer, Chris Multop, about not only the technical aspects of filmmaking and camera operation but the storytelling.

As the project progressed, our partnership evolved into a collaborative co-directing effort. This collaboration was not limited to just Joe and me; Chris, our co-producer, was integral as well. Together, the three of us functioned as a cohesive unit, with each of us bringing our own perspectives, expertise and visions to the table.

How did you decide on the cameras you used?
To capture the sport’s beauty, we needed high-quality, versatile cameras that were also light, portable and affordable. Most of the documentary was shot in 4K using Z cameras, with a mix of ultrawide, stylistic lenses for interviews and 800mm lenses for paddling and cinematic shots. Other cameras we used during production included the Sony FX3, multiple drones and a Blackmagic camera.

Was it shot with natural lighting?
While the canoeing scenes benefited from natural lighting, we used artificial lighting for the indoor interviews to enhance the visual quality.

You had multiple DPs?
Chris Multop, our co-producer, served as the director of photography, but it was a collaborative effort, with Joe, Jeff Smee, me and others on-set contributing to the cinematography alongside archival footage from the canoeists.

You edited on Adobe Premiere. What was that process like?
We have edited a variety of projects on a variety of platforms. We decided on Premiere because we liked the ease and capability of sending the project to multiple editors to play around with.

One of the things we did early on was hire an experienced AE, Ken Ren, who organized the drive and synced the footage, so our projects started in a way that gave us a leg up throughout the editing process. With about 8TB of footage, we relied on proxies to keep the editing process smooth.

Who did the actual editing? And what about the audio and color grading?
Editing was a collective effort led by Joe and me, with contributions from Emmy award-winning editors Matt Mercer and Eric Schrader and assistant editing by Jenny Hochberg. We set out to film a feature, so we were managing a large amount of footage, which presented a significant challenge in crafting a short, concise documentary.

You can watch the doc here:

Post Production World Expands: New Conference Pass and AI Training

Future Media Conferences and NAB Show have expanded the Post Production World (PPW) conference slated for April 12-17. This year the organizers introduced a comprehensive pass that covers an expanded suite of tracks along with AI training and certifications, field workshops and more.

In a move to cater to the broad spectrum of roles in the creative industry, PPW has broadened its scope to include additional past FMC conferences under one ticket item. Attendees can now access a diverse array of tracks with a single ticket, exploring creative AI, cinematography and directors of photography, visual storytelling, remote production and more. This expansion reflects PPW’s dedication to keeping pace with the rapid advancements in technology and creative techniques.

In addition to a dedicated Creative AI track within the PPW conference program, FMC is offering an additional pass for an AI Training & Certifications track, an initiative designed to equip professionals with the skills necessary to navigate the burgeoning field of artificial intelligence in content creation. Pass add-ons include exam vouchers available for purchase with registration or a choice between two live and in-person AI training courses:

  • AI Broadcast TV Training Workshop: Revolutionizing Broadcasting
  • AI VFX & Motion Training Workshop: Crafting Visual Wonders

Besides these new additions, PPW continues to offer field workshops and other certifications that provide hands-on learning experiences and opportunities to gain recognized credentials in various aspects of production and post production.

“By expanding our tracks and introducing AI Training & Certifications, we’re not just responding to the industry’s current trends; we’re anticipating its future directions,” says Ben Kozuch, president and co-founder of Future Media Conferences. “Our goal is to empower content professionals with the knowledge, skills and insights they need to succeed in a rapidly evolving landscape.”

Information on the new pass options, AI Training & Certifications, field workshops and registration can be found here.

Getting the Right Look for Oscar-Nominated Anatomy of a Fall

Securing the Palme d’Or at the Cannes Film Festival and clinching five Oscar nominations, Anatomy of a Fall is a gripping family saga unraveling the startling collapse of an ordinary household. Justine Triet’s fourth directorial venture paints a dizzying portrait of a woman accused of her husband’s murder, set amid a suffocating ambiance. Colorist Magali Léonard of Chroma Shapers, who graded the film at M141, shares her workflow, discussing both the artistic and technical details.

“Justine and director of photography Simon Beaufils reached out to me early on, even before the filming commenced, during the camera trials. I had previously collaborated on the grade for Justine’s Sibyl, a project where Simon also served as the lensman. This marked my second project with Justine and sixth with Simon,” says Léonard.

The director and DP worked closely with Léonard, who worked on Blackmagic DaVinci Resolve Studio, throughout the entire post process, making sure the film’s feel translated to the screen.

“Justine envisioned a raw, contrasting narrative embracing imperfections and flaws, aiming to create something visceral and sensual,” explains Léonard. “This vision particularly manifested in the trial sequences, characterized by flushed skin tones, sweat and tangible fatigue.

“I translated that vision alongside Simon’s directives into the visuals, meticulously attending to facial expressions and skin tones,” she continues. “We closely collaborated in crafting a visual identity, starting with extensive camera trials during preproduction involving hair, makeup and costumes.”

During the initial phases, Beaufils conducted tests on 2-perf 35mm film, allowing Léonard to emulate the film’s appearance when calibrating the digital camera tests. “This served as the cornerstone to unearth the film’s ambiance and visual identity,” she says.

Triet and Beaufils opted for a large-format camera paired with Hawk V-Lite anamorphic lenses, despite the film’s aspect ratio of 1.85. “The anamorphic lenses infused a richness of colors, flares and distinct blurs, softening the digital sharpness of the sensor. Simon was a pleasure to collaborate with, crafting exquisite imagery encapsulating intricate emotions,” she adds.

“My approach to the visuals was iterative, manipulating contrast through DaVinci Resolve’s custom curves, followed by adjustments in colors, saturation, and highlights. Subsequently, I introduced grain to impart a more pronounced aesthetic, a process initiated from the rushes onwards, laying the groundwork for the film’s overarching mood,” Léonard shares.

Refinement and Collaborative Efforts
In the later stages of the digital intermediate process, Léonard revisited the nodes used to establish the visual identity for fine-tuning. “I ventured into more daring suggestions, striving to refine highlights and specular lights while infusing subtle diffusion. For instance, we enhanced the saturation in the blues while preserving the rawness inherent in the set design and costumes,” she elaborates.

For the courtroom sequences, the grade underwent an evolution mirroring the unfolding of the trial toward a denser, golden atmosphere. “It was crucial to accentuate the actors’ facial expressions while retaining the initial appearance of a slightly rugged and textured visual, a tangible and vibrant material,” says Léonard. “I embraced the notion of allowing the visuals to unfold their utmost potential as the narrative progresses.

“Throughout the grading process, we frequented the Max Linder Cinema to screen the film under theatrical conditions, gaining insights into the visuals and enabling me to make finer adjustments to the final look. For instance, through these screenings, we discerned that certain scenes would benefit from heightened saturation or contrast,” she concludes.

Versatile Opens Seamless LED Volume in Vancouver

Film production technology provider Versatile Media has opened a new virtual production facility in South Burnaby, part of the Vancouver metro area. The 44,000-square-foot building features two large soundstages, one of which holds what Versatile says is North America’s first enclosed volume with a seamless ceiling. In all, the facility comprises a main stage with a bespoke LED volume; a secondary, 13,000-square-foot soundstage for use as traditional filming space; and 10,000 square feet of production offices.

The volume itself stands 83 feet wide and 29 feet high and has an immersive, 270-degree curvature. The seamless structure is equipped with the latest LED panels and technology and was purpose-built for large-scale film projects. Running on Nvidia RTX 6000 Ada Generation GPUs, the facility’s technology supports filming at 8K resolution with multiple cameras on-set.

What sets the new Versatile volume apart is the seamless integration of the ceiling and the wall, allowing for uninterrupted shot lensing across the entire volume. This means the ceiling is not just for reflections and lighting but a part of the in-camera framing.

To merge traditional, live-action workflows into the volume setting, ensuring that virtual production adapts to live action as closely as possible, Versatile collaborated with Vancouver-based rigging expert Dave McIntosh. McIntosh crafted the bespoke ceiling structure complete with essential catwalk platforms that ensure easy access to the ceiling portion of the volume.

McIntosh and Versatile worked together to engineer the mechanics of the unique ceiling, allowing efficient removal of LED panels so productions can seamlessly integrate diverse filming equipment. This adaptable solution ensures easy access to sets, makes it possible to suspend sets within the volume and facilitates the integration of lighting equipment. It also creates a convenient way for special effects teams to achieve complex and expansive shots and stunts.

“This adaptability opens up new possibilities for filmmakers using virtual production, making it easier to work on the volume and achieve complex shots,” says McIntosh. “It’s a great example of how collaboration in the film industry drives innovation.”

Versatile collaborated with Sohonet to provide production-grade connectivity and networking infrastructure that links Versatile’s Vancouver previsualization studio with the newly built Burnaby stages.

Why Egress Fees are Holding Back M&E

By James Flores

Hollywood has the reputation of being an industry at the forefront of technology, thanks to the advancements of digital filmmaking, VFX and virtual production. But there’s one area where the media and entertainment industry falls short of other industries: new technology powering how files get shared and distributed.

Instead of simply uploading the digital assets of a shoot to the cloud and working remotely, many production companies are still moving physical hard drives as if the internet had never been invented. This is because of a hidden cost involved with the major cloud providers — egress fees (aka download fees). These fees can quickly spiral out of control when a studio tries to embrace the cloud model for all digital assets. Because studios don’t want to run up expensive bills with cloud providers, they’ve now built an entire ecosystem of workarounds to get video files off of sets and into post.

These ecosystems are draining resources by adding complication and subtracting budget, and they are ultimately just as damaging as paying egress fees. The M&E field is small but produces incredible amounts of data. The status quo cloud business model involving egress fees is holding our industry back from taking full advantage of the cloud and unlocking new innovations.

What Is Cloud Egress?
One reason that major cloud providers generate such massive profits is the number of fees and additional charges that they tack on, oftentimes without transparency. This results in huge surprise bills at the end of the month. Egress fees are the cost to customers whenever they move their data files out of the provider’s platform. The average egress fee is $0.09 per gigabyte transferred from storage, regardless of use case. But specific costs are not always apparent and can be difficult to predict. In fact, there’s an entire subindustry of consultants and service providers that manage cloud costs on an organization’s behalf (collecting their own fee in the process). The various fees and charges that don’t seem like much at first glance — or that are presented as just the cost of doing business — quickly add up within common M&E workflows.

The average file size from shooting a single take of a scene is several gigabytes, meaning that even one day of shooting creates a huge price tag anytime footage gets moved in and out of the cloud for multiple rounds of digital effects and editing. It makes planning expenses in advance extremely difficult, as filmmakers can’t know how much it will cost until they’ve uploaded their work to the cloud and started editing. With this virtual roadblock in place, it’s not surprising that many M&E companies feel that it’s unfeasible to embrace the cloud.
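To make the math above concrete, here is a back-of-envelope estimate in Python. The $0.09-per-gigabyte rate is the average figure cited above; the footage volume and number of download rounds are illustrative assumptions, not data from any real production.

```python
# Back-of-envelope egress cost estimate using the ~$0.09/GB average rate cited above.
# Footage volume and number of transfer rounds are illustrative assumptions.

def egress_cost(gigabytes, rounds, rate_per_gb=0.09):
    """Cost (USD) of pulling `gigabytes` of footage out of cloud storage `rounds` times."""
    return gigabytes * rounds * rate_per_gb

# A single shoot day producing ~2 TB of raw footage, pulled back down
# for five rounds of VFX and editing work:
daily_footage_gb = 2000
cost = egress_cost(daily_footage_gb, rounds=5)
print(f"${cost:,.2f}")  # $900.00 -- for one day of footage
```

Multiply that by weeks of shooting and multiple facilities touching the files, and the unpredictability the column describes becomes clear.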

The Production Company Hard Drive Ecosystem
In the absence of cloud storage, an ecosystem of hand-delivering hard drives has sprung up to move and protect video files, which is not necessarily beneficial to a production company’s finances. Here’s how it works:

A specialized courier industry exists to serve production teams that need to physically send files to the right location. There are a number of issues with this approach. First, it creates a delay between filming and post production that can be anywhere from a few hours to several weeks, depending on the distance between the shooting location and the editing rooms.

Second, this process generates unnecessary costs. What immediately comes to mind are the packaging, courier and other travel fees from carrying those files around. But there are hidden costs as well. Companies will have to purchase multiple hard drives as the devices wear out, and they must keep up to dozens of drives on hand at set locations, depending on the duration of a particular shoot. And if those drives get lost or damaged, then the entire cost of shooting is wasted, and expensive reshoots become necessary.
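A rough comparison of the two workflows can be sketched the same way. Every number below is an assumption chosen purely for illustration (drive and courier prices vary widely); the point is that both workflows carry real, recurring costs, not that either figure is authoritative.

```python
# Illustrative cost comparison: physical-drive workflow vs. cloud egress.
# All constants are assumptions for the example, not real quoted prices.

DRIVE_COST = 150        # one rugged shuttle drive, USD (assumed)
COURIER_PER_TRIP = 80   # packaging plus courier run, USD (assumed)
EGRESS_PER_GB = 0.09    # average cloud egress rate cited in this column, USD

def drive_workflow_cost(shoot_days, drives_per_day, courier_trips):
    """Hardware plus courier cost of shuttling footage on physical drives."""
    return shoot_days * drives_per_day * DRIVE_COST + courier_trips * COURIER_PER_TRIP

def egress_workflow_cost(total_gb, download_rounds):
    """Cost of pulling the same footage out of cloud storage instead."""
    return round(total_gb * download_rounds * EGRESS_PER_GB, 2)

# 20 shoot days, 2 drives per day, one courier run per day,
# vs. an 8 TB library downloaded twice during post:
print(drive_workflow_cost(20, 2, 20))   # 7600
print(egress_workflow_cost(8000, 2))    # 1440.0
```

Under these made-up numbers neither option is free; the drive workflow also carries the delay, loss and security risks described above, which don't show up on an invoice.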

Finally, those digital assets on hard drives aren’t necessarily safe. The danger of transporting on-premise (hard drive-stored) work means drives can be lost, held up by a foreign country’s customs department or even stolen if the production is high-profile enough. This adds even more cost for security and transportation experts to protect files against each of these threats.

There will always be some need for hard drives on shoots, such as in remote locations without internet connectivity, which therefore requires temporary storage. However, looking at the costs generated by this on-premise, physical transfer ecosystem, it seems fair to ask what it would look like if that wasn’t the case.

What Could Happen Instead
What comes next is the cloud workflow. Cloud technology has reshaped how most businesses operate, and the same is true for the M&E industry. Many different technologies can take data (media) directly from a camera’s encoder and move it to the cloud. These camera-to-cloud technologies often create their own data silos: data can only go into the given vendor’s cloud storage, and moving it to other tools invokes costly egress charges. With cheaper cloud egress fees — or even no egress fees at all — production teams could more readily adopt this cloud workflow, opening up room in studio budgets and speeding up production time by eliminating the hard drive ecosystem. This could also level the playing field for smaller production companies, which would be able to film content much more efficiently.

Companies could focus security investments into digital security, which can be much cheaper than physical methods. Instead of trained guards, companies could rely on encrypted backups and object lock, wherein a user can designate certain objects to be immutable, meaning they cannot be altered or deleted by outsiders and thus are safe from ransomware. They’d also be free to move a lot more post production tools and editing techniques to the cloud, and they could pick and choose where they want to store data or which tools they want to use without worrying about what cloud provider they’d be stuck with.
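The object-lock idea above can be sketched with the S3-compatible API that several cloud storage providers expose. This is a hypothetical illustration: the bucket and file names are made up, and the function only assembles the upload parameters (the commented-out client call would need real credentials). `ObjectLockMode` and `ObjectLockRetainUntilDate` are the standard S3 object-lock parameters.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical sketch of object lock for camera masters: build the parameters
# for an S3-compatible put_object call that makes a file immutable until a
# retention date. Bucket and key names here are invented for illustration.

def locked_upload_params(bucket, key, retain_days):
    """Return put_object kwargs that write an object under compliance-mode lock."""
    retain_until = datetime.now(timezone.utc) + timedelta(days=retain_days)
    return {
        "Bucket": bucket,
        "Key": key,
        "ObjectLockMode": "COMPLIANCE",            # nobody can delete or alter it early
        "ObjectLockRetainUntilDate": retain_until,  # lock expires after the shoot's retention window
    }

params = locked_upload_params("dailies-archive", "day-014/a-cam/clip-0042.mxf", retain_days=90)
# s3_client.put_object(Body=footage_bytes, **params)  # real call needs a configured client
print(params["ObjectLockMode"])  # COMPLIANCE
```

Once written this way, the master can't be deleted or encrypted by ransomware until the retention date passes, which is the digital analogue of the armored-courier protection described above.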

It’s Time for a Change
With the WGA/SAG-AFTRA strikes and negotiations thankfully behind us, there’s going to be pressure on everyone to get new films and shows finished as soon as possible. These condensed timelines mean it’s time to ask why we still accept wasting so much money and time on outdated manual processes. This question is not just for the M&E industry but for the cloud industry as well. By keeping exorbitant egress fees in place, cloud providers hurt their own businesses and limit production companies’ potential. Eliminating them, or even simply reducing them, would be a net benefit for everyone involved.


James Flores has been a working video editor/assistant editor and DIT for over 25 years. He is currently product marketing manager for M&E at Backblaze.

New Atlux λ Plugin for Lighting, Cinematics, Rendering in Unreal 

Indie software company Vinzi has released Atlux λ, a plugin for Unreal Engine that helps 3D artists produce hyperrealistic images and animations with ease. Formerly known as MetaShoot, Atlux λ has been reimagined with an array of features designed to simplify lighting and rendering workflows and achieve real-time results.

Built on Epic Games’ robust Unreal Engine platform, Atlux λ serves as a digital-twin photo studio with highly realistic lighting assets, camera animation presets and a one-click render interface. The plugin’s intuitive design and simplified workflow make it an ideal entry point for 3D artists seeking to harness Unreal Engine’s real-time capabilities without being encumbered by technical intricacies.

Early adopters of Atlux include Hashbane Interactive, Sentient Arts, FD Design and R3plica.

“Atlux λ is a labor of love based on my years of experience working in the 3D industry as an artist and engineer,” says Vinzi founder Jorge “Valle” Hurtado. “The goal is to make lighting and visualization in Unreal Engine as creative and fast as possible. We have customers using Atlux λ for games, architecture, character development, product viz and even automotive.

“Atlux λ is not just a rebrand of MetaShoot; it’s a fully rewritten and optimized plugin that now introduces light painting, a sequence tab for animation and even an AI-based studio randomizer. There’s a lot there! With Atlux λ, 3D artists can create showcase animations from camera motion presets without the complexity of the Unreal render queue or sequencer modules.”

Early adopters of Atlux have quickly incorporated the tool into their workflow. According to Anthony Carmona, founder of 3D production studio Sentient Art, “MetaShoot, and now Atlux, blows away all our expectations. Having access to assets and instant lighting results speeds up our ability to produce amazing work for our clients. It’s perfect for rendering our highly detailed models and material work — from concepts to flawless portfolios.”

What’s New in Atlux λ:

  • AI Studio Randomizer: New studio randomizer builds unlimited photo studio setups in seconds.
  • Light Painting: A new interactive way to place lights based on visual feedback, cursor placement and keyboard shortcuts.
  • Sequence Tab: A new sequence tab with assets and Rig Rail presets to quickly build animations. Includes a NeRF maker and automatic level sequence creation.
  • New lighting and Photo Studio presets.
  • Keyboard shortcuts: easy camera selection and toggling between targets and lights.
  • Optimized render settings and UI.

Atlux λ Features:

  • 12 Photo Studio presets with lighting setups and cyclorama backdrops.
  • 14 realistic assets, including studio lighting with rigs and rail systems.
  • 360-degree turntable for product and model animations and visualization.
  • One-click render workflow with simplified interface.
  • 360-degree camera for HDRI creation.
  • Light painting, batch rendering, shortcuts and more workflow efficiencies.
  • Support for Engine versions 5.1 to 5.3 on Windows. (Mac version coming soon.)

Atlux λ is available as a one-time purchase for $349.50 at atlux.ai. The rental option is $29.50 per month.

Unsaid Helps Celebrate Losers for M&M Super Bowl Spot

This Super Bowl, M&M’S, BBDO NY and design studio Unsaid Studio have teamed on a spot that celebrates the losers. The brand is consoling — or even trolling — them with the “Almost Champions Ring of Comfort,” which is studded with diamonds made from M&M’S peanut butter. Its sides feature a three-leaf clover and “2>1,” while a glittering M&M proudly flashing two fingers for second place rests in a bed of rubies on the top. Inside, wearers can find a single peanut butter M&M set in a mini stadium bezel.

The 30-second M&M spot stars such runners-up as Scarlett Johansson, Dan Marino, Terrell Owens and Bruce Smith, as well as a close-up displayed on M&M’s Jumbotron in Times Square.

Unsaid Studio was responsible for creating the bling for the ring, and they had fun with it, riding the line between epic and cheesy. Because the close-up features the ring alone, floating dramatically in space, the storytelling had to rest purely on the design. The studio used Maxon Cinema 4D and OctaneRender to render the crystals and metals in the ring, as well as to produce smoke and additional special effects.

To engineer the most hype-filled-yet-sarcastic atmosphere possible for the close-up film, the team used all the tools in the toolbox: lens flares, shining sparks, diamonds refracting rainbows, embers and smoke. To keep the momentum going through the video, Unsaid pushed the camera movement and the pacing of the edit to be as creative as possible. Sweeping shots and animated lighting are paired with an incredibly dramatic track.

Source Elements DAW Adds Desktop Routing App

Source Elements has added the new Source-Nexus Router to its Source-Nexus Suite. Source-Nexus Router is a desktop application for Windows and macOS that enables unlimited desktop routing, supporting static, flexible audio-routing setups that reduce the need for complex DAW templates. With all the flexibility of Source-Nexus I/O now in a stand-alone application, Source-Nexus Router makes it possible to route audio outside the DAW, so the routing stays active even when switching sessions or when the DAW is not running.

Featuring advanced audio routing from any device and channel to any other device and channel, Source-Nexus Router is like a powerful patchbay for applications and connections. Users can mix and match any number of combinations of Source-Nexus devices or other virtual and system audio devices, regardless of the sample rate.

Windows support for similar technologies is lacking in the audio world, and Source-Nexus Suite is an important new toolset for the increasing number of audio professionals seeking to make Windows their main workstation. VP of product Ross Gillard comments that “music production takes place in a vast array of studio and software setups, yet there’s been a notable absence of professional solutions tailored for Windows users in this realm. The unveiling of full routing potential specifically for the Windows environment is truly exciting, and I can’t wait to see and hear what the community does with it.”

Compared with similar solutions, Source-Nexus Suite is fully integrated, with no overwhelming, complex routing matrices, and its setup is designed to be intuitive and powerful. All Source-Nexus Suite applications are compatible with Windows and Mac, including native Apple silicon support.

Source-Nexus Suite is available starting at $11.95 for a monthly, yearly or two-year subscription with no commitment. Subscribers have early access to ongoing updates, including continuous free upgrades and new features and functionality.

Key Features:
• All of the flexibility of Source-Nexus I/O in a stand-alone application.
• Ability to route audio outside the DAW, so the routing stays active even when switching sessions or when the DAW is not running.
• Advanced audio routing from any channel to any channel.
• Ability to save and load routing templates and presets.
• Ability to mix and match any number of combinations of Source-Nexus and audio devices.
• Like a patchbay for applications and connections.
• Compatible with Windows and Mac, including native Apple silicon support.


Loki 2.0: Automatically Enhances and Corrects Archived Content

Filmworkz, owner and developer of DVO software, has released Loki 2.0, a new automated moving-image enhancement and correction platform.

Loki 2.0 will help content owners monetize their vast libraries with Emmy award-winning DVO tools that can make their content look its best while reducing upfront costs.

With high-quality upscaling algorithms and motion-estimated standards-conversion tools, any archived content can now be easily enhanced to make it available for generations to come. With easy-to-select presets, a preview window, automated batch processing and industry-standard deliverable exports, Loki optimizes archive content for fast distribution in a viable package.

Archive professionals are dealing with an ever-increasing amount of content that must be easily prepared and processed in an automated, cost-efficient, scalable solution that meets current and future industry standards.

They want to be able to monetize their assets without the frustration of high cost and manual labor and with the ability to monitor operations securely from anywhere in the world. Whole libraries can be reinvigorated using tools that will help make content look clean and sharp.

Loki can process large amounts of data automatically. Users can add as many nodes as necessary for urgent projects, or they can use fewer nodes and let the processing run over a longer time frame, reducing costs and helping to prioritize based on deadlines.

Depending on the type of media being added, pros can use easy-to-select presets with options for several types of tape-based media, like D1, HDCAM, Digibeta and others. They can create and link watch folders to specific presets, making it easy to render files based on the type of media.

The power of DVO is ready to deploy in Loki, including DVO Dry Clean, Clarity, Velvet, Deinterlace and many other automated enhancement solutions previously unavailable outside of Phoenix and Nucoda.

Filmworkz product manager Gustavo Mendes says, “Loki can change the way broadcasters and archive holders monetize their content, helping to minimize risks and widening their selection of titles by reducing laborious processing costs. Being overwhelmed by your library selection without a way to enhance it won’t be a problem anymore.

“We can’t wait to see new shows brought to life using our tools being streamed to new audiences all over the world. Current enhancement solutions can be very labor-intensive, with the need to use many operators to work on a single show for several weeks doing extensive checks on the media available. With Loki, you can easily select your files and use presets based on the origin of the media (tape-based, film, etc.), use our preview window to compare before-and-after results and select the server available to render.”

Virtual Roundtable: Storage

By Randi Altman

Storage. Without it there would be no post workflows. It is at the heart of each and every one of them… an unsung hero of sorts. The folks we spoke to for this storage roundtable share their thoughts on how users employ storage, how manufacturers make storage, what it’s like living in a hybrid and cloud world and what’s next for this all-important aspect of the industry.

We hope you enjoy.

BlueBolt’s Tom Mawby 

BlueBolt is an independent VFX studio based in London. It creates visual effects for film and high-end episodic television, such as Robert Eggers’ Nosferatu, Ridley Scott’s Napoleon, The Peripheral (Amazon) and The Last Kingdom (Carnival Films/Netflix).

How many different types of storage does your facility use and for which parts of your workflow? On-prem, hybrid, cloud, NAS? 
We use one on-prem Pixitmedia GPFS clustered file system for our main storage and another one off-site for disaster recovery. We have SSD storage in our editorial systems for fast playback, and artists have NVMe storage for caching on workstations. 

Some servers use ZFS for storing software and less frequently accessed media. We use limited cloud storage; we currently render back directly to our Pixitmedia storage.

What are the most important things you need from your storage solutions? 
It’s always a case of balancing reliability, speed and capacity with cost. Our main media storage is the Pixitmedia system. It’s the cornerstone of our artists’ workflows, so it needs to be extremely reliable. But this reliability comes at a cost, so space is at a premium, and we have to be vigilant to keep the footprint of our shows under control. 

Another important consideration for us is the quality of support. If something goes wrong with a critical piece of storage, we have to know that the vendor has our back. 

Are you using a MAM? 
No. We use ShotGrid for production tracking and digital asset management, but our media is stored on-prem on our Pixitmedia system.

How are you working with storage in remote workflows? 
Everyone connects in via PCoIP from thin clients to workstations housed in our data centers. All data stays inside our network. This means all of our employees must have very fast and reliable internet connections, but it also means we’re able to keep things simple in terms of having one source of truth for all of our data. 

What do you see as the next storage solutions trend? 
Amazon’s FSx for OpenZFS looks very interesting. We don’t have a huge development team to throw at building custom solutions, so the opportunity to extend our storage into the cloud transparently without having to manage the synchronization ourselves is very compelling. Different storage vendors have their own solutions for extending their file systems, but these often come at higher costs. It’s interesting that Amazon has taken on ZFS. It gives us options to integrate with performant cloud storage at a lower price point than other solutions. 

AJA Video’s Abe Abt 

What kind of storage and data management solutions do you offer? 
We offer AJA Diskover Media Edition, a global data curation and management solution that works with a variety of storage systems from a range of storage vendors and cloud storage providers. We’re working closely with Diskover Data on the technology to help the M&E industry solve interoperability and accessibility challenges. These challenges are the result of an exponential increase in the amount of data being generated across the industry in recent years and the use of so many different on-prem and cloud storage tools today.

Are users pushing for more hybrid solutions or sticking with either on-prem or the cloud? 
With the advent of remote production and the widely distributed workforce, cloud and hybrid cloud solutions seem to be the most in-demand among our partners and clients. These solutions lend themselves well to production and post teams working from home or across multiple locations. On-prem storage will always be important for on-set acquisition and the crucial moments just after data is created, but organized movement of data to the cloud for wider viewing and access is becoming increasingly important. 

What do you think is keeping some from working in the cloud? 
One of the main bottlenecks is a lack of familiarity with cloud storage and how it works. If people don’t understand how and where their data is being stored and how they can easily access or view it, then it’s hard to see its true value. AJA Diskover Media Edition helps alleviate those concerns by giving facilities one simple-to-use, web-based user interface that an entire team can use to view, report on and act on their data, regardless of where or how it is stored. 

The other issue that prevents a lot of facilities from transitioning to the cloud is security. This can stem from the lack of familiarity with the cloud, as mentioned above, but there are also legitimate concerns with having your data stored in the cloud and allowing your entire facility/team to remotely access it. AJA Diskover Media Edition helps here by giving teams controlled, nondestructive visibility into their data without direct access. It also allows administrators to control how individual users can view, access and act on their data, ensuring complete data security while also allowing entire distributed workforces to view and interact with their data from any location. 

You touched on this already, but are more people working remotely or back in the office? 
There are certainly more people working remotely now than before 2020. I don’t anticipate we’ll ever transition completely back to the way things were then, nor is it needed. The production world was already pivoting to more remote models pre-2020 because productions and teams were growing more geographically dispersed. Additionally, post work was being increasingly contracted out to multiple editorial, VFX and color facilities spanning the entire globe, and that’s still the case today. Now that the media and entertainment industry has uncovered the benefits of remote work, there’s no going back.  

Do you offer a MAM solution? 
AJA Diskover Media Edition is a global data management solution rather than a MAM. We can certainly integrate with a variety of MAM solutions at a plugin level and make it easier for a large workforce to rapidly index and search powerful MAM solutions from remote locations. 

What is the life expectancy of your storage solution, and is the hardware refresh cycle increasing, decreasing or remaining the same? 
Since we don’t develop a storage system, we cannot define a life expectancy for AJA Diskover Media Edition. It’s a software-based product, available via yearly subscription, with unlimited “seats” per license per year. AJA overall is known for offering long-lasting products and support that far exceeds the life expectancy and market availability of our products, and AJA Diskover Media Edition is no different.

Is real-time sharing of storage between color, editing, audio and VFX something that every post house should have… or at least be working toward? 
Anything that simplifies the storage ontology of a production is a good thing. Giving each phase of the post process visibility into production data is key to a smooth post process and an effective, efficient and economical use of storage and post hours. Losing data, misplacing files and misusing expensive, high-speed cloud storage are all costly mistakes that are very easy to make without proper data management and visibility into your data and what it is doing. 

What are the biggest trends you are seeing in storage? 
Tools that allow entire productions to move to the cloud are exciting and a huge trend right now. At every trade show, I see more and more cloud services vendors and developers. And the faster that data can go from camera to cloud the better. Video is now data earlier and earlier in the process, and all of the software and hardware being created by the manufacturers in this industry reflects just that. 

Rohde & Schwarz’s Duncan Beattie

What kind of storage do you offer?  
We have a broad reach in M&E, encompassing ingest, mastering, playout and monitoring. SpycerNode is a comprehensive range of modular storage systems using enterprise hardware alongside Spectrum Scale from IBM. The deployment we use is the highest level available and allows us to work with the very best tools IBM offers. Combined with our hardware, this allows for limitless expansion in both performance and capacity.

We can also offer intersite data exchange, on-premises instant failover, PAM and cloud connectivity — all in all, solutions that can scale from media production to broadcast.

Are users pushing for more hybrid solutions or sticking with either on-prem or the cloud?  
Hybrid workflows have been discussed since before the pandemic. However, in recent years the term has been applied to anything where on-prem, cloud and remote are combined. I believe businesses must concentrate on whatever optimizes the production process for them.

As we know, it is rare to find two media companies that work the same way, but what they do expect is to be able to maximize their team’s output and investment. Some users insist on having on-prem as their point of origin, whereas others are concentrating on cloud as theirs. We have seen, certainly with broadcasters and large post houses, the desire to keep a large on-prem footprint linked to the cloud. 

What do you think is keeping some from working in the cloud? 
Traditionally, media companies are more comfortable having control over their assets and productions, so migrating their entire workflow to a cloud-based solution is a risk too far for many. A further consideration is the uncertainty of price increases or, indeed, ownership of the data. What happens if your cloud provider is no longer viable or operational? A combination of both is ideal.

Are more people working remotely or back in the office? 
Remote work is absolutely part of the modern production process. However, its feasibility depends on the production company and its specific needs. Film studios, for instance, often require on-site production and post work, especially for location shoots and commercial rentals.  

For traditional post, proxy workflows have helped greatly to reduce the need for high-resolution media to be trafficked off-site, yet the security of moving media in this way is a huge concern and is commercially treacherous. Broadcasters can have remote editing facilities, but they need a great deal of administration and services on-site. Notably, smaller media production teams can derive greater benefits from remote working compared to larger post setups.

Are you offering a MAM as well?  
We have a product called SpycerPAM, which is built on a proven framework for the production process, controlling permissions on the fly and featuring a project-based workflow that allows a templated approach to production and editing. SpycerNode as a storage solution is also MAM-agnostic, working with all asset management systems. SpycerPAM can also work alongside any MAM in a media workflow, allowing both PAM and MAM to coexist.

What is the life expectancy of your storage solution, and is the hardware refresh cycle increasing, decreasing or remaining the same? 
For over 90 years, our commitment has been to quality and reliability in all our products, and our dedication to broadcast and M&E is no exception. We have a minimum expected life span of five years, and with our storage solutions, we can offer a case-by-case extension. Many storage providers will have a hard cut-off at this point, forcing a complete reinvestment.  

What customers now need is an increased ROI with an upfront commitment to SLA or licensing costs. Therefore, I would say that manufacturers who can help extend life span and offer a predictable opex support cost have an attractive proposition. 

Is real-time sharing of storage between color, editing, audio and VFX something that every post house should have… or at least be working toward? 
Absolutely. When we made the move to file-based workflows, central storage was imperative. Tracking files, reducing duplication and managing production all helped make sense of the chaotic nature of content creation. As data rates have increased, some areas have become more siloed, and this almost reintroduces the issues for those that are not using shared storage.  

Files must be copied or moved from one pool to another. One of the key aspects I love about our storage solutions is being able to combine our active file management and SpycerPAM products alongside the ability to mix flash and spinning disks in the same cluster. This means we can host all of the above inside the same namespace, simplifying the management of the most valuable asset: data. 

What are the biggest trends you are seeing in storage? 
One of the main trends that I see currently is mixing disk technologies in a seamless fashion. With rising data rates, there’s a growing use of NVMe, SSD and HDD in the background of certain workflow areas to ensure optimal performance. Another enduring trend is the use of the cloud. However, there have been some recent cases where the long-term cost of cloud storage is nearly four times that of a capital expenditure (capex) investment. 

Hammerspace’s Molly Presley 

What kind of storage do you offer?  
Hammerspace offers high-performance NAS storage coupled with automated data orchestration to unify data into a single, global data environment. This allows organizations to have a single global file system that spans all storage silos, which users and applications can reference no matter where they are. In other words, users can see and access data stored anywhere. And when that data must be moved for rendering, collaboration, distribution or archive, it all happens in the same global file system.  

Hammerspace gives organizations the flexibility to use data anywhere, with users, compute and applications everywhere. Operations are more efficient because organizations don’t need to store multiple copies of the same data, can use compute resources that are most cost-effective or readily available, and can hire resources anywhere and give them access to collaborate on projects globally.  

Are users pushing for more hybrid solutions or sticking with either on-prem or the cloud?
Users are pushing to be able to use the tools they prefer with the flexibility to work anywhere. They need the application stack to empower them with high-performance local data access, whether the data is actually stored locally, in a remote data center or in a cloud. 

IT administrators, on the other hand, are pushing for the ability to design for flexibility. GPUs are difficult to come by and expensive to procure and run. Workload requirements are often changing as organizations grapple with power availability, access to talent, AI readiness and time to market with new content. IT organizations want the flexibility to run in multiple locations.  

What do you think is keeping some from working in the cloud? 
Working in the cloud is largely inhibited by the need to redesign applications and workloads to be cloud-native. The complexity of such a redesign has kept many organizations from taking advantage of cloud capabilities as quickly as they would like. The more standards-based, cloud-native applications technology vendors provide, the greater the flexibility to run anywhere. This includes letting data architectures use standard file interfaces in the cloud just as they did in the data center. It also means deploying anywhere without changing user workloads and putting data in motion so it can be used where it is most efficient and beneficial.

Are more people working remotely, or are they back in the office?
The majority of users we work with are still remote, or they have the flexibility to work in the office at their discretion but don’t do so consistently.

Are you offering a MAM?  
We do not provide a MAM. Instead, we work with MAM companies to enrich the metadata in our file system and provide tight integration between the two technologies. This provides the ability to orchestrate data where it is needed, automatically, using the traditional MAM already in place in the workflow. 

What is the life expectancy of your storage solution, and is the hardware refresh cycle increasing, decreasing or remaining the same? 
Our solution is completely software-defined and hardware-agnostic. We would expect the software to stay in place for over a decade because it provides the file system interfaces to applications, the knowledge of which data is located where and robust metadata that workflows depend on. We are agnostic to hardware updates, and we make it easy for organizations to refresh or move to new infrastructure without interrupting the workflow. This means organizations can be more elastic and can have a wide variety of infrastructure without the disadvantages of infrastructure-bound data silos. The Hammerspace Parallel Global File System spans all different types of infrastructure. With this flexibility, hardware refresh can be done on a more flexible schedule and without user downtime or complex data migrations, so they will likely occur more frequently. 

Is real-time sharing of storage between color, editing, audio and VFX something that every post house should have… or at least be working toward? 
Yes, definitely. The goal to reduce copy management and increase collaboration is something every post house should be working toward. It will reduce infrastructure costs and IT management overhead while also accelerating speed of collaboration. The MovieLabs 2030 paper has touched on many of the benefits of eliminating silos and accelerating collaboration. 

What are the biggest trends you are seeing in storage? 
Standards-based solutions are becoming more prevalent, replacing proprietary file systems, isolated metadata and infrastructure-bound silos. The standards-based approach is helping organizations run with flexibility on any data center hardware or cloud. We are also seeing demand for solutions that overcome data silos and make content easier to move to available compute resources and a remote workforce. Data orchestration has become a front-and-center topic in most storage discussions we are having.

Symply’s Keith Warburton 

What kind of storage do you offer? 
Symply is an on-prem-first storage technology company, but with hybrid cloud solutions that enable organizations with critical workflows — such as media and entertainment — to get the best out of cloud storage and services.  

We offer a portfolio of solutions — high-performance Thunderbolt RAID, an extensive range of desktop and rackmount LTO, collaborative storage, object storage for production content protection and long-term archive, and cost-predictable public cloud with zero egress fees.

Are users pushing for more hybrid solutions or sticking with either on-prem or the cloud? 
It really depends on the user and the workflow. The truth is that for most users, there is usually a mix of on-prem with some form of remote access, complemented with cloud storage and services.  

A lot of users have a large investment in on-prem solutions, and Symply can help those users by deploying products that enhance the parts of their workflows that involve moving content to the cloud.

What do you think is keeping some from working in the cloud? 
There are several reasons. First, there is a lack of internal cloud/IT expertise, combined with a lack of the cloud-native tools required to achieve the desired workflows.

Second, there is a problem of resources. This could be the customer’s local connection to the cloud or even the lack of correct resources in the cloud.  

Users expect cloud resources to be limitless, but this is far from the case when specialized compute and GPU resources are required.  

Third, and perhaps the elephant in the room, is cost, especially the lack of cost predictability. Finally, for some customers, there is the problem of data governance and security. 

Are more people working remotely or back in the office?   
We are seeing that most media and entertainment companies are offering a hybrid model of working, usually weighted for the majority of the working week in the office. Obviously, this varies depending on the services that a particular organization offers. It is difficult for a DIT to work from home. 

Are you offering a MAM as well?  
We do offer both an integrated project manager and a MAM with our Workspace XE collaborative storage solutions. It makes the user’s life simpler to have everything integrated into a single appliance. But we realize that there are many different MAM platforms on the market, and we support customers who want to integrate with their preferred MAM solution, especially in relation to our Workspace XE and on-prem object storage.  

What is the life expectancy of your storage solution, and is the hardware refresh cycle increasing, decreasing or remaining the same? 
The life expectancy of our storage solution is five years. I do not see organizations investing in storage solutions that have less than a five-year life expectancy. At Symply, we know that users are trying to increase the time between storage refreshes, and we are engineering many of our solutions to be upgradable and scalable to make that possible. 

Is real-time sharing of storage between color, editing, audio and VFX something that every post house should have… or at least be working toward? 
I believe that, increasingly, the days of the sneakernet are behind us, and most facilities, large or small, are using shared storage solutions to collaborate and allow hybrid working.

What are the biggest trends you are seeing in storage? 
For production and post, flash storage is becoming more cost-effective, and all our new Workspace XE collaborative storage solutions are designed to support both flash and hard drives. This allows us to offer cost-effective hybrid storage products to support the needs of our customers.  

Organizations, especially content owners, are giving more thought to where their content resides, whether it’s in large LTO libraries or public cloud. There is a lot of truth in the “cloud boomerang,” primarily due to the financial costs but also because of data governance. Tape libraries, while very cost-effective, make it more difficult to get more value from archived content. On-prem object storage can offer a long-term archive strategy with enhanced metadata to help content owners better monetize their assets.  

High-capacity portable RAID storage and LTO are still very much in demand, despite a trend toward delivering content via the internet and better cloud connectivity. The 100TB-plus Shuttle RAID is still an essential part of content acquisition on-set; it is an easy solution when many projects are still run ad hoc. LTO is still the most secure and cost-effective solution for creating archives of the original camera data and metadata, backups during a project and the final archive of the finished project for delivery and posterity.  

Finally, it is worth noting that the supply chain issues that caused problems worldwide after the pandemic are not all behind us. A perfect storm of pandemic, inflation and geopolitics is continuing to wreak havoc. The constant increases in cost and supply shortages on SSDs have been apparent for some time now, but HDDs have also increased steadily in price over the past few quarters as demand drops and fears of key component shortages loom.      

Goldcrest Post‘s Roman Rossell 

New York’s Goldcrest Post is a full-service post facility offering color correction, visual effects, DI conform, sound mixing, ADR, screening rooms, digital cinema mastering, offline editorial and post office space, including consultation from project inception to delivery.

How many different types of storage does your facility use, and for which parts of your workflow? On-prem, hybrid, cloud, NAS, etc.? 
We currently use three storage solutions within our facility. For our high-end, demanding production workflows, we boast OpenDrives Ultimate, an NVMe storage platform providing enough bandwidth to handle multiple streams of 8K video for our color suites and conform artists.

For back-office and nearline archive storage needs, we use Quantum StorNext with Xcellis appliances, segmented from each other by purpose and security protocol. For offline editorial, there’s Avid Pro Tools and Avid Nexis F-series storage.

While we have cloud storage available to us, we aren’t yet realizing the return on investment both monetarily and workflow-wise as compared to our on-prem solutions. We do not use cloud storage unless specifically requested.  

What are the most important things you need from your storage solutions? 
Reliability. Uptime. Redundancy.

Are you using a MAM? 
No, not any traditional MAM tool. We instead track our media assets with software that indexes all our storage silos, allowing us to know exactly where, what, how long and why we have any given piece of digital content. 

How are you working with storage in remote workflows?
Our priority is the safekeeping of our clients’ media assets, so all media remains within our ecosystem. Our remote users are set up to enter at specific points using popular software solutions — Jump Desktop, Splashtop and HP RGS — as well as point-to-point hardware from Amulet Hotkey, giving our remote editors and artists higher-bandwidth, lower-latency connectivity to our on-prem workstations.

What do you see as the next storage solutions trend? 
The reduced use of proxy media is a trend that I’d like to see the storage industry continue moving toward so that high-speed storage solutions become more accessible and cost-effective.  

I look forward to editing 4K footage using hand gestures while wearing a VR headset. 

Do you have a wish list for those making storage tools? 
I’d like to see the industry work more closely with the various software vendors so that advancements in relevant technologies are used more effectively. 

What haven’t I asked that’s important? 
Relationships between end users and vendors are important. I place a high value on being able to interact with manufacturers before, during and after the initial point of sale. Interacting with vendors’ engineering teams and fellow end users allows for maximizing the potential of equipment in use, better diagnosing issues and moving toward workflow improvements.

Facilis’ Jim McKenna  

What kind of storage do you offer?  
We configure on-prem solutions with remote access, and we also offer a cloud-based backup and archive. Hybrid on-/off-prem workgroups are very popular right now. For NAS solutions without the benefit of software-managed connectivity, it’s very difficult to work remotely, as network file systems and VPN slow any access to a crawl. Through the Facilis Shared File System, WANLink offers the easiest and fastest way to access your on-prem storage from anywhere, securely and without the need for a VPN.

Are more people working remotely or back in the office? What do you think is keeping some from working in the cloud? 
We have some customers who are still (or in some cases, once again) all on-prem, and that will not change. Others have moved to on-prem with remote editors, sometimes across the country. Regarding cloud, I have had facility managers come to me and say that they’ve been able to move their entire operation to the cloud, except for the video department. It’s very difficult to sustain a fast-paced workflow, especially with on-prem workers who expect high-speed access to project data, when everyone is going outside the facility to access data.

Are you offering a MAM? If not, do you suggest specific MAMs your solutions work well with? 
We offer a MAM called FastTracker. It’s a fully integrated solution for designing proxy workflows, tracking project data and streamlining workflow with automations. FastTracker is included at no additional cost with our minimal yearly server support. And unlike some MAMs, we don’t have any limitation on seats, even when acting as a web server for browser-based access. FastTracker offers transcription, AI tags on timecode markers and full tape and cloud backup and archive management when used with our Object Cloud and Object LTO products.

What is the life expectancy of your storage solution, and is the hardware refresh cycle increasing, decreasing or remaining the same? 
Hardware refresh cycle timing has increased. As server hardware gets more capable, users can hold onto servers they bought seven or eight years ago and still work in the newest 4K formats. Since capacity increases are outpacing many facilities’ needs, many of our customers are refreshing with similar-sized units. In the past, the normal trade-in would be several times the size of the original to meet the demand. We never purposefully obsolete our systems, nor do we ever drop support for our customers using them.

Is real-time sharing of storage between color, editing, audio and VFX something that every post house should have… or at least be working toward? 
We believe every facility can benefit from centralization, management and protection of a shared storage system with industry-leading security features. Our Smart Access Rules allow facility managers to lock down critical files during the post workflow without impeding the progress of the project. To be sure that no vital data is ever compromised, files can even be protected from deletion by certain users or groups based on type or extension.  

What are the biggest trends you are seeing in storage? 
Solid state storage is becoming an important layer in many facilities, as evidenced by the popularity of our FlashPoint server models. We can supply up to 12GB/sec to our shared file system clients from a single 4U server enclosure with a capacity of as little as 48TB and as large as 368TB at a price that will surprise anyone looking at premium storage servers. To exploit this performance, we can deliver over 4GB/sec to a single Mac client for 8K color grading, and we offer up to 64Gb Fibre Channel for Windows workstations that can’t reach their potential with Ethernet-attached storage.  

Blackmagic’s Bob Caniglia 

What kind of storage do you offer?
Blackmagic has four lines of network-attached storage products: Blackmagic Cloud Store, Cloud Store Mini, Cloud Pod and the Cloud Dock.  

The Blackmagic Cloud Store is a fast, high-capacity network disk designed to handle large media files and lots of simultaneous users. The Cloud Store Mini offers the same capabilities as the Cloud Store in a compact, rack-mount design. The Cloud Pod syncs and shares media from any USB-C disk with Dropbox and Google Drive. Finally, the Cloud Dock lets you share up to four independent drives over a high-speed 10G Ethernet network.

Are you offering a MAM with that? If not, do you suggest specific MAMs your solutions work well with?
We do not sell a stand-alone media asset management product, but a lot of our products come with software tools that help manage media use. For specific MAM solutions, we do not suggest any particular one. There are so many of them out there, and so many different ways people use media, that it would be hard to suggest just one. 

Are users pushing for more hybrid solutions or sticking with either on-prem or the cloud?
I think the days of only on-prem are long gone. I am sure there are still some places that work on projects that are huge security threats and won’t ever go outside the office walls, but the majority of the world is comfortable with a hybrid model.

What do you think is keeping some from working in the cloud?
I think most post pros are comfortable with the idea of working in the cloud by now. Security worries and setting up admin controls (things like who has access to what and when) are what have kept some folks from moving over fully. But there are answers to those needs, and even the most complicated remote productions are using the cloud.

The Lord of the Rings: The Rings of Power is a great example of how a huge production with a huge security risk used the cloud. VFX producer Ron Ames and Company 3 had to work with 786 hours of footage (24,659 takes) that were shot and worked on across three continents. No matter where the post people were based, they were working on the same media stored in the cloud. The great thing is that everything they built their workflow around is affordable enough for everyone to use.  

The shared storage cloud workflow is also something that made people nervous, but less so now. People are comfortable working with network-attached storage, and the past few years have seen NAS storage built directly for film and video work. Because NAS storage syncs and shares media with services like Dropbox and Google Drive, works with proxy workflows so timelines and media can be shared in minutes and offers low-latency file access, a lot of the worries for cloud and storage have been answered.  

What is the life expectancy of your storage solution, and is the hardware refresh cycle increasing, decreasing or remaining the same?
If you blink you will miss another new storage product launch. I remember when we put out the first Ursa camera, CFast cards were the new amazing thing. Now the original CFast media looks quaint. The refresh cycle for storage in general is going to keep increasing. When you are building out your storage network, you need to keep in mind the ability to add to it as storage gets faster and bigger.  

That doesn’t mean customers should just sit back and assume they will rebuild every two years. Look for storage that is designed to be added to — not a one-and-done, because one-and-done doesn’t exist. This is where knowing a company’s history comes in handy. Find a vendor that has a track record of always improving, and make sure that company doesn’t force customers into a closed world where they have to pay for every small upgrade.

Is real-time sharing of storage between color, editing, audio and VFX something that every post house should have… or at least be working toward?
Absolutely. What is the point of a cloud workflow if only one group of people can talk to one another quickly and efficiently? A proxy workflow can share a whole timeline and its media in minutes. And storage is available that is fast enough to take full advantage of 10G Ethernet ports with multiple users connected. All these things are available now at affordable prices for sharing large media files quickly between editors, colorists, audio engineers and VFX artists.

How do you see AI and virtual production using storage?
AI post tools like Resolve’s Neural Engine are being used for all sorts of different projects, and those are triggered by an actual live artist using those features on media pulled from shared storage. Virtual production is absolutely a huge storage consumer to plan for, and that is just going to become more important as more small- and medium-size projects use virtual production. With virtual production, you have to plan for a storage network that is fast and big enough to handle the huge amount of files associated with real-time VFX and live-action shooting. A good storage workflow for virtual production is a must-have.  

What storage trends are you seeing?
Real-time cloud collaboration across all aspects of post. Cloud-based storage networks will keep going down in price, and the argument that on-prem storage will always be cheaper will no longer be accurate. And everyone will become, if not an expert, then at least knowledgeable about shared storage.


Randi Altman is the founder and editor-in-chief of postPerspective. She has been covering production and post production for more than 25 years. 

Lost Planet and Cabin: Storage for Spots

By Alyssa Heater

The world of commercials and TV spots is known for quick timelines and multiple levels of sign-off. Storage plays a big role not only in backing up media but also in facilitating collaboration with multiple stakeholders.

Lost Planet Editorial’s Kenji Yamauchi elaborated on the storage needs for TV spots, referencing the Google DeepMind piece he recently cut for YouTube. 

We also spoke with Cabin’s Andrew Ratzlaff to learn about storage for the recent Capital One holiday spot he cut, which features John Travolta as a suave, dancing Santa Claus. 

Lost Planet Editorial 
Tell me about your experience as an editor. How did your career path lead you to Lost Planet Editorial? 
My path is a little unusual. I didn’t go to film school; I went to advertising school in São Paulo. I was pretty good with video, so to help pay for my tuition, I worked in the video department at the university. That’s how I started editing – I would edit classes. I had a week to edit a class; it was the easiest job in the world.

After graduating, I worked as an in-house editor at an agency. After three or four years, I talked to my boss about my interest in becoming a commercial editor. I was editing commercials at the agency, but it was all rough edits for internal meetings. She suggested that I work at a proper editing house, but at the same time, that agency liked my work and didn’t want me to leave. I said I would work there for another year if I could then move on to a post house in either LA or New York. I provided a list of facilities, and Lost Planet was my first choice. My boss had a connection there and was able to help secure a role for me as an assistant. I had nothing planned. An opportunity appeared, I was open to trying it, and if it didn’t work out, I would come back home. 

Tell me about the workflow for the YouTube Google DeepMind spot. It incorporates concert footage, interviews and even portrait-oriented cellphone clips. How did you balance editing these different types of media together? 
To be honest, it’s nothing new for us. Hank Corwin, the founder and owner of the company, is known for that style. We all learn and grow from him and understand how to pull emotion from any kind of footage. There is no correct type of footage to tell a story. It doesn’t matter if it’s shot on a Super 8, an iPhone or an Alexa; you can mix the different types as long as you make sure they fit and give you the right feel. And our approach to working with storage is very reminiscent of that. 

Tell me about the storage component of your workflow. 
Before the pandemic, our storage was primarily local hard drives. As an assistant, I made countless shuttle drives for editors, and we developed many techniques to ensure everything was up to date. We developed workflows with the editors to ensure things didn’t get locked in between a shuttle drive and our internal server. Then the pandemic threw everything out the window. For at least a year, we were trying to figure it out. All of a sudden, the assistant had to figure out their own computer, the editor’s computer and the server, and they all had to be in sync. At the same time, it couldn’t hinder the editor’s creativity or delay uploads. 

Local storage is much easier to work with, especially in Avid and Premiere, because footage is uploaded to the server, appears on the screen and is ready to go. But we develop new workflows to set up the right storage for each job. This YouTube job is a perfect example because Lost Planet senior editor Charlie Johnston started the project, then had to move on to another job. I was just finishing a job myself, and because we have similar styles, I was able to take over for him.

We are set up in that way. We have the local server, a local drive and LucidLink cloud storage, and in the end, we use a combination of all of them depending on the project. If I’m working by myself remotely, I work straight out of LucidLink. If I’m working at the office, I work locally out of the server. But if I’m working remotely out of Avid, I need to work off of a local drive for speed so I can maneuver files. We’ve created a workflow and language between the assistants and editors to ensure things go smoothly and we don’t lose anything in the process.

Was security a big factor in determining what solution to use for these workflows?
We always account for that, especially for this YouTube/Google DeepMind job because they were announcing a new product. This happens with many of our clients, especially with tech companies like YouTube, Google and Apple. When we work with them, security is very important. Everything is encrypted, watermarked and time-stamped, so it cannot be shared with anyone beyond the team. The main component of security was in the cloud storage because when we’re working locally, either with our hard drive or the local storage in the company, that’s pretty much taken care of.

What is the collaboration process with the agency and the different stakeholders?
We usually work directly with Google and YouTube. Sometimes there is an agency, but most of the time, we work directly with them. And it’s great working with them because they’re very collaborative. Don’t get me wrong, there are still challenges, but it feels good to work on a project where the client on the other end understands the value in our ideas. Then from there, we can explore other ideas — breaking it apart, putting it together, combining it with something else, elevating it or pulling it back. We can try something wild without getting punished for it, which is the best feeling. It feels like a garage band, in a way, which is very cool.

That’s awesome — you have some creative freedom to explore. Anything else you would like to touch on? 
Never underestimate the power of how storage can help or hinder your process. We had problems with that years ago. Planning the right storage is like making sure you’re getting the right tires for your car. 

Cabin Editing Company 

Tell me about your experience as an editor. You have worked in both film and episodic, but it seems like the TV spot world is your bread and butter. 
I have been doing spots for almost 15 years now. I started as an assistant in Boulder, Colorado, and now I’m cutting in LA, where I have been for around 10 years. I bounced around in New York for a bit too. I kind of fell into what I’m doing. When I went to film school, someone said, “Hey, you’re pretty good at this.” And I said, “I am? Okay. Well, I guess I’ll keep doing it.” So I’ve been working and making my way with it over the years.  

From your perspective as an editor, tell me about the typical workflow on a TV spot from preproduction to shooting to post to delivery. 
As an offline editor, it’s very simple. Raw media is huge now; you have terabytes of it after a shoot, but when we’re doing offline editorial, it’s all transcoded into small files. You go from terabytes and terabytes to maybe a couple hundred gigs that fit onto a drive smaller than your phone. If I’m working remotely, I’m using a tiny hard drive plugged into my laptop, and I can do the whole thing from there. When we’re working at an actual post house, we have a much larger storage device to handle all of the different edits happening simultaneously. And that’s basic shared storage that any facility probably has, but it allows us to handle countless projects at the exact same time. For me as an individual, it’s really simple: I get my drive, I do my edits and then I send an EDL out to the finishing house; they deal with the massive amounts of storage and media from there. 

What is the collaboration process like? Or are you primarily working solo in editorial?  
I pride myself on being collaborative. It takes a whole team to make any spot, whether it’s a 15, a 30, a 60, a 90… whatever. I don’t think one person alone can say “This is what it’s going to be.” It takes a lot of minds and a lot of thoughts. For the Capital One spot, I worked closely with Bryan Buckley. He’s been doing this much longer than I have, so it was great to talk to him and get his point of view as the story unfolded. Once Bryan and I were happy with something, we’d take it to the agency and go through that process again with them to explore, shape the edit and figure out exactly what was going to make it the best for their needs. Everybody worked together to make it the best that we could.  

What storage system do you use to support this workflow? Is the same solution used for every project that goes through Cabin, or does it vary per project?  
When we’re in the office, we use Facilis, which is a common shared storage. I don’t have a ton of knowledge on the actual inner workings, but it works great.  

For the Capital One piece, what was the timeline? How quickly did you have to turn it around? 
I was on the project for about two weeks, but the whole project itself went on for five weeks. Once we got everything into a good spot, an assistant would come in and tweak little things here and there, try a different take or try different music. This job seemed comfortable in terms of the amount of time that we had upfront before we had to share anything with the agency. It was a long schedule in terms of finishing because there were a ton of VFX with the city skyline and all of the magic. I’ve had projects where they give me a day or two with the footage, and then the agency wants to walk in and review everything, which is a lot harder because I don’t have the time to really get to know the footage and develop a point of view. Capital One and GSD&M were great in allowing us the time to make the best commercial it could be.  

The spot has John Travolta dressed as Santa Claus, disco dancing and making references back to Saturday Night Fever. How did this content influence your process?  
The fact that this project is an homage to Saturday Night Fever gave us a blueprint to work from. We were not trying to reinvent the wheel, but we wanted to match some of the shots. We wanted it to have the same vibe and be lovable and humorous. I think that makes the project a little easier to do because we have the stepping stones. For this project, the process was streamlined, and it was all just about rhythm and pacing.  

Is there anything else you’d like to touch on? 
I was really thrilled with how it all turned out. I thought the VFX were amazing, and it was a pleasure to work with Bryan Buckley. I had worked with Capital One on a Taylor Swift spot last year, which was another big VFX project. It was overall a great experience. 


Alyssa Heater is a writer and marketer in the entertainment industry. When not writing, you can find her front row at heavy metal shows or remodeling her cabin in the San Gabriel Mountains.

BBC and 3 Ball Media: Storage for Unscripted Series

By Alyssa Heater

The sheer amount of footage captured for each episode of an unscripted reality series poses unique challenges, especially when it comes to storage.

To find out more about how these shows manage all that data, we first spoke with BBC Worldwide post producer Chris Gats, who works on the unscripted series Life Below Zero and its multitude of spinoffs. The remoteness of the series’ shooting locations is a driving factor in its storage needs, and Gats details how tripling the storage plays into the workflow.

We then sat down with 3 Ball Media Group (3BMG) EVP of post Neil Coleman and director of production and operations Scotty Eagleton to learn about storage solutions for high-volume footage series — including one with 28 cameras running 24/7. 3BMG is the force behind reality series including Iron Chef: Quest for an Iron Legend and College Hill: Celebrity Edition. 

BBC Worldwide

It seems like the bulk of your work has been in the reality TV/unscripted realm. What inspired you to go this route?  
I kind of fell into it. I started as a production assistant for the company [Stone & Company Entertainment], which produced game shows and docu-reality shows like Loveline, The Man Show and Shop ‘Til You Drop. I just grew up through post in that company, and it became my niche.

Tell us about the typical workflow on an episode of Life Below Zero, from shooting to post to delivery. 
Reality shows have a tremendous amount more of everything. A scripted show might go out with one camera and one microphone. We go out with several cameras all filming the exact same thing. One hour of footage in the scripted world is several hours in the reality world. That’s the main difference… more cameras and more microphones on-set. We film it, we archive it to hard drives, then the hard drives make their way down to Los Angeles. The post team takes it from there.

Specifically on Life Below Zero, because the show is so remote, and it is very difficult and expensive to get the crews in and out, we have them make three copies of their footage. It’s in triplicate, just in case FedEx loses one, a producer’s bag falls into a river, or whatever it could be.

The producer will hold onto one copy, then the two others go to our production hub in Anchorage. Once the footage is handed off, they immediately send one of the hard drives down to post in Los Angeles, and the other one is archived onto their hard drive. So now we have the footage flying to Los Angeles on a hard drive, it’s in Anchorage on a hard drive, and the producer has their copy. And if we lose one, we’re fine because we have two backups.

Once the footage gets to Los Angeles, we archive it to an LTO tape right away. Once we get it on LTO, we tell production up in Anchorage and then they can wipe their two drives and send them back into the field. It’s always on a constant rotation. In Los Angeles, we also copy it into our computer archive system and send the drives back up. Throughout the whole run of post, Anchorage has a copy of our media, it’s in Los Angeles in our edit system, and we also have it archived on an LTO tape.  

What about storage for the post process?  
For most of the shows I run post for, we use the post facility FotoKem. They supply our offline equipment, and we go to them for some online and audio mixing. They invented an all-in-one, on-set color correction and archival machine called nextLab. It allows us to go out in the field, upload our footage, add our LUT to see what the shots are going to look like, automatically sync the audio, and even archive to LTO right on-set. It can also timecode and distribute dailies.  

It also serves as storage by ingesting and archiving full-resolution media. We have at least a petabyte of 4K media in it now. From there, we take the footage, down-rez it, and shoot it over to an Avid Nexis for editing. So, we have two things: the HD/UHD archive in the nextLab, and the low-res on a Nexis.   

Did COVID impact your storage process with hybrid and remote workflows?  
We kind of lucked out with the Life Below Zero shows. It takes a year to film these. We film three episodes at a time, in part, because of hunting seasons and crazy weather. Also, when we send humans out to these remote locations, we don’t want them to be there too long. The show features four people an episode, so we film at four different locations. It takes a while to film, but as we start to film and stockpile, we then send to post. 

We were in the process of shooting Season 9 when production was shut down. At that time, FotoKem and BBC got together to figure out a solution. Both bought computers to help facilitate a remote post workflow. We were only down for one calendar week. We kicked everybody out of the office, then the following week, I was calling everybody to come back at a scheduled time frame to grab their computer and anything else they needed to take home. We got everyone set up, and then I got together with FotoKem and we started the remote workflow. The next Monday, post was up and running entirely from people’s homes. By the time we were about to run out of footage to edit, the government let us go back to filming.

We were one of the first to get to go back to work because of our remoteness. We sent four people up to film the episodes: a DP, a producer, a DIT/PA-type person and a “safety manager.” The safety manager handles the cooking and shelter, and brings a gun in case of bears or any danger. It’s crazy what they do.

Will you touch on why security played a factor in selecting the right storage solution for Life Below Zero?
Nat Geo is now owned by Disney, and they adhere to the highest security standards. FotoKem dealt with the highest level of security possible from Disney and did whatever was necessary to make our network secure. When editors are working from their homes, they log into FotoKem’s closed network and are, in a way, using FotoKem’s internet connection to access the footage. Several years ago, we were exploring how to work remotely, and then we expedited it due to the pandemic. Now we’re remote, and it’s working great.

3BMG

Tell us about your experience in the reality TV/unscripted arena and what inspired you to explore this niche within the industry.
Neil Coleman: My background is as an editor. I started editing commercials and music videos at a post house, and one of our clients was the talk show The Jenny Jones Show. We used a linear editing system, the Grass Valley CMX, and that’s how I learned. I went from there to The Oprah Winfrey Show, where I spent 16 years, starting in editorial then moving to project management. When The Oprah Winfrey Show ended, there was not much television, media or entertainment work in Chicago, so I moved out to Los Angeles and started working on The Jeff Probst Show. The post supervisor for that show had worked at 3 Ball Media Group (3BMG) previously, and they were looking for somebody to take over their post department. I’ve now been at 3BMG for about nine and a half years. Prior to that, I never worked in unscripted reality, just talk shows, music videos and commercials. The nomenclature is different, but the work is all very similar.

Scotty Eagleton: When I first got into the industry, I started in the music video space but felt it was quickly going nowhere, so I accepted a job with 3BMG. I worked on a few reality shows, then left 3BMG and shifted my focus to scripted and I bounced around as a project manager on various sitcoms for a long time. I wanted to have a role with more longevity, so I went back to 3BMG, helping with deliverables and working with the production teams.

Once Neil and I started working together, and then our production services business, Warehouse Studios, came about, I started learning more about post workflows. That knowledge has really helped me collaborate with Neil and find the types of people required to produce content and ensure our clients’ needs are met.

3BMG works on big reality titles, including Iron Chef: Quest for an Iron Legend and College Hill: Celebrity Edition. Tell us about the typical workflow on an episode of an unscripted TV show.
Coleman: Both of those shows shoot a lot of content on fairly concentrated timelines. Iron Chef: Quest for an Iron Legend shot eight episodes in eight days — one episode per day. And they had something like 28 cameras, which was very intimidating for us because the turnaround was very tight. We can absolutely handle that kind of footage, but turning it around that quickly… there is a lot going on. College Hill was also just four weeks, and it had around 28 cameras as well, but they were running 24/7 because they’re embedded in a house, and it’s more documentary-style. There’s just a huge volume of footage.

Our typical workflow, regardless of whether it’s local or across the country, starts with how we receive the footage, which is typically electronically. We use Signiant’s Media Shuttle, which allows us to receive the footage much faster than the traditional way, where everything is offloaded by the DIT onto an external hard drive and then FedEx-ed back to us. For argument’s sake, if you shoot on a Monday, the footage will be ready on Tuesday, and because you’re FedEx-ing it, that means you don’t receive it until Wednesday. If you’re transferring electronically, and your speeds are high enough when you finish shooting on a Monday, then you start receiving the footage right away.

In the unscripted world, media is edited on Avid, then we use AMA (Avid Media Access) to connect to the raw source footage and then transcode for the proxy media. It’s a tried-and-true workflow, but it requires a large volume of people because it’s a linear, single-file workflow. If you have one machine, it does one file at a time.

We use something that’s more enterprise-level: a transcoding tool called Telestream Vantage. Our setup transcodes up to 20 files simultaneously — much faster than an assistant editor’s Avid would. For example, we got 28 cameras in for College Hill on a Monday. By Wednesday morning, the footage was already grouped, prepped and ready for our story producers and editors. So it took around 36 hours total, and those are huge shows.

When we go to online, mix and color, we’ll AMA and consolidate the media. When we are finished with the low res, we’ll create the final 4K or HD media. One of the advantages of using Telestream is when we’re creating the proxy media, we’ll bake a LUT into the log footage so that it looks correct as a proxy. If you AMA it, you have to rely on the Avid to add the LUT. If you’re grouping multiple cameras, that LUT requires processing power just to be able to display it in Avid, whereas in Telestream, it’s baked in, and you don’t need the Avid to do anything.  

What is the collaboration process like with other creatives and stakeholders? Do you have anybody that you’re working with outside of your team, like on the production side? 
Coleman: We’re preparing to begin a show for Hulu after the first of the year. As soon as the show is green lit, we’ll start speaking with the executive producers about what their needs are, how they envision creating the show, what types of cameras they want to use, etc. Those are all very creative conversations about how the DP likes to work. The linchpin of that whole dynamic and interaction is the DIT — the digital imaging technician — who is the point of contact between what is acquired in the field and what ends up being transferred to us. It is imperative to have strong communication and a good relationship with that person, and also for that person to have strong communication with the camera team and the producers out in the field. 

Communication is key for everything. Once we’re in post, we have conversations with the post teams about their needs, how they like to work with graphics, music and all those creative elements that require a technical foundation. We like to cater to those needs while still within our existing infrastructure because we have multiple shows going on simultaneously, and we have a lot of shared resources. We want to make sure that everybody is moving in the same direction yet have some control of their own individual shows. 

Neil Coleman

Do you use the same storage systems across all the projects that come through 3BMG, or does it vary per project?  
Coleman: It varies a little bit per project, and that often has to do with restrictions or mandates that come from the network. For instance, we have to go through a security audit for Amazon when we work on their shows. They have, in a way, set the bar for what we provide for everybody: security in the facility, encrypted drives, ensuring that we have the correct IT infrastructure and beyond. Regarding storage, we primarily use Synology network-attached storage (NAS), as well as 45Drives’ Storinator NAS. We also use Avid Nexis and archive to LTO. Then some networks, like Amazon, have us upload directly to their S3 in the cloud.

Was security a big factor in determining what kind of storage that you use?  
Coleman: With the Amazon security audit, we made sure that everything we are using hit their specs and met their requirements in order to be approved to work on their show. For the most part, everything we already had hit those specs; we just had to make sure it was configured according to their requirements.

Has the evolution of hybrid and remote workflows affected your storage needs at all?  
Coleman: I don’t think so. The way we do it here, as I understand it, is fairly similar to the other unscripted companies. People will remote in from their home locations, or wherever they’re working, into their edit bay at our facility, and they will work as if they were here in person. Our on-premises infrastructure is the same whether you are in person or remote.

Scotty Eagleton

All of our worlds changed when COVID hit and everybody worked from home, but that’s basically remoting in. We really don’t use the cloud for much of our real day-to-day editing. We primarily use the cloud or high-speed internet for transferring footage from one location to another, and then MediaSilo or Frame.io for review and approval.

Do you have anything on the horizon that you’d like to talk about?  
Eagleton: 3BMG owns a company called Warehouse Studios, which Neil and I oversee. It was founded because independent producers who are selling content have a need for reliable production, post and accounting services. Often, a producer who sells a piece of content to a network won’t have anywhere to take and produce it. That’s where we step in to assign a whole team to ensure that they’re taken care of every step of the way. It’s another layer to the services that we offer.  


Alyssa Heater is a writer and marketer in the entertainment industry. When not writing, you can find her front row at heavy metal shows or remodeling her cabin in the San Gabriel Mountains.

2023’s Media and Entertainment Storage Trends

By Tom Coughlin

Technology and workflows for M&E are always evolving, and that includes the world of storage. I have been working in and studying this area of the industry for decades, which gives me a unique perspective.

For this special issue, I’ll be sharing my thoughts on digital storage and applications from the past year based on my experience as well as on conversations and interviews with others working in this part of the market. I’m also including some data from my upcoming 2024 Digital Storage in Media and Entertainment report.

According to the 2023 M&E Professional Media Storage Survey, flash memory is the primary storage media for content capture, although close to 30% of this storage was on HDDs, mostly for in-studio storage.

When participants were asked about their use of direct-attached and network storage in digital editing and post production, that survey showed the following summary statistics:

  • 53% had direct-attached storage.
  • Many survey participants were not using flash memory in their post production DAS, although the percentage of flash memory increased from prior surveys.
  • 3% had NAS or SAN.
  • About 36.4% had more than 500TB of NAS/SAN storage.

 IBC
At IBC back in September, Blackmagic introduced its free Blackmagic Camera App for the iPhone, offering the same tools and user interface as on Blackmagic cameras. The app creates standard 10-bit Apple ProRes files up to 4K or H.265 video. It supports Blackmagic Cloud storage, so creators can collaborate and share media with editors, colorists and others. Blackmagic also reports that it is adding NVMe SSDs to more switches with about 2TB capacity.

At the Seagate booth, I discovered that the company’s 30TB HAMR drives are being tested by potential customers and will be shipping in its Corvault 5U84 storage systems by the end of 2023. The 5U, 1-meter-deep product is designed for shallow racks. (Seagate’s previous offering was 4U and 1.2 meters deep.) These systems use a form of data protection called ADAPT (autonomic distributed allocation protection technology), which uses an erasure code to reduce the time required to rebuild such large hard disk drives. This is especially important as HDD capacities increase, since older methods could take up to a week to rebuild a failed drive. ADAPT allows a drive rebuild in one hour.

Pixitmedia is including the Corvault 5U84 with its software-defined storage solution for media workflows. Perifery’s Object Matrix and Swarm object storage technology have been integrated with Seagate’s Exos AP and Corvault platforms using the ADAPT technology and ADR.

Seagate is also integrating Arcitecta Mediaflux with its Exos Corvault 5U84 and Lyve Cloud, which enables media storage and sharing.

Toshiba had a small booth at IBC displaying its HDD storage products, including the 3.5-inch conventional magnetic recording (CMR) 20TB MG10 SAS HDDs in a Promise Technology 60-bay JBOD VTrak J5960 HDD system capable of up to 1.2PB of raw data storage.

EVS was showing its latest XStore shared storage for EVS live and nonlive production workflows. This product mostly uses HDDs.

OWC announced extensions of its all-flash-based Jellyfish product line — Jellyfish Nomad and Jellyfish Studio products. The OWC Jellyfish Nomad is a portable shared media pool designed for DITs, independent 3D studios and on-the-go editors. It enables them to work on the same project and access the same assets. Built with 6000MB/s of aggregate bandwidth, capacities up to 64TB of user-swappable NVMe drives, six direct-attached 10GbE ports and 128GB of RAM, the Jellyfish Nomad can handle RAW files, multi-camera projects, image sequences and VR.

The OWC Jellyfish Studio, which replaces the popular Jellyfish Mobile, offers leaps in speed, storage and connectivity. With 3500MB/s of sustained read and write speeds, capacities up to 180TB of raw SSD storage and 14 direct-attached ports (6x10GbE and 8x1GbE), users can share 4K content, documentaries, feature films, episodic content or massive amounts of social media at every resolution and aspect ratio.

Promise Technology launched its Pegasus R12 at IBC. This is a 12-bay RAID storage product that works with Mac and Windows workstations. The Pegasus R12 offers Thunderbolt 4 storage for direct-connect access and uses the company’s PromiseRAID and NVMeBoost technologies. It has data transfer speeds of up to 3GB/s and 240TB of storage using Toshiba 20TB near-line HDDs. Promise also announced a partnership with AJA Video Systems.

Avid was at IBC showing its Media Composer using Avid Nexis video storage running on AWS for secure, scalable content creation workflows. Avid Nexis includes the new Avid Nexis VFS virtual file system and a Nexis Flex subscription. The Nexis F-series storage engines, or Avid Nexis in a cloud or hybrid environment, enable collaborative shared storage workflows.

OpenDrives was showing its Atlas storage management software on multiple certified hardware platforms. It provides access to data on-premises or in the cloud.

ATTO demo’d its networking storage for media workflows, including host bus adapters, network interface cards, NVMe switch host adapters and Thunderbolt adapters.

Rohde & Schwarz highlighted its SpycerNode2 high-performance computing storage for mixed-content workflows. It allows project-based production work with SpycerPAM for production asset management. It supports scale-up/scale-out unified file and block access with a dynamic media cache and S3 tiering and provides up to 67PB of storage and 12GB/s performance in one stack.

 

The new release of SpycerPAM allows project-driven management of storage tiers and infrastructure as well as integration into nonlinear editors such as those from Avid and Adobe. It supports review-and-approval workflows across multiple sites, and R&S says it enables stakeholder approval for remote workflows.

EditShare, maker of EFS media-optimized shared storage, announced that it merged with Shift Media and now offers cloud-native video solutions such as MediaSilo for collaborative workflows. EditShare has been making moves into remote and collaborative workflows with, for instance, its 2022 introduction of EditShare Flex, built on AWS.

Cloud Storage
The use of cloud storage in the media and entertainment industry is increasing. In the latest survey, about 63% of the respondents said that they had 1TB or more storage capacity in the cloud, up from about 32% in earlier surveys.

AWS was at IBC showing off partner technology offerings. Many of the demonstrations involved various AWS cloud storage offerings. AWS storage is in use for content production, editing and archiving as well as to support applying AI to media content for various metadata-generation tasks. At its AWS Storage Day in August, the company said that its FSx for Windows File Server can now support up to 350,000 IOPS, and EFS can now do 55,000 read and 25,000 write IOPS per file system. File restore times for content in S3 Glacier have improved by up to 85% at no additional cost.

Wasabi announced strategic partnerships with Grass Valley and GrayMeta. Wasabi provides cloud storage for Grass Valley’s Agile Media Processing Platform (AMPP), serving as a hot cloud storage target in the AMPP ecosystem. Content files can move between Wasabi storage and AMPP applications without incurring transfer fees. Using an integration with GrayMeta Curio, Wasabi is previewing its SmartBuckets technology, which can create more personalized sports fan experiences with hyperspecific details. This generates rich metadata for media libraries stored in Wasabi cloud storage, allowing fast location and retrieval of media segments based on people, places, events, emotions, logos, landmarks, background audio and many more attributes.

In a conversation, Wasabi CTO Jeff Flowers said that the M&E content in the company’s cloud includes a lot of sports video and that Wasabi ingests about 3PB of data per day. He also said that Wasabi is considering moving more of its digital storage from HDDs to SSDs: dense SSDs take up less rack space, have higher performance and last longer than HDDs, though today they mainly serve as cache memory. He added that local storage transfer appliances, like AWS Snowball and the Wasabi Ball, are not used much anymore because of the increasing availability of fast networks.

Wasabi also released a 2023 Global Cloud Storage Index. Here are some of their results:

  • M&E cloud storage use is increasing rapidly, with 89% of organizations looking to increase (74%) or maintain (15%) their cloud services.
  • 54% of M&E organizations went over budget expectations for cloud spending in the last year, and half (49%) of M&E organizations’ public cloud storage spending was spent on fees, with the balance spent on storage capacity used.
  • 69% of M&E respondents had been using cloud storage for three years or less.
  • 95% of these respondents said they migrated storage from on-prem to public cloud in the past year.
  • Nearly half (45%) of M&E organizations reported using more than one public cloud provider. Top reasons for choosing a multi-cloud strategy included data security requirements (44%) and different buying centers in an organization making their own purchase decisions (47%).

Storj has been working in the M&E space as well, providing a very distributed S3 storage network that it says can provide CDN-like performance. Every file is encrypted and split into pieces before being stored on thousands of global nodes, providing content security.
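The split-and-scatter idea can be sketched in a few lines. This toy version (not Storj's actual client, which also encrypts each segment and adds erasure-coded redundancy before distribution) simply chops data into content-addressed pieces and reassembles them:

```python
# Toy sketch of the "split into pieces, address each piece by its hash" idea
# behind distributed object stores. Real systems also encrypt each piece and
# add redundant shards so any subset of nodes can reconstruct the file.
import hashlib

def split_into_pieces(data: bytes, piece_size: int = 64 * 1024):
    """Yield (piece_id, piece_bytes) pairs; the id is the SHA-256 of the piece."""
    for offset in range(0, len(data), piece_size):
        piece = data[offset:offset + piece_size]
        yield hashlib.sha256(piece).hexdigest(), piece

content = b"some large media file" * 10_000
pieces = dict(split_into_pieces(content))  # piece_id -> bytes, ready to scatter

# Reassembly only needs the ordered list of piece ids:
order = [pid for pid, _ in split_into_pieces(content)]
restored = b"".join(pieces[pid] for pid in order)
assert restored == content
```

Because each piece is identified by its own hash, a node returning corrupted data is detected immediately, which is part of how such networks provide content security.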

GB Labs demonstrated remote-working solutions that combine its global unified file system, Unify Hub, with a cloud storage solution developed with Storj.

Hammerspace demo’d its data orchestration software used in Autodesk Flame workflows that serve hybrid cloud storage, remote collaborative data access and metadata-driven work.

Before IBC, Cinedeck announced cloud storage workflows with Backblaze and Wasabi as part of its ConneX media processing platform.

Content Delivery
In the latest survey, the average amount of content on a central content delivery system was about 1,200 hours, with about 475 hours ingested monthly. Flash memory use has been increasing for content delivery.

Eluvio announced new capabilities to deliver and monetize live FAST channels, PVOD and interactive media experiences using the Eluvio Content Fabric. The company also announced a Media Wallet for connected TV platforms including Apple TV, Android TV and Fire TV. The company gave a demonstration at its IBC booth.

The company says that the Eluvio Content Fabric is an open and decentralized streaming, content distribution and storage network built for the third-generation internet. Content Fabric delivers live streams with deterministic end-to-end latencies of two seconds globally to standard streaming clients (DASH/HLS over HTTP). It provides a complete, full-featured media stack to publish, store and deliver content at scale, including personalization, access control, content protection and proof of engagement.

Varnish Software showed its media content delivery network. The company published a white paper with Intel showing test results on scalability of both CDN edge node performance and energy efficiency across four different performance levels. The bottom line is that working with Intel Xeon CPU-based servers, Varnish was able to demonstrate up to 1.2Tbps data rates and up to 1.18Gbps per watt efficiencies. Below is the test setup that Varnish Software used for this demonstration.

LucidLink showed high-performance streaming direct from cloud storage with a user interface that looks like a local drive. This allows remote teams to access project files of any size and type within seconds to support video and audio production workflows.

Archive and Preservation
Of the survey participants, 46.7% had more than 2,000 hours of content in a long-term archive. And 41.4% of the respondents said their annual archive growth rate was higher than 6%. About 30.0% had more than 2,000 hours of unconverted analog content, and the average rate of analog content digitization was about 6.8%.

Fujifilm’s Kangaroo is a plug-and-play, long-term, object-based data archiving solution. This mobile box includes all the hardware (including a server), software and magnetic tapes to archive data on-premises. It can also be used as part of a hybrid cloud storage system, and file versioning makes it easier to recover data in case of a cyberattack.

The company says that the data is stored in an open format and is triple-checked for consistency. The image below shows Fujifilm’s object archive software operations and the Kangaroo archive box. In addition, the use of magnetic tape keeps physical storage away from a network (providing an air gap) and thus reduces vulnerability to malware and ransomware.

Quantum announced its ActiveScale Cold Storage offering, which is an S3-enabled object storage for active and cold data sets. Quantum says that ActiveScale Cold Storage can reduce cold storage costs by up to 60% with your own on-prem cloud storage. This offering combines advanced object storage software with magnetic tape technology. Quantum is offering the product in preconfigured bundles.

Quantum also showed its Myriad all-flash, scale-out file and object storage system that can support AI, VFX and animation workflows with high performance and low latency. It has a cloud-native architecture and includes deduplication, compression, snapshots, clones and metadata tagging.

Memnon is a 20-year-old content preservation and migration service provider with a long history of supporting the M&E industry. The company said it manages 3.5 million assets, mostly magnetic tapes, and that archiving inquiries have increased by 100% in the past six months, with one project encompassing 29PB of content.

Point Archival Gateway out of Germany offers a tape-based object storage system. It has a standardized S3 REST API gateway that can be used with many S3-capable applications.

Disk Archive out of the UK offers the Alto-III HDD-based enterprise archive. To reduce the power consumption of the HDDs in the archives, any disk that is not reading or writing is switched off. The company says that this saves on energy consumption and extends the life of the HDDs. Users can access HDD storage data more quickly than with a tape library system.

Disk Archive reports that Alto uses data replication techniques that are better than RAID and offers selectable protection — 1X, 2X and 3X replication (probably with erasure codes). The company said that you can create geographically distributed storage systems for business continuity and disaster recovery. Up to 1.2PB is available in a 4U unit.

Axle AI announced a partnership with Disk Archive, offering media management bundles.

Lasergraphics and others displayed their film-scanning systems at IBC. There were several providers of these film scanners at the show, perhaps indicating that interest in digitizing current vulnerable analog assets is increasing. Lasergraphics is offering options for scanning at 13.5K, 10K or 6.5K maximum resolution.

Future-proofing content is best done at the highest resolutions available. These systems can handle 70mm, 65mm, 35mm, 16mm and 8mm film gauges. There are also features to detect and correct dust and scratches while scanning. Many of these systems hold individual frames flat and motionless during the scan to optimize scans of warped and damaged film.

2024 Digital Storage in Media and Entertainment Report
The 2024 report will be the 19th annual from Coughlin Associates. The report analyzes requirements and trends in worldwide data storage for entertainment content acquisition, editing and special effects, archiving and digital preservation as well as digital cinema, broadcast, satellite, cable, network, internet, OTT and VOD distribution. Capacity and performance trends as well as media projections are made for each of the various market segments. Industry storage capacity and revenue projections include direct-attached storage, cloud (including object storage), and real-time and near-line network storage. The 2024 report should be available by January 2024.


Dr. Tom Coughlin is a digital storage analyst. He has over 40 years’ experience in the data storage industry. Coughlin Associates consults, publishes books and reports and holds digital storage-oriented events.

Rules to Live By: Using Cloud, Hybrid or On-Prem Storage

By Nick Pearce

If you are a creative professional, then you will need some sort of storage to capture, protect, process and preserve the cool stuff you create. Which type of storage to recommend depends on where the content came from, what you plan to do with it, the scale of your operation, your need to collaborate, the need for redundancy, how long you want to keep your content, and the need for speed.

I’ll dig into that further at the end of this article, but first, regardless of what flavor of storage you use, no creative should be without some form of digital or media asset management. Whether simple and cheap or complex and costly, a system that makes it easy to get your hands on a project or piece of content, regardless of where it lives, will save you time and help you to be more productive.

Keeping these things in mind, you need to live by the following mantras:

  • If you do not have two copies (at least), you do not have it.
  • If you cannot find it, you do not have it.
  • Your data (and metadata) belongs to you, not the technology vendor.
  • Life is too short to manage media!

Taking these rules into account, we can then examine the different types of solutions on the market and their suitability according to the size of the operation. However, there are some other questions to ponder:

  • How valuable is your time, and do you want to use that time generating revenue or managing media?
  • Do you have the right infrastructure, resources and, again, time to manage the storage required to get the job done?
  • Do you own the content, or do you need to hold on to customer data for a given period of time?
  • How portable will your content be, and does interoperability matter in your workflows?
  • Are you willing to pay extra to ensure you can keep working when things go south or to avoid the loss of reputation for losing client data?

As these questions suggest, it is highly advisable to build automation and some kind of intelligence (artificial or otherwise) into your workflows. This ensures that manual tasks are simplified, which reduces workflow errors.

Why?
Manual error is right up there on the list of answers to “How the *$%! did that happen to my data?!” Given the nontrivial cost of bringing talent back on-set or the potential loss of reputation for mishandling a client’s data, you cannot afford to overlook the availability and protection of your content, from ingest to archive.

Now, let’s delve into the types of storage, their users and some recommended best practices. Keep in mind that effective digital preservation practices require a significant investment in both time and money to ensure they are executed correctly.

External Drives / Desktop RAID Devices
Everyone uses these — from freelancers and production companies to the largest broadcasters. While they are relatively cheap, they are often underused (the average disk is probably only at 65% of capacity*) and they are a pain when searching for content. Some people use them as archives, putting the disks on shelves in the hope that one day they will spin up again when the content is needed. Trust me, this is not a fantastic preservation method.

My recommendation: Make multiple copies (compounding the underutilization) and/or synchronize/back up to a cheap cloud platform (ideally with no egress fees). Do not use external drives or RAID devices as an archive… please.
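The "two copies" rule is only as good as your ability to prove the copies actually match. A minimal verification sketch in Python, with throwaway folders standing in for a hypothetical drive and its backup:

```python
# Minimal integrity check for the "two copies" rule: walk a source folder
# and confirm every file exists in the mirror with an identical SHA-256.
# The demo folders below are hypothetical stand-ins for a drive and its backup.
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256_of(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for block in iter(lambda: f.read(1 << 20), b""):  # read in 1MB blocks
            h.update(block)
    return h.hexdigest()

def verify_mirror(source: Path, mirror: Path) -> list:
    """Return relative paths of files that are missing or differ in the mirror."""
    problems = []
    for f in source.rglob("*"):
        if not f.is_file():
            continue
        twin = mirror / f.relative_to(source)
        if not twin.is_file() or sha256_of(f) != sha256_of(twin):
            problems.append(str(f.relative_to(source)))
    return problems

# Demo on throwaway folders:
src, dst = Path(tempfile.mkdtemp()), Path(tempfile.mkdtemp())
(src / "edit.prproj").write_bytes(b"timeline data")
shutil.copytree(src, dst, dirs_exist_ok=True)
assert verify_mirror(src, dst) == []           # copies match
(dst / "edit.prproj").write_bytes(b"bit rot")  # simulate a corrupted backup
assert verify_mirror(src, dst) == ["edit.prproj"]
```

The same check extends to a cloud target: many object stores expose checksums on stored objects, so you can compare hashes without re-downloading the data.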

SAN/NAS/DAS
The vast majority of all high-end collaborative editing workflows use these storage platforms due to the required performance, scale, and number and quality of streams to support. Like Stella Artois, they are reassuringly expensive.

My recommendation: Size your platform for the required throughput first and then for capacity. This tier of storage should be kept lean and mean, with content not in use being automatically migrated to more appropriate second- and third-tier storage platforms (private cloud). Unless the workflow requires editing on growing files, maybe consider ingesting to the second tier of storage (on-prem object storage or cloud … typically more robust) and only move media to production storage once it’s needed.

Cloud
When it comes to storage, the cloud comes in many flavors — far too many to cover in detail here. So I will summarize:

  • Public cloud storage is flexible, scalable and provides myriad workflow options that can be turned on and off without you putting a single bit of equipment in your facility. However, the hyperscaler offerings are more unpredictable in terms of managing cost than the weather in Wales (my homeland). If you use it for the right things and manage your usage well, then there should be no cold, sharp showers in July. Sadly, human nature is not to do that, and thus some get shocked back on-prem. Also, dev-ops or integrator services are often needed to make public cloud storage work and make it secure. This is not trivial.
  • Private cloud (object storage) provides all the benefits of on-prem storage with more predictable performance and costs. Some solutions add valuable benefits, such as processing the content where it lives (metadata indexing, etc.), audits, immutability, etc. Scaling up and down in a hurry is not so easy and, of course, it still needs to be backed up somewhere.
  • Managed storage services are the concierge services of the storage industry. Typically, the service provider will hold your hand throughout the lifetime of the relationship and often provide other services to manage your content.

My recommendation: In my opinion, hybrid cloud storage is where it’s at right now. Keeping just the right amount of on-prem storage and then tiering the rest of the content to some form of cloud offering (public or managed services) should provide tighter control over your precious data.

Choosing an archive platform that offers more than just storage is essential. The ability to deploy data services where the data lives, as well as all the expected functionality (DR, audits, immutability, etc.), will bring operational and financial benefits to your organization.

LTO
I’m not a fan of LTO for post workflows. Sue me. But when preparing your filing, consider that even the most ardent LTO fan will concede that disk-based archives are needed when working with content that might need frequent access. If you have fire-and-forget content, then maybe LTO is for you, but let’s be honest: If it’s fire-and-forget content, then why keep it at all? #controversial

Semi-joking aside, as the importance of sustainability increases in our industry, the need for technology that uses less power will be more in demand. Increasingly, people will need to live with the trade-offs when it comes to accessing all content instantly.

Other Reasons to Be Cheerful
There are some less obvious risks to be aware of when protecting your content:

  • No more content jails! — Make sure your content (and metadata) is stored in nonproprietary formats with their original file names and folder structures. Open access to content within an organization is crucial to ensuring data can be used, reused and used again by all workflows.
  • People! — It is a risk to protect your content by relying on custom scripts that are created and managed by a single person. This happens a lot. Home-grown solutions inevitably lead to disaster. Buy a supported product, even if it costs a little more, and you will sleep better.
  • Industry-focused vendors – They are preferable to vendors that sell and run. Build a relationship with your vendor, and ask to meet the operations and tech teams if and where possible. Buying a lot of storage cheap and getting terrible customer service is a bad thing. Don’t do it.
  • I want it all – Storage vendors in the M&E industry that claim to provide a single storage platform to cover all your workflows tend to come, go and not come back. Be wary of putting all your eggs in one basket!
  • You’ve got me, but who’s got you? — Every storage system needs a remote backup, even public cloud. If you are using single-region cloud services, then you, too, are vulnerable to your data being unavailable at some point. Multi-cloud, business continuity and disaster recovery services cost more, but then so does the impact of not being able to work at all.
  • Sustainability, as mentioned, is rightly coming into focus. Should all data always be available? Is that a responsible approach if it means keeping servers on 24/7/365?
  • When procuring your next storage platform, be sure to ask your supplier what they are doing on the sustainability front.

 

MAMs, DAMs and PAMs
As mentioned upfront, finding content is supremely important, whether monetizing archives or collaborating with global teams. The integration of AI (artificial intelligence) into workflows to assist in describing the content is also making that job more achievable. However, not everyone is ready for the benefits AI can bring, and there is a need to balance cost with the realistic need to find and use that content. My ultimate advice here is to find a platform that is simple, easy to use, meets your general requirements and does not confine your data in a content jail. Easier said than done, but those solutions are out there.

My 2 Cents
Storage platforms will inevitably become invisible, as the interface to your content will be the key to your success, and AI services, built into the fabric of your storage infrastructure, will be responsible for many creative workflows — from helping you find your content to creating news timelines from archive footage based on public sentiment and recent events.

Alongside that, I predict that many will use managed services to offload the responsibility for managing large data repositories, freeing up valuable resources to work on more creative endeavors.

Until then, my general advice when choosing storage (likely to be a mix of all the above) or asset management solutions is to work with an integrator or reseller you trust. They should know enough about your organization and business requirements to recommend a platform that best suits your needs. This should take the entire workflow into account, not just the storage requirements. If your business is too small to engage with an integrator, then speak to your peers at other companies to learn how they are handling things.

The great thing about this industry is that there will always be someone willing to help. You just need to ask!

*Anecdotal – based on 20 years in the storage and archiving industry. Happy to argue it.

Sound Lounge: Two Mixers, a Sound Designer and 25 Years

Marshall Grupp

By Marshall Grupp

I can vividly recall the day I visited Tom Jucarone’s studio at East Side Film and Video as if it were yesterday, asking him if he’d be up for creating a unique audio post space unlike any we had seen before. Once he agreed, Peter Holcomb joined the team, and, as the saying goes, the rest is history.

Sound Lounge was born with a clear vision – to establish studios that offered a welcoming ambiance with lots of natural light, high ceilings and top-notch technology. Our belief was simple: We wanted our clients to feel at home while we helped them showcase their stories in the best way possible.

Our paths to this venture were unique. Tom and Peter had years of collaboration on top-tier advertising projects and Super Bowl spots, whereas my journey into 30- and 60-second commercials started after being a sound editor for feature films and TV shows. Our paths serendipitously intersected during a project for a Coke campaign, marking the inception of our collaboration.

From the outset, my partners and I made a point of recognizing and nurturing talent, a key factor in our lasting success. Numerous artists who took their first steps in the industry at Sound Lounge have gone on to become some of the most successful audio mixers in New York City. Reflecting on my 40-plus years in this industry, I have always believed in the importance of versatility as an audio engineer. For example, Pete Crimi, one of our mixers, has demonstrated this by contributing to both Super Bowl commercials and the HBO series How To With John Wilson.

What Sound Lounge looked like in the late ’90s.

Throughout our 25-year journey, I have often reflected on how we evolved from simply mixing commercials to providing extensive audio production services for television series like The Bear and The Crowded Room, as well as movies such as The Place Beyond the Pines, RBG and Theater Camp. It was always my belief that Sound Lounge was meant to handle all things related to sound.

I was determined not to limit us to being seen as solely a facility for commercials.  Over the years, we started new divisions and added services to our portfolio. In 2002, we introduced radio production and a casting division, assisting clients in voice-over talent selection. Drawing from my experience in long-format content, we took another significant step in 2005 by constructing a Dolby-certified theater. Seeking to extend our reach beyond New York City, we created the technology used to establish Sound Lounge Everywhere, a remote studio located in Boston. This technology allowed us to swiftly adapt when the country faced a sudden shutdown on March 13, 2020. Our expertise in remote services enabled us to resume operations almost immediately, which is a testament to our long-standing leadership and adaptability in the field.

Sound Lounge throw-back photo

In our line of work, it’s undeniably a “relationship” business. While possessing the creative and technical chops is crucial for handling the caliber of projects we undertake, it is equally important to recognize that we are entrusted with the creative visions of art directors, copywriters, film editors, directors and showrunners. These individuals invest months of effort into crafting their work, and we embrace opportunities to create and collaborate with them.

At the onset of my career, I wasn’t particularly inclined toward the technical aspects of audio post. My career goes back to the days of Steenbecks, Moviolas, film splicers and 35 mag dub machines. Remember the days of videocassettes and DATs? Fortunately, Sound Lounge had Tom Jucarone at the helm. In addition to being an incredible audio engineer, he is always exploring and staying current with the latest technologies in the audio post world. From the very beginning, we focused on constructing a facility and have consistently invested in upgrades to maintain our position as an industry leader.

Present day

But our journey has not been without its trials, having weathered the real storms of 9/11, the 2008 recession, the impact of Hurricane Sandy, an unexpected steam pipe explosion on 5th Avenue and the multitude of challenges posed by the COVID-19 pandemic.

Finally, a few lessons learned: Employees first! Listen, learn and foster a culture that values your team. Success breeds from the inside out. Take chances! Not every new division or collaboration will be successful, but the majority of the time, the benefits outweigh the risks. You can teach an old dog new tricks.

It has been an incredible journey, and we are eager to embrace what lies ahead. Our primary focus moving forward is to safeguard the legacy we built, ensuring that, in another 25 years and beyond, Sound Lounge will still be serving our industry.


Marshall Grupp has been in audio post production for 45 years, winning many awards for his sound design work in films, television shows and commercials. He is managing partner and COO of Sound Lounge in NYC.

Source Elements’ New Streaming Suite for Remote Work

Source Elements is now offering the Source-Nexus Suite, an expansion of its Source-Nexus DAW audio-routing plugin that now includes high-quality audio and video streaming features for media pros.

The Source-Nexus Suite includes three main tools offering everything necessary for remote collaboration and review. Source-Nexus I/O is an audio input-output routing solution that integrates DAWs/NLEs seamlessly with other applications. Source-Nexus Review takes away the complexity of audio routing for remote collaboration, enabling secure review sessions. Source-Nexus Gateway ties the suite together, offering HD video conferencing with high-quality audio support and Ultra HD frame-rate streaming. This workflow keeps the focus on the project while improving the review process for engineers and clients alike.

Source Elements co-founder and head of innovation Robert Marshall says, “Source-Nexus Suite means that I, as a working sound engineer, can rely on sharing time-critical work with remote clients without worrying about compromising audio quality on consumer video conferencing systems.”

The new Source-Nexus Suite can serve as a stand-alone toolset, but when paired with Source Elements’ Source-Connect, it creates an environment for remote recording sessions to run smoothly with virtually no setup time, keeping clients and talent focused. The suite offers advanced remote reviewing and recording features, while the interoperability between DAWs and browsers keeps the workflow simple for recording engineers, helping media projects stay on time and within budget.

Source-Nexus Suite is tailored for media industry workstations, including those of video and picture editors. “Source-Nexus Suite made for very efficient virtual edit sessions on a feature film and web series when our team was spread out across the country,” says Suite One Productions editor Sarah Taylor, CCE.

Offering full support for Windows and macOS (both Apple Silicon and Intel), Source-Nexus Suite allows most DAWs and NLEs to work as a high-quality remote production suite featuring HD video and audio communications over a browser.

Source-Nexus Suite is available now, with monthly subscriptions starting at $11.95 with no commitment. All subscribers get access to early releases and ongoing features, including the soon-to-be-released Source-Nexus Router software for unlimited desktop routing. Those with existing licenses can upgrade for a special price.

Some Key Features
● Up to five participants in remote sessions
● Expands to 25+ participants when paired with Source-Live Low Latency
● Better-than-broadcast-quality audio
● High-frame-rate video review and approval
● Video chat with isolated audio in addition to dedicated stereo review stream
● A built-in local recorder
● Seamless integration with Source-Connect