Author Archives: Dayna McCallum

Cinnafilm’s PixelStrings Integrates With Perifery’s Object Matrix

Cinnafilm and Perifery, a division of DataCore, have announced a new partnership that makes Perifery’s object storage solutions, Object Matrix and Swarm, available alongside Cinnafilm’s media transformation platform, PixelStrings. The integration allows users to seamlessly convert and upscale their media assets stored in Object Matrix within the PixelStrings platform.

Object Matrix is a media-focused object storage platform that benefits organizations by modernizing video workflows. It delivers financial and operational efficiency for multiple media-based workflows by providing instant, secure access to all media archive content from anywhere.

PixelStrings can be integrated and configured wherever an organization stores its media assets: on-premises, data center, private cloud or public cloud. Built on the foundation of Cinnafilm’s 20 years of industry experience, PixelStrings offers applications such as Tachyon and Dark Energy.

Users benefit from the PixelStrings/Object Matrix integration through more efficient asset digitization and advanced storage features, such as scalable archiving and powerful categorizing and search tools. Perifery users’ assets converted through PixelStrings are securely stored and preserved in Object Matrix.

“We’re very pleased to add the option of using Object Matrix for our existing and future customers,” says Dominic Jackson, VP of products and services at Cinnafilm. “PixelStrings is a solution for standards conversion and other media transformation processes and, as such, is designed to work with our customers’ preferred storage platforms. Adding Object Matrix to our network brings added value to our customers, who are looking to enhance both their assets and their workflows.”

Mark Habberfield, senior solutions architect at Perifery, adds, “The combination of Object Matrix and PixelStrings allows media teams to optimize the management of those valuable video assets. Integrating with PixelStrings was a no-brainer; their aim is to enhance the quality of media, and ours is to maintain that media’s integrity. Regardless of file size or specification, our connected solution allows users to confidently convert, store and access their media files from anywhere.”

Sphere: HPA Awards Honor Creativity and Innovation Winner

The HPA Awards committee has announced that Sphere, the next-generation entertainment medium that recently opened in Las Vegas, will be the recipient of the Judges Award for Creativity and Innovation for 2023. The honor will be presented to the Sphere Entertainment Co. team during the HPA Awards gala on November 28 at the Television Academy’s Wolf Theater in Los Angeles.

David Dibble, CEO of MSG Ventures, a division of Sphere Entertainment focused on developing advanced technologies for live entertainment, says, “Every aspect of Sphere — from the technology we created to capture content to how we tell stories in this new, multi-sensory venue — is a giant leap forward, and we are excited by the extraordinary possibilities this new medium offers. The entire Sphere team is honored to accept this award from HPA, and we look forward to continuing to push our industry forward through the innovative work we are doing at Sphere.”  

Sphere offers multi-sensory immersive experiences across film, concerts and major events. To create these exclusive experiences, the in-house immersive content studio, Sphere Studios, develops and uses proprietary and cutting-edge technology and tools.

The world’s largest spherical structure, Sphere enables audiences to share immersive experiences at a never-before-seen scale. Sphere is powered by bespoke technologies, including its interior LED display – which at 16K x 16K resolution is the world’s highest resolution LED screen. The screen wraps up, over and around the audience to create a fully immersive visual environment. To capture the images and video required for this unique canvas, the team at Sphere Studios developed Big Sky, a groundbreaking ultra-high-resolution camera system and custom content creation tool, capable of capturing incredibly detailed, large-format images. In addition, the venue’s immersive sound system, Sphere Immersive Sound powered by Holoplot, is the world’s most advanced concert-grade audio system, ensuring crystal-clear, individualized sound within its unique curved design.

Sphere’s technological accomplishments also extend to its exterior, the Exosphere, which features 580,000 square feet of fully programmable LED panels capable of displaying a myriad of visuals.

Sphere opened with the residency of U2, featuring U2:UV Achtung Baby Live At Sphere, a spectacular showcase blending the band’s live performance with captivating visual elements. The first film ever created for Sphere, Darren Aronofsky’s Postcard from Earth, debuted as part of The Sphere Experience, a two-part multisensory journey that marks an inventive approach to content creation and storytelling.

The HPA’s Judges Award for Creativity and Innovation recognizes companies and individuals who exhibit outstanding contributions in the realm of creative storytelling and technical innovation. The honor is not awarded annually but is bestowed at the discretion of the jury. This year’s jury, co-chaired by Carolyn Giardina and Joachim Zell, also included Paul Debevec, Jay Holben, Joanne Kim and Karen Raz.

Previous recipients of the HPA Award for Creativity and Innovation include Peter Jackson for They Shall Not Grow Old, NASA and AWS for the first live broadcast from space, the Together at Home broadcast held during the pandemic, and David France and Ryan Laney for the groundbreaking use of technologies in Welcome to Chechnya.

Tickets for the event are available at hpaonline.com.

Harbor Launches Music Supervision Services for Advertising

Post studio Harbor has launched music supervision services to complement its advertising sound capabilities. Music supervision is the latest addition to Harbor’s list of existing advertising capabilities for live action, VFX, design, creative editorial, voiceover casting, ADR, sound mixing, color grading and finishing.

The service will provide holistic music solutions for clients and will be integrated into Harbor’s existing sound offerings. Music supervision capabilities will include creative search and music direction, original composition, budget planning and management, license negotiation and clearance, and sonic branding.

Harbor has partnered with award-winning music supervisor and sonic strategist Mike Boris to develop the offering. The company notes that collaborating with highly demanding global brands guides Boris’ diversified approach. His portfolio includes work for many of the world’s top brands, including Mastercard, Coke, Microsoft, Wendy’s, L’Oréal, Bloomingdale’s, Verizon, AT&T, Intel, Ford, Amazon and Nike.

Lauren Boyle, senior producer of commercial sound for Harbor, says, “I’m thrilled to be expanding our services at Harbor’s sound department. Adding music supervision and partnering with someone as talented as Mike is the latest addition in our relentless pursuit of bringing all craft disciplines together under one roof to enhance the creative experience. Expanding into the music realm is the next step in our mission to be a one-stop shop for all our clients’ audio needs. Having these disciplines under one roof allows for greater creative control when it comes to integrating music, casting, sound design, edit and mix. It’s not just about adding music; it’s about delivering a complete sound experience.”

Main image: L-R: Harbor’s Steve Perski, Mike Boris and Lauren Boyle

LucidLink Launches Multi-Filespace Connect for Desktop App

LucidLink, makers of real-time remote collaboration software for creative teams, has unveiled a new product feature, Multi-Filespace Connect. Driven by customer feedback, the feature lets users connect to multiple Filespaces simultaneously within the newly redesigned LucidLink Desktop application.

“We believe that introducing the Multi-Filespace Connect feature in our desktop app represents a major step forward in convenience and efficiency for our users,” says Kalina Tsoneva, senior product manager of LucidLink. “This feature greatly simplifies the movement of assets across various projects or into archives, and it significantly improves workflows for those handling tasks across different Filespaces. It will make a real difference in the day-to-day experience of our users, offering them a smoother and more convenient way to manage their projects and data life cycles.”

Key features of Multi-Filespace Connect include:

● With Multi-Filespace Connect, users can effortlessly connect to multiple LucidLink Filespaces without needing to disconnect a Filespace first, improving workflow efficiency for both creative professionals and the administrators who support them.

● Multi-Filespace Connect maintains LucidLink’s support for Single Sign-On (SSO) integrations, now allowing for simultaneous integrations with multiple LucidLink Filespaces.

● The LucidLink Desktop application has undergone a comprehensive redesign, presenting a unified interface that consolidates numerous LucidLink functions.

The Multi-Filespace Connect update is free, and it is now available within the LucidLink application.

LucidLink also announced that it has raised $75 million in Series C funding, including secondaries, led by Brighton Park Capital, an investment firm focused on entrepreneur-led, growth-stage companies in the software, healthcare and tech-enabled services sectors. Major existing investors, including Headline, Baseline Ventures and Adobe Ventures, also participated.

May December Editor Affonso Gonçalves Talks Workflow

By Iain Blair

Writer/director Todd Haynes, who was Oscar-nominated for his ‘50s romantic drama Far from Heaven, has always been drawn to classic melodrama and period pieces that examine provocative issues. His new film, May December, tells the story of a shocking affair between 36-year-old Gracie (Julianne Moore) and 13-year-old Joe (Charles Melton). But it’s the fallout from the salacious tabloid-ready romance that Haynes is most interested in. Some 20 years later, Gracie and Joe now lead a seemingly picture-perfect suburban life. But their domestic bliss is disrupted when a famous actress (Natalie Portman) arrives to research her upcoming film role as Gracie.

The film was edited by Affonso Gonçalves, Haynes’ go-to editor who also cut Carol, Wonderstruck and The Velvet Underground. I spoke with Gonçalves, whose credits include True Detective, Winter’s Bone and Beasts of the Southern Wild, about the project’s challenges and workflow.

You’ve cut six of Todd’s projects, starting with Mildred Pierce. But this was a very different type of project, so how did you collaborate on May December?
Usually what happens with Todd, and it happened with this film, is that as soon as he has a script, he sends it my way. If I step back a little bit, what happened was that when we were cutting the Velvet Underground documentary, he was preparing to do this film about Peggy Lee. That eventually didn’t happen, and then this script came up.

He was talking with Natalie Portman about other projects, but she had this one she offered to him, and he really liked it. He sent the script to me, I read it, and I thought it was great. And it came about pretty fast. He sent it to Julianne Moore, they found Charles Melton and that was that.

Then Todd did what he always does, which is work on this kind of image book. And that’s visually how he communicates with the DP, with the production designer, and he sends that to the main people involved. So I got that, and he was also talking about the films that Natalie was watching and stuff for this, and definitely the music from The Go-Between that ended up being a really big part of the creative process.

What kind of discussions did you have before you started work on this? Or did you just get the material and start cutting?

That’s basically what happens, because with Todd it’s not really a discussion about how it’s going to happen. We just talk about the film itself, and how it’s going to work. And Todd has a very specific way of working, which is he never, ever watches dailies. He only watches dailies if there’s a technical problem, or if there’s a performance issue or something like that.

As he doesn’t like to do dailies, I’m cutting everything at the same time and my assistant does DVDs of all the footage. By the time he’s finished shooting, he takes a little break, comes back and he watches all the footage by himself. He takes notes, then he sends the notes to me, and then I kind of do a little bit. I already, by the time he’s finished shooting, have a version of the film that I cut while he was shooting. But then he asks me to do a version based on his notes. And then from that point on, we start working together.

So you are not on the set at all?
Correct. For this one he was shooting in Savannah, and then we met up in Portland, Oregon, where he lives. Then we cut the film for the most part in Portland.

So what were the main challenges of cutting the film?
There were two main challenges, and I think I would say they’re fun challenges. One is the tone, and how to keep the right tone because it’s a story that unravels as it moves, and there are so many layers to it. And you have to start questioning yourself. You have your ideas of what these people are and what’s happening to them. And then slowly the more you know, you question your concepts of relationships, your moral integrity and their moral integrity. It’s really interesting. Plus, there’s humor. So how to keep [the right tone] for something that is almost like a thriller and also a melodrama, but it’s also a dark comedy. Keeping those tones moving in and out, and weaving them to make sense, that was a challenge.

The other challenge, obviously, was working on a soundtrack that already existed. The way we usually do it is, you cut the film, and then you come up with a temp track that kind of feels, that pushes, which is all motion. So I had to reverse engineer the use of it. That was an interesting challenge. And ultimately our composer did a really beautiful job in adapting and changing it a little bit to fit.

Todd told me you have “a great ear” for music and that you’re “very attentive to temp tracks and finding really useful music to cut to.” But this was very different, as composer Marcelo Zarvos adapted Michel Legrand’s score from the 1971 film The Go-Between. It’s like a very strong counterpoint to what we’re watching on screen, and it must have impacted your approach to the edit?
It did. I had to really listen to it and understand what it was doing for the film. It’s like, okay, this applies a bit differently. One part of the process that I love is doing the temp track and finding the music. And finding the music that maybe is not what you expect and maybe in tone. But for this one, Todd had very specific ideas [about how to use the score] that were even in the script, where he had a specific track number that was going to fit with a specific scene. So I just tried it, and sometimes it felt like, wait, this doesn’t seem right to me. But the more I worked on it, the more it was like, okay, now I understand what Todd wants from this.

I could adjust the cutting, or I could get another piece from another cue that actually enhanced the scene a little more. And then you really get to study that piece of music, as there’s a beginning and end, and I can just start a little further in, or end at this point. So you start really trying to maneuver the music to fit what I’m cutting.

Todd has always said he’s a very hands-on guy in the editing.
Yes, he’s right there. Basically, we worked nine to seven, five days a week. Sometimes we worked weekends too, because on top of cutting the film, we were actually cutting a short film for a showing of his films at the Pompidou in Paris. But yeah, he’s by my side from nine to seven. He has notes, and he knows the footage intimately because he studies it. He can be like, ‘On scene six, let’s take a look at take four, and how Julianne reads this line here,’ and that’s how it goes. It’s very, very detailed.

We used Avid Media Composer 2018, with QNAP as the server, and then we used Nextcloud to sync files. We had about 4TB worth of material.

What was the most difficult scene to cut and why?
I think the most difficult to cut was probably the scene where it’s the first time they all had dinner together, when Elizabeth comes to Gracie’s house, because that was a long dialogue scene. It was actually much longer than it is in the final film. And the challenge was really when to be with whom, because of the coverage. We have the medium closeup of the three of them and the boy, but then the boy leaves, and then it’s just the main wide front or the wide from the back. So it’s just basically finding a way to get to know everybody. And at some point, Joe doesn’t speak, but we have him listening to what was said, and it was so important when we realized we have to be on a silent Joe just reacting to what’s being said there.

It was a week of work just on that, and that’s the scene we probably went back to the most times, like, is this correct, is this precise? Do we have enough of Joe? Even though he doesn’t say too much, are we getting his reaction? There’s so much subtext in his reaction, because when he looks, he reacts to things. And he’s clenching his jaw so much that there’s an effect on the side of his head. So what’s funny was actually using the stuff that’s not being said, and all the stuff being reacted to is where we really had to be very precise. We kept going back to the scene many, many times.

You’ve worked with Todd a long time now. How do you sum this one up compared with some of the other projects you’ve done?
Wow, that’s an interesting question. It was fun and interesting to do something that has more humor than we used to have [in our films]. To play with humor is something that was kind of new for the two of us, and that was exciting. Even though it is a dark story ultimately, I think to be able to play up the humor and understand how to use it was a great experience.


Industry insider Iain Blair has been interviewing the biggest directors in Hollywood and around the world for years. He is a regular contributor to Variety and has written for such outlets as Reuters, The Chicago Tribune, The Los Angeles Times and the Boston Globe.

Anniversary

Cheers to 10 Years, An Anniversary to Remember…!

By Randi Altman and Dayna McCallum

It doesn’t seem possible that 10 years have gone by… it passed in the blink of an eye. A huge thank you to the Platinum Sponsors of our 10th Anniversary celebration — Dell Technologies & Signiant; our Gold Sponsors — Blackmagic Design, The Studio – B&H, Maxon, Pixotope and Puget Systems; and our Silver Sponsors — Grithaus Agency, Artisans PR, Bubble Agency & Raz PR!

With gratitude to all of the wonderful sponsors we’ve partnered with over the last 10 years. And the biggest thank you of all is saved for our readers — who made our 10 year anniversary possible!

Enjoy just a few fun pictures from the celebration. And Cheers to the NEXT 10 Years…!


SIGGRAPH 2023: Challenges in 3D Printing, Textile Production and Beyond

The research projects expected to debut as part of the SIGGRAPH 2023 conference Technical Papers program are “ones to watch” — exciting new technologies, ideas and algorithms that span all areas of graphics and interactive techniques. Particularly noticeable this year is the emergence of research methods that extend beyond the digital world and address the creation of real-life content.

“A lot of research in computer graphics has been about determining the best way to visualize various real-world phenomena using computers, and in doing so, there are often details that are elided to fit such complicated things into the computational framework,” says Jenny Lin, a lead author of one of the featured new research projects that will be showcased at SIGGRAPH 2023. Lin and her collaborators have devised a formal semantics framework for machine knitting programs, applying mathematics to describe anything a knitting machine can make.

“Going from virtual representations to the physical world involves addressing details that we often take for granted when going about our daily lives,” she adds. “There’s a very natural connection between these two directions, and there’s something very gratifying about using the language of computers to understand and improve something as tactile and grounded as knitting.”

As a preview of the Technical Papers program, here is a sampling of three novel computational methods and their unique approaches to real-world applications.

Equipped to Make It Fit
Additive manufacturing, better known as 3D printing, brings digitally designed products to life. It allows for unprecedented freedom in 3D geometries and gives manufacturers the ability to produce parts on demand and locally. With 3D printing, supply chain management can be simplified, and it is easier to transition from engineering iterations to full manufacturing.

Left: The benchmark densely packed into a cuboid with a packing density of 35.77%; the packing is free of interlocking. Right: A zoomed-in view highlighting densely packed objects. “Dense, Interlocking-free and Scalable Spectral Packing of Generic 3D Objects” © 2023 Cui, Rong, Chen, Matusik

3D printing is undergoing a transition from being a prototyping technology to being a manufacturing technology. However, the main roadblock is the overall cost of the manufactured part. 3D printing hardware, materials, and human labor all drive the cost of the technology. The drive for higher cost efficiency requires printing in batches where parts are tightly packed in the 3D printer’s build volume to maximize the number of printed parts per batch. One of the main limitations of this process is the limited utilization of the build volume due to the computational complexity of the packing process.

In a collaboration between MIT and Inkbit, a 3D manufacturer specializing in polymer parts, researchers are addressing the complex problem — and headache — of digitally packing many parts into a single container under multiple constraints. To date, many part models are virtually placed in the printing tray, a process referred to as “nesting,” and the printer then prints the whole tray. The problem with this process is that the container isn’t densely packed, and there isn’t an efficient method to automate and ensure that 3D printers are printing the maximum volume of parts in a designated container.

The team of researchers, led by Wojciech Matusik, CTO at Inkbit and professor of electrical engineering and of computer science at MIT, developed a novel computational method to maximize the throughput of 3D printers by packing objects as densely as possible and accounting for interlocking-avoidance (between many parts with different shapes and sizes) and scalability. Their approach leverages the Fast Fourier Transform, or FFT, a powerful algorithm that has made it possible to quickly perform complex signal processing operations that were previously impossible or prohibitively expensive.

Coupled with FFT, “our work is making the individual placement of a 3D part into a partially filled build volume as fast as possible,” says Matusik. “Our algorithms are not only extremely fast but they can now achieve print volumes with much higher densities (40% or more). The higher print efficiency will unlock lower cost of parts manufactured.”
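The core spectral trick can be sketched in a few lines of Python: voxelize the build volume and the candidate part as occupancy grids, then use an FFT cross-correlation to test every possible placement for collisions at once. This is a minimal illustration of the general idea under assumed 0/1 occupancy grids, not Inkbit’s implementation; the function name and interfaces are hypothetical, and the paper’s interlocking-avoidance and scalability machinery is omitted.

```python
import numpy as np

def feasible_placements(container_occ, part_mask):
    """Return a boolean grid of collision-free offsets for placing a
    voxelized part into a partially filled build volume.

    container_occ: 3D 0/1 array, 1 = voxel already occupied.
    part_mask:     3D 0/1 array, 1 = voxel belonging to the new part.

    Correlating the two occupancy grids counts, for every candidate
    offset, how many part voxels would collide with existing material.
    Doing the correlation via FFT evaluates all offsets at once in
    O(n log n) instead of testing each placement separately.
    """
    full = [c + p - 1 for c, p in zip(container_occ.shape, part_mask.shape)]
    # Cross-correlation = convolution with a flipped kernel.
    F = np.fft.rfftn(container_occ, full)
    G = np.fft.rfftn(np.flip(part_mask), full)
    overlap = np.fft.irfftn(F * G, full)
    # Keep only offsets where the part stays fully inside the container.
    inside = tuple(slice(p - 1, c)
                   for c, p in zip(container_occ.shape, part_mask.shape))
    return overlap[inside] < 0.5  # ~0 overlapping voxels => collision-free
```

Greedily dropping each part at, say, the lowest feasible offset and updating the occupancy grid would yield a simple batch packer; the published method layers interlocking checks and further optimization on top of this core query.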

Lead author Qiaodong Cui is set to present the new work at SIGGRAPH 2023. The team also includes Victor Rong of MIT and Desai Chen, a research engineer at Inkbit. Visit the team’s page for the full paper and video.

The Refined Knitting Machine
Some may say that knitting is a relatively easy technique, or craft, to learn — and could even serve as a relaxing stress-reducer. Automating the needle-and-yarn technique with machine knitting is well established in the fashion and textiles industries and has seen a recent surge in popularity due to increased understanding of the scope and complexity of the objects — fabrics, patterns — that can be automatically generated.

While the technique of knitting has been automated, existing systems still struggle to support everything a knitting machine can make and to generate precisely what a user wants. To date, say the researchers, no system guarantees correctness across the complete scope of machine knitting programs.

“Semantics and Scheduling for Machine Knitting Compilers” © 2023 Lin, Narayanan, Ikarashi, Ragan-Kelley, Bernstein, McCann

A multi-institutional team of computer scientists from Carnegie Mellon University, MIT and the University of Washington has created a novel computational framework to optimize machine knitting tasks. Their formal semantics for the low-level domain-specific language used for knitting machines provides a precise definition of correctness on the exponentially large space of knitting machine programs.

The researchers applied knot theory to develop their new framework and addressed the key properties humans care about in knitting that are poorly captured by existing concepts from knot theory. To that end, they devised an extension to knot theory called “fenced tangles” as a mathematical basis for defining machine knit object equivalence.

Our method “can describe anything a knitting machine can make: not just your standard sweaters and hats, but also dense, shaped structures useful in architecture, and multi-yarn structures that allow for colorwork and soft actuation,” says Jenny Lin, the paper’s lead author and PhD student at Carnegie Mellon in the lab of James McCann, assistant professor of robotics at Carnegie Mellon and another author of the work.

She adds, “This is important, because as we develop more nuanced systems for generating more complicated knitting machine programs, we can now always answer the question of whether two machine knit objects — the object you want and the object your program makes — are truly the same.”

As a proof of concept, the team has implemented a foundational computational tool for applying program rewrites that preserve knit program meaning. This approach could be expanded for characterizing machine knitting to hand knitting, which is both more flexible and variable as a fabrication technique.

The team behind “fenced tangles” also includes Vidya Narayanan, an applied scientist at Amazon who was advised by McCann at Carnegie Mellon; Yuka Ikarashi, a PhD candidate at the MIT Computer Science & Artificial Intelligence Laboratory; Jonathan Ragan-Kelley, also of MIT CSAIL; and Gilbert Bernstein, assistant professor of computer science and engineering at the University of Washington. They will present their work at SIGGRAPH 2023. The paper and team page can be found here.

Linked and Characterized
Medieval chainmail armor, small metal rings linked together in a pattern to form a mesh, has been used for thousands of years as protective gear for soldiers in battle. Picture a knight in their metal “suit” wearing chainmail armor as an additional layer of protection. Fast-forward to the wide landscape of materials and fabrics in the modern era, and chainmail-like materials remain challenging to represent computationally while accounting for all of their unique mechanical properties.

An international team of researchers from ETH Zürich in Switzerland and Université de Montréal in Canada draws inspiration from medieval chainmail armor, generalizing it to the concept of discrete interlocking materials, or DIM.

“Beyond Chainmail: Computational Modeling of Discrete Interlocking Materials” © 2023 Tang, Coros, Thomaszewski

“These materials possess remarkable flexibility, allowing them to adapt to necessary shapes, while also demonstrating impressive strength beyond a certain range of deformation,” says Pengbin Tang, the lead author of the research and PhD student advised by Bernhard Thomaszewski, a senior scientist at ETH Zürich and adjunct professor at the Université de Montréal.

“These unique properties make DIM attractive in robotics, orthotics, sportswear and many other areas of application,” adds Stelian Coros, collaborator and head of the computational robotics lab (CRL) at ETH Zürich.

The researchers have developed a method for computational modeling, mechanical characterization, and macro-scale simulation of these 3D-printed chainmail fabrics made of quasi-rigid interlocking elements (the connectivity of rings or links in chainmail-like material).

A key challenge the new method addresses is accurately representing the deformation limits the quasi-rigid fabric exhibits as it bends, folds and adopts different shapes. Unlike conventional elastic materials, the mechanics of DIM are governed by contacts between individual elements. This particular structure leads to extremely high contrast in deformation resistance. To obtain the deformation limits of a given DIM, the researchers developed a computational approach involving thousands of virtual deformation tests across the entire deformation space.

The novel method offers an intuitive, systematic way to perform macro-mechanical characterization, which can pave the way to using DIM for garment design, note the researchers. Their analysis has largely focused on kinematic motion and, consequently, considers neither friction nor elastic deformations in the structure. In future work, an extension of their macro-scale model could account for internal friction to simulate friction-dominated scenarios, as well as explore geometric detail at the element level, which may be important for additional applications.

Pengbin Tang is excited to present this work at SIGGRAPH 2023. View the paper and video on the team page.

Each year, the SIGGRAPH Technical Papers program spans research areas from animation, simulation and imaging to geometry, modeling, human-computer interaction, fabrication, robotics and more. Visit the SIGGRAPH 2023 website to learn more about the program and for registration details.

Sony Adds 4K HDR Reference Monitor for Color Grading, Live Production, Post

Sony Electronics will soon offer a 30.5-inch 4K HDR professional monitor for critical evaluation, color grading, live production and post. The BVM-HX3110 has a Sony-designed, dual-layer, antireflection LCD panel with Sony proprietary signal processing, which Sony says supports a higher peak luminance of up to 4,000 cd/m² while maintaining deep, no-compromise blacks.

Along with characteristics such as accurate color reproduction, picture consistency and precision imaging, which are hallmarks of Sony’s BVM-series of monitors, the BVM-HX3110 offers brighter specular highlights and introduces an optional new fast pixel response mode for reduced motion blur. It also provides a wider viewing angle and a standard IP interface for SMPTE ST 2110 signals to complement Sony’s Networked Live ecosystem. The established BVM-HX310 remains available as a companion model to the BVM-HX3110, offering consistency in color reproduction, gamma curve and operation.

“Content creators are always seeking tools that help to accurately match and represent their creative vision, and the BVM-HX3110 does just that,” says Ellen Heine, marketing manager at Sony Electronics. “We continue to act on customers’ feedback, which is why this new model offers a host of new features, including an IP interface for enhanced flexibility. And of course, in keeping with Sony’s monitor design philosophy, it color-matches with our most popular professional monitors.”

The monitor’s standard toolset incorporates waveform monitor/vectorscope, false color, focus assist, closed captioning, 3D LUT processing, and quad and side-by-side viewing modes, among other features. Besides the option for fast pixel response, other supplemental benefits through optional licenses include support for JPEG XS, Simple Network Management Protocol (SNMP), HDR/SDR conversion and a user 3D LUT signal output.

The BVM-HX3110 uses the same color gamut as and works seamlessly with several other Sony monitors, such as the BVM-HX310 and the PVM-X and LMD-A series monitors. This includes the just-announced LMD-A180, an 18.4-inch HD HDR high-grade picture monitor with a wide color gamut. Ideal for on-set monitoring, the LMD-A180 can also be rack-mounted for general monitoring purposes. It replaces the LMD-A170 monitor.

Sony displayed a prototype of the BVM-HX3110 at NAB 2023 and expects to release the product in November. The LMD-A180 release is planned for the fall.

Meet the Artist Podcast: Your Honor’s Audio Post Team

postPerspective, which is currently celebrating its 10th year covering the industry, is proud to announce the launch of our new podcast. Found wherever you get your podcast content — Apple, Spotify, iHeart and more — our Meet the Artist podcast series features creatives not only talking about their recent work, but also sharing their personal journey of making it in the industry.

Our premiere episode features our own Randi Altman talking with re-recording mixer/supervising sound editor Jon Greasley and re-recording mixer/sound designer Dan Gamache of King Soundworks, the audio post team behind Season 2 of Showtime’s Your Honor, which stars Bryan Cranston as a disgraced New Orleans judge. You’ll hear them go into detail about heightening sounds and painting a picture of desperation and intrigue with sound, while also talking about their workflow.

Their goal was, as Gamache says, “to do the writing of the show justice.” Greasley adds that they went for “this richness and fullness and level of detail” when creating the sound of Season 2.

For scenes in the French Quarter, the sounds of Your Honor are often heightened: for example, when Cranston’s character, Michael, walks down Bourbon Street at night, alone and totally in his head. “We like to play with the dynamic of it,” explains Greasley. “But there are times when we do that and times where we don’t, so it creates more of a juxtaposition of when he’s alone or when he’s in his head. And it offers different levels of detail and different states of isolation when he’s also being surrounded by the hustle and bustle.”

And stay tuned for our next Meet the Artist episode, where we “meet” senior colorist Jill Bogdanowicz from Company 3.

Now take a listen to episode one of our new podcast — Meet the Artist with Jon Greasley and Dan Gamache… here.

Quick Chat: Monkeyland Audio’s Trip Brock

By Dayna McCallum

Monkeyland Audio recently expanded its facility, including a new Dolby Atmos equipped mixing stage. The Glendale-based Monkeyland Audio, where fluorescent lights are not allowed and creative expression is always encouraged, now offers three mixing stages, an ADR/Foley stage and six editorial suites.

Trip Brock, the owner of Monkeyland, opened the facility over 10 years ago, but the MPSE Golden Reel Award-winning supervising sound editor and mixer (All the Wilderness) started out in the business more than 23 years ago. We reached out to Brock to find out more about the expansion and where the name Monkeyland came from in the first place…

One of your two new stages is Dolby Atmos certified. Why was that important for your business?
We really believe in the Dolby Atmos format and feel it has a lot of growth potential in both the theatrical and television markets. We purpose-built our Atmos stage looking towards the future, giving our independent and studio clients a less expensive, yet completely state-of-the-art alternative to the Atmos stages found on the studio lots.

Can you talk specifically about the gear you are using on the new stages?
All of our stages are running the latest Avid Pro Tools HD 12 software across multiple Mac Pros with Avid HDX hardware. Our 7.1 mixing stage, Reposado, is based around an Avid Icon D-Control console, and Anejo, our Atmos stage, is equipped with dual 24-fader Avid S6 M40 consoles. Monitoring on Anejo is based on a 3-way JBL theatrical system, with 30 channels of discrete Crown DCi amplification, BSS processing and the DAD AX32 front end.

You’ve been in this business for over 23 years. How does that experience color the way you run your shop?
I stumbled into the post sound business coming from a music background, and immediately fell in love with the entire process. After all these years, having worked with and learned so much from so many talented clients and colleagues, I still love what I do and look forward to every day at the office. That’s what I look for and try to cultivate in my creative team — the passion for what we do. There are so many aspects and nuances in the audio post world, and I try to express that to my team — explore all the different areas of our profession, find which role really speaks to you and then embrace it!

You’ve got 10 artists on staff. Why is it important to you to employ a full team of talent, and how do you see that benefiting your clients?
I started Monkeyland as primarily a sound editorial company. Back in the day, this was much more common than the all-inclusive, independent post sound outfits offering ADR, Foley and mixing, which are more common today. The sound editorial crew always worked together in house as a team, which is a theme I’ve always felt was important to maintain as our company made the switch into full service. To us, keeping the team intact and working together at the same location allows for a lot more creative collaboration and synergy than, say, a set of editors all working by themselves remotely. Having staff in house also allows us flexibility when last-minute changes are thrown our way. We are better able to work and communicate as a team, which leads to a superior end product for our clients.

Can you name some of the projects you are working on and what you are doing for them?
We are currently mixing a film called The King’s Daughter, starring Pierce Brosnan and William Hurt. We also recently completed full sound design and editorial, as well as the native Atmos mix, on a new post-apocalyptic feature we are really proud of called The Worthy. Other recent editorial and mixing projects include the latest feature from director Alan Rudolph, Ray Meets Helen, the 10-episode series Junior for director Zoe Cassavetes, and Three Days To Live, a new eight-episode true-crime series for NBC/Universal.

Most of your stage names are related to tequila… Why is that?
Haha — this is kind of a take-off from the naming of the company itself. When I was looking for a company name, I knew I didn’t want it to include the word “digital” or have any hint toward technology, which seemed to be the norm at the time. A friend in college used to tease me about my “unique” major in audio production, saying stuff like, “What kind of a degree is that? A monkey could be trained to do that.” Thus Monkeyland was born!

Same theory applied to our stage names. When we built the new stages and needed to name them, I knew I didn’t want to go with the traditional stage “A, B, C” or “1, 2, 3,” so we decided on tequila types — Anejo, Reposado, Plata, even Mezcal. It seems to fit our personality better, and who doesn’t like a good margarita after a great mix!

DaVinci Resolve Studio Used for Data Management on Shin Ultraman

DaVinci Resolve Studio was used for data management on the hit movie Shin Ultraman. A Blackmagic Pocket Cinema Camera 4K digital film camera, as well as an UltraStudio Mini Monitor playback device, were also used on the production.

Shin Ultraman is a film based on the popular tokusatsu (special effects) drama Ultraman, which was first broadcast in 1966. The film was planned, written and supervised by filmmaker/anime creator Hideaki Anno, an avowed fan of the Ultraman series. The film was directed by Shinji Higuchi, who has worked with Anno on numerous films, including the hit film Shin Godzilla. Shin Ultraman is an entertaining work that can be enjoyed by any generation, as it includes many scenes which are an homage to the original work.

DaVinci Resolve Studio was used for data management, QC and creating offline material during filming. “DaVinci Resolve is an easy and reliable data management hub. For data copying, I used Resolve’s clone tool, and I played back the copied footage in Resolve for quality checks. Since DaVinci Resolve is an application that can be used for finishing, I was more confident in my quality checks. I also used an UltraStudio Mini Monitor for monitoring, not only for this film but also for many other works,” said Takuto Watanabe, DIT of the film.

After quality checks, Watanabe created offline and rush material. A day’s worth of recorded data from the film came to roughly one terabyte, and since it also included relatively low-bitrate footage from iPhones, that could represent as much as four to five hours of filming.

Watanabe said, “In addition to the main cameras, we used a variety of other cameras for this film. And there are cameras that do not embed reel names in the metadata, so I put reel names on the footage from those cameras. In creating the offline material, I burned the timecode and other information onto the footage. DaVinci Resolve allowed me to set up custom settings for data burning very easily.”

Watanabe continued, “We started filming in 2019, but we needed to shoot additional material constantly. This film required many cameras as it needed to be shot with various angles and also used a Blackmagic Pocket Cinema Camera 4K for some tokusatsu shots. DaVinci Resolve was reliable as I could bring in my projects even after I updated the software to the latest version.”

Tsuburaya Production, a tokusatsu production company which is best known for producing the Ultraman series, released a number of short movies as part of the film’s promotion on its streaming platform Tsuburaya Imagination. The movies are called Shin Ultra Fight, and Watanabe completed post production, including color correction and online editing, using DaVinci Resolve Studio 18.

“These are a series of short movies using full CGI, and I did online editing for this project, not data management. What I mainly did was grading and adding some effects using ResolveFX. As I heavily added effects on CGI shots, I found artifacts on some shots, but I was able to smooth them out using the beauty or deband effects in Resolve. I also used Resolve’s new magic mask feature, and I could track the kaiju and aliens perfectly! DaVinci Resolve allowed me to do complex work in just one application, which resulted in shortening my working hours,” said Watanabe.

“By using DaVinci Resolve, we could meet the director’s demands immediately. That let him make judgements more quickly and allowed us to work effectively,” he concluded.

Content sponsored by Blackmagic Design. 

IDC-LA Meets Modern Media Processing Demands with AJA and Diskover Media Edition

Preparing new and legacy entertainment for global audience distribution has grown increasingly complex with the advent of remote workflows and the growth of media file sizes. Add security demands into the mix, and post production and digital media processing services, like those provided by IDC, are imperative. The bi-coastal company serves clients worldwide.

IDC-LA COO Rosanna Marino and director of production engineering Mike Tosti sat down with AJA recently to talk about their efforts to automate data processing, making content delivery and organization more efficient and economical for customers.

Tell us more about IDC.
Rosanna: IDC was founded in New York in 1984 and in 2019, expanded to include an office in Los Angeles, where Mike and I work. It’s an exciting time for our business in this city, and we’re proud of the team and infrastructure we’ve built. We operate collaboratively with the New York office, but also work with our own client base and structure.

Our LA facility is home to a beautiful digital intermediate (DI) theater and Dolby Atmos suite, where we mix for clients. We also handle massive volumes of media processing, leaning on our stellar team as well as cloud-based tools and automation technology to streamline the work. Our work spans episodic series and films, and we do a lot of master QCs. Another facility might be working on color and then send us the master file for a final check. A lot can happen in the render that’s easy to miss. Our job is to find any issues and fix them before the file gets delivered to the client.

IDC-LA’s Rosanna Marino, Mike Tosti

Across projects, how much media are you processing on average a month?
Rosanna: We live in a world where it seems there’s never enough content to satisfy consumer demand. As a result, new content is always being made, and more content providers are working on getting legacy content up on platforms for video-on-demand (VOD) viewing. Although the amount of data we’re processing each month varies, it’s not unusual for us to deliver more than 20,000 files a month to clients around the world using different platforms. Thankfully, picking the right tools and technology to enhance our business has helped us meet this demand and compete with bigger players, all with a smaller footprint.

What makes IDC unique from other players in the space?
Rosanna: We don’t believe there’s one set way of doing things; instead, we aim to strike the right balance between thinking outside the box and not reinventing the wheel. With simplicity and efficiency key, we’re continuously looking at new technologies like AJA Diskover Media Edition software that allow us to do things differently in a more streamlined way. Our approach to hiring is also unique in that we bring people on board to fill a specific need. We hand-picked everyone on this team for their specialties. With the help of Ryan, our director of emerging formats, we’re also working to automate tasks where possible.

What aspects of the job is IDC automating and why?
Rosanna: When projects come in, they go through a series of steps that will never change. Upon an asset’s arrival, we need to know where it lives and its contents. Using an intelligent workflow that we’ve built with Diskover Media Edition and MediaPulse, we’re able to automate this process. As a file arrives, we know it will land in a set folder, and we create a proxy. If the proxy passes, then it automatically goes into another folder, where another proxy is created. If the proxy fails, the system just stops. Our workflows dictate the initial path a file must take until a person steps in to set up profiles, create files, or complete QC.
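The pass/fail routing Rosanna describes can be sketched in a few lines of Python. This is a minimal illustration, not IDC’s actual Diskover Media Edition/MediaPulse pipeline; the folder names and the proxy check here are hypothetical stand-ins:

```python
# A minimal sketch of the pass/fail proxy routing described above.
# Folder names and the proxy check are hypothetical stand-ins, not
# IDC's actual pipeline.
from pathlib import Path
import shutil

LANDING = Path("landing")       # where new assets arrive
PASSED = Path("proxies_ok")     # next stop when the proxy succeeds
FAILED = Path("quarantine")     # the system stops here on failure

def make_proxy(asset: Path) -> bool:
    """Stand-in for the real proxy render: a real step would invoke a
    transcoder and check its exit code; here we only require a non-empty file."""
    return asset.stat().st_size > 0

def route(asset: Path) -> Path:
    """Move a newly arrived asset to its next folder based on the proxy result."""
    dest = PASSED if make_proxy(asset) else FAILED
    dest.mkdir(exist_ok=True)
    return Path(shutil.move(str(asset), str(dest / asset.name)))
```

A production version would watch the landing folder continuously and call a real transcoder; the point here is just the dictated path: land, attempt a proxy, advance on success, stop on failure.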

Please describe your workflows in more detail.
Rosanna: We create profiles and access them in our Telestream Vantage system, which helps accelerate turnaround. To expedite media processing, we also use the Colorfront Transkoder, which is helpful when working with data-rich content. We use Diskover Media Edition in different ways depending on the client. As clients are onboarded, they work with Ryan and Seth from our team. Ryan designs a custom workflow on the operational end for us while Seth collaborates with the client, running through file specs and breaking them down with Ryan.

Mike: Most client files come in electronically and are fielded to a landing zone. The appropriate team members are then notified. Depending on where the files land, an automated backup might be generated in the cloud for disaster recovery. Our producers then use Diskover Media Edition to ensure files have arrived, signal to the team that it’s time to move the files to their next locations and notify the appropriate team members of their arrival. We knew Diskover Media Edition would be a great fit from the start, with the built-in automation tools and ties to MediaPulse. It’s more than lived up to our expectations. The software has made file scanning so quick, and it automatically indexes all our file systems every 30 minutes.

How has implementing these workflows changed your day-to-day?
Rosanna: Our customer service reps would be working blindly without them; they’d have to constantly ask the client where the files live. We have all the file information we need in a centralized, easy-to-access location — from the audio configuration to its contents. Before, finding the file, sending it to an operator, pulling it up and finding the required info took a lot of back and forth; now operations are streamlined.

Mike: Prior to these workflows, I handled data management and was constantly getting notes from the team to confirm receipt of files or ask where they were. Those notes largely disappeared when we installed Diskover because of the software’s extensive Elasticsearch capabilities. It also allows us to pull up media information like run time, bit depth and frame rate, so teams can easily review metadata associated with the files they’re working on.

What role does Elasticsearch play in the workflow?
Mike: On the backend of our Diskover Media Edition index, our file system is scanned, and the contents are put into Elasticsearch. It’s an extensive, powerful search engine that lets you search for just about anything, from tags to file names or parts of file names. You can even search for a client, and the software will present everything from that studio or client, and you can narrow those results. The software is also scanning our cloud for disaster recovery, so our client service reps can search the cloud index, easily find an asset they need, and send in a work order to pull it back down from the cloud.
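The kind of progressive narrowing Mike describes maps naturally onto Elasticsearch’s bool query DSL. As a rough sketch (the field names “client”, “extension” and “tags” are illustrative guesses, not Diskover Media Edition’s real index schema):

```python
# Assemble an Elasticsearch-style bool query that matches a client and
# optionally narrows by file extension and tag. The field names
# ("client", "extension", "tags") are illustrative placeholders, not
# Diskover Media Edition's actual index schema.
def build_search(client, extension=None, tag=None):
    must = [{"match": {"client": client}}]   # full-text match on the client name
    filters = []                             # exact-match narrowing clauses
    if extension:
        filters.append({"term": {"extension": extension}})
    if tag:
        filters.append({"term": {"tags": tag}})
    return {"query": {"bool": {"must": must, "filter": filters}}}
```

The returned dict is the request body you would hand to Elasticsearch’s search endpoint, for example via the official Python client’s `search()` method.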

How is AJA Diskover Media Edition different from other tools you’ve used? 
Rosanna: Customer service is huge for us. AJA and Diskover are continuously improving the technology and are transparent about the product roadmap, so we know what to expect. We’ve found it so easy to connect with the Diskover team, share input that will benefit us and all users, and see progress. The conversation is two-way; they’re willing to explore new requests with us, which is impressive and rare today.

What industry shifts in recent years have impacted the way your team works?
Rosanna: Clients are entrusting their vendors to be their eyes and ears; there’s less in-person oversight than before. Pre-pandemic, clients would often pop in for a progress check, or we’d fly talent to the client’s location. Much of that work — including editorial sessions and spot checks — is now done remotely. Because of all this new technology and remote working, from an operational standpoint, our tools must be efficient and secure. Any technology we adopt must be safe for the work that we do and our infrastructure. We also do yearly audits, participate in neighborhood video calls on the subject, and host a lot of security awareness training, so the team is up to speed on the latest security matters.

What trends are you following? 
Rosanna: We’ve seen many clients evolve and buy each other out in the last few years and expect the trend to continue, so we’re keeping a close eye on mergers and acquisitions of studios and content platforms in the M&E space. It’s important for us to have our finger on the pulse of this growth, as it ultimately impacts our business.

Content sponsored by AJA. 

Dark Noise Completed with Fairlight Desktop Console and DaVinci Resolve Studio

The Australian thriller Dark Noise was completed using DaVinci Resolve Studio editing, color grading, visual effects and audio post production software, and the feature was shot using Blackmagic Design digital film cameras. Audio post was also completed using the Fairlight Desktop Console audio control surface.

Dark Noise tells the story of a young woman who is sent audio recordings from her biologist father after he goes missing at a remote national park. She uses his audio clues to try and find him, but stumbles into a dangerous organized crime operation. Starring Imogen Sage, Callan Colley and Steve Le Marquand, and directed and written by Clara Chong, the film was shot in the wilds of Australia during the early days of the global pandemic.

Ben Allan was the cinematographer and colorist, while Chong also served as film and audio editor for the film. Allan is the youngest person to ever be awarded the ACS letters by the Australian Cinematographers Society, the only accredited cinematographer in the world to be awarded the CSI letters as a colorist by the Colorist Society International and has more than 1,500 film, television and commercial credits to his name. Chong has nearly thirty years’ experience in the industry after starting in Tokyo and training in New York.

For Dark Noise, the team used URSA Mini Pro 4.6K G2 and Pocket Cinema Camera 6K digital film cameras and DaVinci Resolve Studio for color correction, editing and audio work.

“Blackmagic products gave us the ability to see our vision through cohesively, from production to post. I don’t think we could have made this film without Blackmagic Design,” Allan said.

A large part of the film was shot in dense forests, in bad weather and at night, hours outside of Sydney, Australia. Due to Covid restrictions, Allan often had to shoot with a minimal crew or with only himself and Chong. The URSA Mini Pro 4.6K G2 and Pocket Cinema Camera 6K’s small form factors and ability to capture cinematic images in low light helped Allan get the shots he needed.

“The size of both cameras and the ability for one person to handle each one was a lifesaver. We were shooting an hour and a half away from any support, so the cameras’ reliability was another thing that I counted on,” Allan continued. “Also, I have a unique perspective being a colorist, as well as a DP and camera person. I know exactly what data I need in each shot, especially at night using minimal lighting where I need to know that the cameras will get me as much data as possible. Shooting in Blackmagic RAW got me what I needed with a huge amount of picture information but while using as little storage as possible.”

Throughout filming, Allan had post production, in particular audio, in mind. The film’s protagonist was an audio professional using sound to investigate the disappearance of her father. The creative use of sound, in combination with color correction, is almost a character of its own in the film and needed to drive the story and viewers’ emotions. To do this, Allan expanded his post production studio to include the Fairlight Desktop Console and a DaVinci Resolve Mini Panel. “The footprint of both the Mini Panel for color and the Fairlight Desktop Console for sound really made it possible to have all of these controls within easy reach in a single studio for the first time ever.”

Dark Noise was the first film where Allan used Fairlight audio editing tools. This included the Fairlight Desktop Console, a compact surface featuring a bank of 12 touch-sensitive faders and pan knobs, built-in LCDs above each channel strip, channel control buttons, automation transport and navigation controls, and more. “Because Fairlight is designed from the ground up for film mixing, all the tools and processes you need are built right in,” Allan said. “Additionally, the Fairlight Desktop Console gives you the precise, tactile control you need for long form productions.”

With his own studio set up, Allan was able to work through the pandemic lockdowns to finish the film in time for a May 2022 release.

“Color and sound make an impact on the audience at the same level. They give an emotional and sensory reaction. In Dark Noise, this was especially true in the shadows. Manipulating sound, shadows and locations of sound in Resolve is amazing,” he said. “There was a moment when I was working in Fairlight and went ‘ah,’ we should do this with the color in the scene. I made the audio edit on the console and then with a single click jumped to the color, reached across to the Mini Panel and captured the matching visual inspiration I had for the scene right away. This is something that has never been possible before at such a high-quality level.”

“I loved working in Resolve on Dark Noise because it was so much easier to put color and sound in tune with each other. The combined effect was amazing. Resolve has been able to fundamentally change how filmmakers create. With everything coming straight out of Resolve, the whole process is just amazingly efficient. Mixing and matching various resolutions and audio formats for different delivery requirements is so fast and effective. We are very glad we made the switch,” he said.

Dark Noise started screening in Australian theaters in May and will be available on streaming services worldwide.

Content sponsored by Blackmagic Design. 

Some of Summer 2022’s Biggest Films Created with Blackmagic Tools

More than 30 of the 2022 summer season’s worldwide film releases, such as Thor: Love and Thunder, Jurassic World Dominion and Bullet Train, were created with the Blackmagic Pocket Cinema Camera and URSA Mini Pro 4.6K G2 digital film cameras, DaVinci Resolve Studio editing, grading, visual effects and audio post production software, and more.

Highly anticipated films such as Elvis relied on Blackmagic Design gear throughout both production and post, with the film using Pocket Cinema Camera 6K for pick up shots and DaVinci Resolve Studio for on set grading, online editing and final color grading.

Summer blockbusters and breakout independent projects alike continued to use DaVinci Resolve Studio for their post production needs, including films such as Top Gun: Maverick, Nope, The Black Phone  and Crimes of the Future.

Summer films that used Blackmagic Design cameras include:

  • “Poser” DP Logan Floyd used URSA Mini Pro 4.6K G2s
  • “Three Headed Beast” Director and DP Fernando Andres used Pocket Cinema Camera 4K
  • “Watcher” DP Benjamin Kirk Nielsen used Pocket Cinema Camera 6K to capture VFX and news footage
  • “The Wrong Place” DP Peter Holland and Second Unit DP Laura Nolan used Pocket Cinema Camera 6K for extensive second unit photography

Summer films that used Blackmagic Design products for editing and VFX:

  • “Elvis” Editors Jonathan Redmond and Matt Villa used DaVinci Resolve Studio for the conform and grade, as well as UltraStudio capture and playback devices within their offline editing pipeline
  • “Endangered” VFX Supervisor Alex Noble of Wild Union Post used DaVinci Resolve Studio within his VFX pipeline
  • “Neptune Frost” Assistant Editor Skylar Zhang used DaVinci Resolve to create dailies as part of the editing pipeline
  • “Resurrection” VFX Supervisor Alex Noble of Wild Union Post used DaVinci Resolve Studio within his VFX pipeline
  • “Samaritan” VFX Supervisor David Lebensfeld of Ingenuity Studios used DaVinci Resolve Studio within the VFX pipeline
  • “Where the Crawdads Sing” Editor Alan Bell used UltraStudio HD Mini

Summer films that used DaVinci Resolve Studio for post production include:

  • “Allswell” graded by Nicholas Lareau of Nice Shoes
  • “Barbarian” graded by Sam Daley of Light Iron
  • “Beast” graded by Stefan Sonnenfeld of Company 3
  • “Billion Dollar Babies: The True Story of the Cabbage Patch Kids” co-produced by NBCUniversal Syndication Studios and Believe Entertainment Group and graded by Sal Malfitano of Nice Shoes
  • “The Black Phone” graded by Supervising Colorist Nat Jencks of PostWorks New York and Additional Colorist Jason Fabbro of Picture Shop
  • “The Bob’s Burgers Movie” graded by Philip Beckner of FotoKem
  • “Bullet Train” graded by Dave Hussey of Company 3
  • “Cha Cha Real Smooth” graded by Nat Jencks of PostWorks New York
  • “Crimes of the Future” graded by Bill Ferwarda of Company 3
  • “Don’t Make Me Go” graded by Philip Beckner of FotoKem
  • “Elvis” graded by Colorists Kim Rene Bjørge and Kali Bateman and Supervising Colorist Brett Manson
  • “Emily the Criminal” graded by Walter Volpatto of Company 3
  • “The Janes” graded by Ken Sirulnick of Goldcrest
  • “Jurassic World Dominion” graded by Stefan Sonnenfeld of Company 3
  • “The Man from Toronto” graded by Stefan Sonnenfeld of Company 3
  • “Neptune Frost” graded by Catherine Pantazopoulos, and Nicolas Perret on behalf of Lobster Films
  • “Nope” graded by Greg Fisher of Company 3
  • “Poser” graded by Ori Segev using DaVinci Resolve Micro Panel
  • “Resurrection” graded by Nat Jencks of PostWorks New York
  • “Sharp Stick” graded by Nat Jencks of PostWorks New York
  • “Thor: Love and Thunder” graded by Jill Bogdanowicz of Company 3
  • “Top Gun: Maverick” graded by Stefan Sonnenfeld of Company 3

Summer films that used Blackmagic Design products for production include:

  • “Beast” DIT Lentsoe Mamatela used DaVinci Resolve, UltraStudio 4K Mini and DeckLink Mini Monitor playback card
  • “The Black Phone” DIT Jason Johnson used DaVinci Resolve Studio, Videohub routers and UltraStudios
  • “Emily the Criminal” DP Jeff Bierman used DaVinci Resolve Studio on set for grading select shots
  • “Flux Gourmet” DIT Richard Strong used DaVinci Resolve Studio for on set and dailies work
  • “The Forgiven” DIT Daniel Alexander of Digital Orchard used DaVinci Resolve Studio for on set and dailies work
  • “Samaritan” DIT Stuart Huggins used DaVinci Resolve, ATEM Mini Pro ISO live production switcher, MultiView 16 monitor, Smart Videohub CleanSwitch 12×12, DeckLinks and UltraStudios

Content sponsored by Blackmagic Design. 

Bringing the documentary Long Live Benjamin to life

By Dayna McCallum

The New York Times Op-Docs recently debuted Long Live Benjamin, a six-part episodic documentary directed by Jimm Lasser (Wieden & Kennedy) and Biff Butler (Rock Paper Scissors), and produced by Rock Paper Scissors Entertainment.

The film focuses on acclaimed portrait artist Allen Hirsch, who, while visiting his wife’s homeland of Venezuela, unexpectedly falls in love. The object of his affection — a deathly ill, orphaned newborn Capuchin monkey named Benjamin. After nursing Benjamin back to health and sneaking him into New York City, Hirsch finds his life, and his sense of self, forever changed by his adopted simian son.

We reached out to Lasser and Butler to learn more about this compelling project, the challenges they faced, and the unique story of how Long Live Benjamin came to life.

Long Live Benjamin

Benjamin sculpture, Long Live Benjamin

How did this project get started?
Lasser: I was living in Portland at the time. While in New York I went to visit Allen, who is my first cousin. I knew Benjamin when he was alive, and came by to pay my respects. When I entered Allen’s studio space, I saw his sculpture of Benjamin and the frozen corpse that was serving as his muse. Seeing this scene, I felt incredibly compelled to document what my cousin was going through. I had never made a film or thought of doing so, but I found myself renting a camera and staying the weekend to begin filming and asking Allen to share his story.

Butler: Jimm had shown up for a commercial edit bearing a bag of Mini DV tapes. We offered to transfer his material to a hard drive, and I guess the initial copy was never deleted from my own drive. Upon initial preview of the material, I have to say it all felt quirky and odd enough to be humorous; but when I took the liberty of watching the material at length, I witnessed an artist wrestling with his grief. I found this profound switch in takeaway so compelling that I wanted to see where a project like this might lead.

Can you describe your collaboration on the film?
Lasser: It began as a director/editor relationship, but it evolved. Because of my access to the Hirsch family, I shot the footage and led the questioning with Allen. Biff began organizing and editing the footage. But as we began to develop the tone and feel of the storytelling, it became clear that he was as much a “director” of the story as I was.

Butler: In terms of advertising, Jimm is one of the smartest and most discerning creatives I’ve had the pleasure of working with. I often found myself holding opinions that differed from his, but I always learned something new and felt we came to stronger creative decisions because of that conflict. When the story of Allen and his monkey began unfolding in front of me, I was just as keen to foster this creative relationship as I was to build a movie.

Did the film change your working relationship?
Butler: As a commercial editor, it’s my job to carry a creative team’s hard work to the end of their laborious process — they conceive the idea, sell it through, get it made and trust me to glue the pieces together. I am of service to this, and it’s a privilege. When the footage I’d found on my hard drive started to take shape, and Jimm’s cousin began unloading his archive of paintings, photographs and home video onto us, it became a more involved endeavor. Years passed, as we’d get busy and leave things to gather dust for months here and there, and after a while it felt like this film was something that reflected both of our creative fingerprints.

Long Live Benjamin

Jimm Lasser, Long Live Benjamin

How did your professional experiences help or influence the project?
Lasser: Collaboration is central to the process of creating advertising. Being open to others is central to making great advertising. This process was a lot like film school. Neither of us had ever done it before, but we figured it out and found a way to work together.

Butler: Jimm and I enjoyed individual professional success during the years we spent on the project, and in hindsight I think this helped to reinforce the trust that was necessary in such a partnership.

What was the biggest technical challenge you faced?
Butler: The biggest challenge was just trying to get our schedules to line up. For a number of years we lived on opposite sides of the country, although there were three years where we both happened to live in New York at the same time. We found that the biggest creative strides happened when we had the luxury of sitting together. Most of the time, though, I would work on an edit, send it to Jimm, and wait for his feedback. Then I’d be busy on something else when he’d send long, detailed notes (and often new interviews to supplement the notes), and I would need to wait a while until I had the time to dig back in.

Technically speaking, the biggest issue might just be my use of Final Cut Pro 7. The film is made as a scrapbook from multiple sources, and quite simply Final Cut Pro doesn’t care much for this! Because we never really “set out” to “make a movie,” I had let the project grow somewhat unwieldy before realizing it needed to be organized as such.

Long Live Benjamin

Biff Butler, Long Live Benjamin

Can you detail your editorial workflow? What challenges did the varying media sources pose?
Butler: As I noted before, we didn’t set out to make a movie. I had about 10 tapes from Jimm and cut a short video just because I figured it’s not every day you get to edit someone’s monkey funeral. Cat videos this ain’t. Once Allen saw this, he would sporadically mail us photographs, newspaper clippings, VHS home videos, iPhone clips, anything and everything. Jimm and I were really just patching on to our initial short piece, until one day we realized we should start from scratch and make a movie.

As my preferred editing software is Final Cut Pro 7 (I’m old school, I guess), we stuck with it and just had to make sure the media was managed in a way that had all sources compressed to a common setting. It wasn’t really an issue, but it needed some unraveling once we went to online conform. Due to our schedules, the process occurred in spurts. We’d make strides for a couple weeks, then leave it be for a month or so at a time. There was never a time when the project wasn’t in my backpack, however, and it proved to be my companion for over five years. If there was a day off, I would keep my blades sharp by cracking open the monkey movie and chipping away.

You shot the project as a continuous feature, and it is being shown now in episodic form. How does it feel to watch it as an episodic series?
Lasser: It works both ways, which I am very proud of. The longer form piece really lets you sink into Allen’s world. By the end of it, you feel Allen’s POV more deeply. I think not interrupting Alison Ables’ music allows the narrative to have a greater emotional connective tissue. I would bet there are more tears at the end of the longer format.

The episode form sharpened the narrative and made Allen’s story more digestible. I think that form makes it more open to a greater audience. Coming from advertising, I am used to respecting people’s attention spans, and telling stories in accessible forms.

How would you compare the documentary process to your commercial work? What surprised you?
Lasser: The executions of both are “storytelling,” but advertising has another layer of “marketing problem solving” that affects creative decisions. I was surprised by how much Allen became a “client” in the process, since he was opening himself up so much. I had to keep his trust and assure him I was giving his story the dignity it deserved. It would have been easy to make his story into a joke.

Artist Allen Hirsch

Butler: It was my intention to never meet Allen until the movie was done, because I cherished that distance I had from him. In comparison to making a commercial, the key word here would be “truth.” The film is not selling anything. It’s not an advertisement for Allen, or monkeys, or art or New York. We certainly allowed our style to be influenced by Allen’s way of speaking, to sink deep into his mindset and point of view. Admittedly, I am very often bored by documentary features; there tends to be a good 20 minutes that is only there so it can be called “feature length” but totally disregards the attention span of the audience. On the flip side, there is an enjoyable challenge in commercial making where you are tasked to take the audience on a journey in only 60 seconds, and sometimes 30 or 15. I was surprised by how much I enjoyed being in control of what our audience felt and how they felt it.

What do you hope people will take away from the film?
Lasser: To me this is a portrait of an artist. His relationship with Benjamin is really an ingredient to his own artistic process. Too often we focus on the end product of an artist, but I was fascinated by the headspace that leads a creative person to create.

Butler: What I found most relatable in Allen’s journey was how much life seemed to happen “to” him. He did not set out to be the eccentric man with a monkey on his shoulders; it was through a deep connection with an animal that he found comfort and purpose. I hope people sympathize with Allen in this way.


To watch Long Live Benjamin, click here.

Atlanta VFX Uses Fusion Studio on Hulu Series Only Murders in the Building

Visual effects house Atlanta VFX used Blackmagic Design’s Fusion Studio VFX and motion graphics software on the hit television series Only Murders in the Building. Leveraging Fusion Studio’s compositing tools to blend camera takes, replace greenscreens and handle shot replacements, Atlanta VFX left no clues when creating invisible VFX for the series’ second season.

Only Murders in the Building is a Hulu Original series, produced by 20th Television, a part of Disney Television Studios, that follows true crime podcasters who find themselves and their building at the center of a crime. In its second season, the show stars Steve Martin, Martin Short and Selena Gomez as neighbors in the swanky New York City apartment building the Arconia, who now find themselves accused of a murder.

The team, led by Atlanta VFX Owner and VFX Supervisor Jason Maynard and Compositing Supervisor Jeremy Nelson, completed extensive greenscreen work throughout the season. “Between Fusion Studio’s Delta keyer, Ultra keyer and Primatte keyer, there’s always a keyer that will get the job done and make it seamlessly integrated,” said Maynard. “For example, in one episode we had more than 60 greenscreen shots in just the apartment building. For that episode, some of the actors were wearing shirts with green hues, so we had to use multiple keyers to properly isolate the colors.

“With the actors framed against a greenscreen that needed to be replaced with interior window shots, we used the Delta keyer as the main key, and then the Primatte keyer and Ultra keyer to complete the shot. With all the window reflections and smudges, it was more complex than just setting up a key and rendering the scene. Since one of the actors’ shirts included multiple shades of green, the Primatte keyer was crucial as its additive nature in selecting what it’s keying made it a perfect solution to finish the job.”
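Fusion’s Delta, Ultra and Primatte keyers are proprietary tools, but the core idea behind any green-screen key, deriving a matte from how much green dominates each pixel, can be sketched in a few lines of NumPy. This is a simplified illustration of the general technique, not Fusion’s actual algorithm; the function names and the threshold/softness parameters are invented for the example:

```python
import numpy as np

def green_screen_key(rgb, threshold=0.0, softness=0.2):
    """Derive a matte from green-channel dominance.

    rgb: float array of shape (H, W, 3), values in [0, 1].
    Returns alpha in [0, 1]: 1 = keep foreground, 0 = keyed-out green.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    # How much green exceeds the stronger of the other two channels.
    green_excess = g - np.maximum(r, b)
    # Map that excess to transparency: strongly green pixels key out,
    # with a soft ramp so edges (hair, motion blur) blend smoothly.
    alpha = 1.0 - np.clip((green_excess - threshold) / softness, 0.0, 1.0)
    return alpha

def composite(fg, bg, alpha):
    """Blend foreground over background using the matte."""
    return fg * alpha[..., None] + bg * (1.0 - alpha[..., None])
```

A pure-green pixel produces alpha 0 (fully keyed out) while non-green pixels stay opaque, which also hints at the problem Maynard describes: a shirt with green hues sits partway up the ramp, so a single set of threshold/softness values rarely isolates every color cleanly, hence the combination of multiple keyers.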

“For another episode, we had a tricky shot where Mabel (Gomez) has a flashback to her childhood, and they wanted to blend two camera pan takes into one take,” added Nelson. “Using Fusion Studio, we did 3D stabilization for the A plate and then projected the image onto the B plate for a seamless transition between the two takes.”

Blackmagic Design’s Fusion Studio 3D tracker was used to remove unwanted boom mics, crew members and camera reflections in windows. Additionally, Maynard noted that its node-based workflow made complex scenes easier to manage, so the team could move more quickly through the various VFX sequences.

“It’s often the small details that matter when putting together a show like this. For example, we used Fusion Studio’s 3D displacement tools to swap out a letter on the entrance to the apartment building gate,” Nelson explained. “It needed a quick 3D track for the camera move, and a clean plate using the paint node. It was a fast replacement with the displacement tool, which is great for easy 3D modeling with alpha channels or roto shapes.”

Content sponsored by Blackmagic Design.