By Oliver Peters
Whether you’re a guitar nerd or just into rock ‘n’ roll history, learning what makes our music heroes tick is always entertaining. Music journalist and TV presenter Kylie Olsson started a YouTube channel during the pandemic lockdown, teaching herself how to play guitar and reaching out to famous guitarists she knew. This became the concept for a TV series called Life in Six Strings With Kylie Olsson, which airs on AXS TV. The show is in the style of Comedians in Cars Getting Coffee, with Olsson exploring the passions behind these guitarists and picking up a few guitar pointers along the way.

James Tonkin
I spoke with James Tonkin and Leigh Brooks about the post workflow for these episodes. Tonkin is the founder of Hangman in London, which handled post on the eight-part series. He was also the director of photography for the first two episodes and has handled the online edit and color grading for all of them. Leigh Brooks of Firebelly Films was the offline (i.e. creative) editor on the series, starting with episode three. Together they have built an offline-to-online post workflow that lives entirely in DaVinci Resolve.
Let’s find out more…
James, how did you get started on this project?
James Tonkin: Kylie approached us about shooting a pilot for the series. We filmed that in Nashville with Joe Bonamassa and it formed the creative style for the show. We didn’t want to just fixate on the technical side of the guitar and tone of these players, but their geographical base — we wanted to explore the city a little bit. We had to shoot it very documentary style but wrap it up into a 20-25 minute episode. No pre-lighting, just a tiny team following her around, interacting with these people.
Then we did a second one with Nuno Bettencourt and that solidified the look of the show during those two initial episodes. She eventually got distribution through AXS TV in the States for the eight-part series. I shot the first two episodes, and the rest were shot by a US-based crew, which followed the production workflow that we had set up. Not only the look and retaining the documentary format, but also maintaining the highest production value we could give it in the time and budget that we’re working with.
We chose to shoot anamorphic with a cinematic aspect ratio, because it’s slightly different from the usual off-the-cuff reality TV look. We also recorded in a raw codec whenever possible, because we (Hangman) were doing all of the post on it, with me specifically being the colorist.
I always advocate for a raw workflow, especially something in a documentary style. People are walking from daylight into somebody’s house and then down to a basement, basically following them around. And Kylie wants to keep interacting with whomever she’s interviewing without needing to wait for cameras to stop and rebalance. She wants to keep it flowing. So when it comes to posting that, you’ve got a much more robust digital negative to work with [if it was shot as camera raw].

Leigh Brooks
What was the workflow for the shows and were there any challenges?
Leigh Brooks: The series was shot mainly with Red and Canon cameras as 6K anamorphic files. Usually, the drive came to me, and I would transcode the rushes or create proxy files and then send the drive to James. The program is quite straightforward and narrative-based, without much scope for doing crazy things with it.
It’s about the nuts and bolts of guitars and the players that use them. But each episode definitely had its own little flavor and style. Once we locked the show, James took the sequence, got hold of the rushes and then got to work on the grade and the sound.
What Kylie’s pulled off on her own is no small feat. She’s a great producer, knows her stuff and really does the research. She’s so passionate about the music and the people that she’s interviewing and that really comes across. The Steve Vai episode was awesome. He’s very holistic. These people dictate the narrative and tell you where the edit is going to go. Mick Mars was also really good fun. That was the trickiest show to do because the A- and B-side camera set-up wasn’t quite working for us. We had to really get clever in the edit.
Resolve is known as a finishing and color grading tool, but you used it to edit the offline as well. Why?
Tonkin: I’ve been a longtime advocate of working inside of Resolve, not just from a grading perspective, but editorial. As soon as the Edit page started to offer me the feature set that we needed, it became a no-brainer that we should do all of our offline in Resolve whenever possible.
On a show like this, I’ve got about six hours of online time and I want to spend the majority being as creative as I can. So, focusing on color correction, looking at anything I need to stabilize, resize, any tracking, any kind of corrective work — rather than spending two or three hours conforming from one timeline into another.
The offline on this series was done in Resolve, except for the first episode, which was cut in Apple Final Cut Pro X. I’m trying to leave editors open to the choice of the application they like to use. My gentlemen’s agreement with Matt [Cronin], who cut the first pilot, was that he could cut it in whatever he liked, as long as he gave me back a .drp (DaVinci Resolve project) file. He loves Final Cut Pro X because that’s what he’s quickest at. But he also knows the pain that conforms can be. So he handled that on his side and just gave me back a .drp file. So it was quick and easy.
From Episode 3 onwards, I was delighted to learn that Leigh also uses Resolve as his primary editing tool. Everything just transfers and translates really quickly. Knowing that we had six more episodes to work through together, I suggested things that would help us a lot, both for picture on my side and for audio, which was also being done here in our studio. We’re generating the 5.1 mix.
Brooks: I come from an Avid background. I was an engineer initially before ever starting to edit. When I started editing, I moved from Avid to Final Cut Pro 7 and then back to Avid, after which I made the push to Resolve. It’s a joy to edit on and does so many things really well. It’s become my absolute workhorse. Avid used to be the obvious choice for a multi-user operation, but that doesn’t really matter now; Resolve handles it so well with its cloud project management, and I own both of the editor keyboards.
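For teams that want to automate the kind of .drp hand-off Tonkin describes, Resolve’s built-in scripting API can export and import project files programmatically. The snippet below is only a minimal sketch of that idea, not the show’s actual pipeline; it assumes Resolve 17 or later is running with scripting enabled, and the project name and file paths are hypothetical.

```python
# Minimal sketch: exporting and importing a .drp with Resolve's Python scripting API.
# Assumes the script runs from Resolve's console (or with the scripting environment
# variables set). Project name and paths below are hypothetical examples.
import DaVinciResolveScript as dvr

resolve = dvr.scriptapp("Resolve")
pm = resolve.GetProjectManager()

# Offline editor's side: export the locked cut as a .drp for the finishing house.
if pm.ExportProject("LifeInSixStrings_Ep03_Offline", "/mnt/transfer/Ep03_offline.drp"):
    print("Exported offline .drp")

# Finishing side: import the .drp and open it, so no conform pass is needed.
if pm.ImportProject("/mnt/transfer/Ep03_offline.drp"):
    project = pm.LoadProject("LifeInSixStrings_Ep03_Offline")
    print("Loaded project:", project.GetName())
```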
You mentioned cloud. Was any of that a factor in the post on Life in Six Strings?
Tonkin: Initially, when Leigh was reversioning the first two episodes for AXS TV, we were using his Blackmagic Cloud account. But for the rest of the episodes, we were just exchanging files. Rushes either came to me or would go straight to Leigh. He makes his offline cut and then the files come to me for finishing, so it was a linear progression.
However, I worked on a pilot for another project where every version was effectively a finished online version. And so we used Blackmagic Cloud for that all the way through. The editor worked offline with proxies in Resolve. We worked from the same cloud project and every time he had finished, I would log in and switch the files from proxy to camera originals with a single click. That was literally all we had to do in terms of an offline-to-online workflow.
Brooks: I’m working on delivering a feature-length documentary for [the band] Nickelback that’s coming out in cinemas later in March. I directed it, cut it in Avid, and then finished in Resolve. My grader is in Portsmouth, and I can sit here and watch that grade being done live, thanks to the cloud management. It definitely has a few snags, but they’re on it. I can phone up Blackmagic and get a voice — an actual person to talk to that really wants to fix my problem.
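The single-click switch from proxies to camera originals that Tonkin mentions is a playback toggle in the Resolve UI, but a comparable relink can also be scripted. The sketch below is a rough illustration under stated assumptions, not either production’s setup: it presumes externally generated proxies that share the camera originals’ file names and sit in a hypothetical parallel folder, and it uses the per-clip LinkProxyMedia/UnlinkProxyMedia calls available in the scripting API (Resolve 17 and later).

```python
# Rough illustration: linking and unlinking proxy media per clip with Resolve's
# Python scripting API. Assumes Resolve 17+ with scripting enabled. The proxy
# folder layout and naming below are hypothetical.
import os
import DaVinciResolveScript as dvr

PROXY_ROOT = "/Volumes/Proxies/Ep03"  # hypothetical proxy location

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
root_folder = project.GetMediaPool().GetRootFolder()

def link_proxies(folder):
    """Walk the media pool and attach a matching proxy to each clip, if one exists."""
    for clip in folder.GetClipList():
        original = clip.GetClipProperty("File Path")
        if not original:
            continue  # timelines, compound clips, etc. have no file path
        candidate = os.path.join(PROXY_ROOT, os.path.basename(original))
        if os.path.exists(candidate):
            clip.LinkProxyMedia(candidate)
    for sub in folder.GetSubFolderList():
        link_proxies(sub)

def unlink_proxies(folder):
    """Drop the proxy links so playback and delivery use the camera originals."""
    for clip in folder.GetClipList():
        clip.UnlinkProxyMedia()
    for sub in folder.GetSubFolderList():
        unlink_proxies(sub)

# Offline: link_proxies(root_folder)
# Online/finishing: unlink_proxies(root_folder)
```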
You’ve both worked with a variety of other nonlinear editing applications. How do you see the industry changing?
Tonkin: Being in post for a couple of decades now and using Final Cut Studio, Final Cut Pro X and a bit of Premiere Pro throughout the years, I find that the transition from offline to online starts to blur more and more these days. Clients watching their first pass want to get a good sense of what it should look like with a lot of finishing elements in place already. So you’re effectively doing these finishing things right at the beginning.
It’s really advantageous when you’re doing both in Resolve. When you offline in a different NLE, not all of that data is transferred or correctly converted between applications. By both of us working in Resolve, even simple things you wouldn’t think of, like timeline markers, come through. Maybe he’s had some clips that need extra work. He can leave a marker for me and that will translate through. You can fudge your way through one episode using different systems, but if you’re going to do at least six or eight of them — and we’re hopefully looking at a season two this year — then you want to really establish your workflow upfront just to make things more straightforward.
Brooks: Editing has changed so much over the years. When I became an engineer, it was linear and nonlinear, right? I was working on the James Bond film, The World Is Not Enough, around 1998. One side of the room was conventional: Steenbecks, bins, numbering machines. The other side was Avid Media Composer. We were viewing 2K rushes on film, because that’s what you could see on the screen, while on Avid it was AVR-77. It’s really interesting to see it come full circle. Now with Resolve, you’re seeing what you need to see rather than something that’s subpar.
I’d say there are a lot of editors who are “Resolve curious.” If you’re in Premiere Pro you’re not moving [to a different system], because you’re too tied into the way Adobe’s apps work. If you know Premiere, you know After Effects and are not going to move to Resolve and relearn Fusion. I think more people would move from Avid to Resolve, because simple things in Resolve are very complicated in Avid — the effects tab, the 3D warp and so on.
Editors often have quite strange egos. I find the incessant arguing between platforms is just insane. It’s this playground kind of argument about bloody software! [laugh] After all, these tools are all there to tell stories.
Oliver Peters is an award-winning editor/colorist working in commercials, corporate communications, television shows and films.






