By Mike McCarthy
Adobe Max was held online this year, just like most other annual conferences have been in 2020. It was also offered for free, a vast departure from previous years, but Adobe has tried to preserve as much of the experience as possible in the new online form.
The schedule was similar to past years, kicking off with a big keynote presentation of what Adobe’s product teams have been developing over the past year. It was hosted by Conan O’Brien. The keynote was followed by segments that were less technical and more creative, profiling various artists and their experiences and inspirations. These were grouped by fields, from photography to graphic design to video production, and played as a continuous stream after the keynote.
I made sure to tune in for Jason Levine’s video-focused hour with four video artists. The events I normally attend are more technically focused, so my articles usually include details about how new technological developments are helping artists realize their visions better than ever before, but Adobe Max is a different kind of show.
I saw a session about Kelli Anderson, who took folding paper to a whole new level in some very impressive and functional ways, and another from director Taika Waititi, who shared his irreverent take on art and the creative process. You never know what to expect from those segments, but they do help you look at things from new perspectives.
Max Sneaks
There was also the traditional Max Sneaks presentation, in which various Adobe developers showed off functionality they have been working on that has not yet made its way into shipping products. This year we saw music-based video retiming with “On the Beat,” shared interactive AR experiences with “AR Together,” brush-based font creation with Typographic Brushes, and physics-based intersection prevention for 3D objects with “Physics Whiz,” to name a few.
Updates
Along with the Max event, Adobe showed updates in nearly every Creative Cloud program. Premiere has been getting many significant updates throughout the year, including the new Productions project organization framework and HDR support, so the only new thing we saw at Max was a set of new captioning tools that will eventually be driven by an AI-based speech-to-text engine. (Check out Brady Betzel’s piece on updates.)
After Effects is getting RotoBrush 2 for AI-enhanced rotoscoping, which is a significant improvement over previous methods. After Effects is also getting a host of new enhancements to make it easier to composite in 3D space.
Character Animator is getting AI-enhanced lip sync and facial animations based on audio, as well as improvements to timeline management and limb IK (or inverse kinematics). Photoshop probably has the most new features, with a number of AI-powered improvements, including sky replacement and neural filters for serious enhancements to photographs. These photo-modifying tools are counterbalanced by the Content Authenticity Initiative, which tracks how photos have been edited.
For workflows on the move, Illustrator is coming to iPad, with the ability to livestream from within the application, and Fresco is coming to iPhone. This surprised me because without a stylus, that is a small screen to do precision work on with a fingertip. Adobe also released support for Fresco on other PC devices, including my ZBook x2, back in August. That gives my ZBook x2 everything iPad users can do in Creative Cloud and so much more. But if you are an iPad user, Adobe is constantly adding more functionality to your device.
One of the other features that Adobe had previously announced — but is still in the process of weaving through its line of products — is application live-streaming. Adobe introduced this capability a while back through the iPad apps, and its timing couldn’t have been better. With the pandemic and lockdowns, there were many artists looking for new ways to express themselves and connect with others, and lots of people had more time than normal on their hands. So there is now a community of artists on Behance.net who livestream their work, primarily in Photoshop, Illustrator and Fresco.
When this livestreaming initiative was first announced last year, I didn’t understand it to be targeting those applications, and I was intrigued about the possibilities and the potential copyright and NDA implications of using it on video workflows. So I attended a session dedicated to in-application livestreaming during Max and learned a bit more about it. Currently, livestreaming is only integrated into the iPad apps, but users can stream the desktop apps to Behance through the usual tools like OBS. Desktop streaming is not allowed by default, and you have to request access through Behance.
When I attempted to sign up to stream some Premiere Pro workflow ideas I have, it literally asked me if I was going to stream Photoshop or Fresco, so they are clearly targeting those apps. But Adobe has developed a really cool integration with Tool Timeline, a system of tracking exactly what tools and settings streamers are using when they are working. It tracks the document history metadata in those applications, and artists can navigate the recordings through that log. There’s a plugin for Photoshop on desktop that allows those settings and functions to be tracked the same way that the integrated streaming does in the iPad version.
It would be interesting to see if they expand that to other apps in the future. It would also be interesting to see an integration with GeForce Experience, which can already stream to services like Twitch and YouTube; a “GeForce Creators Experience” version of that software could give artists Behance support. That would allow users to stream their entire workflow, since the primary strength of the Creative Cloud apps is how seamlessly they all work together.
Streaming a single application can never communicate an entire media workflow. Full-desktop streaming is how it is currently done in OBS, but a Creators variant of GeForce Experience wouldn’t constantly try to get me to install gaming drivers instead of Nvidia’s Studio drivers, and it would let me know when a new Studio driver is released, as 456.71 was this week to coincide with Max.
HP Z Series
Also coinciding with Max is HP’s announcement of new displays in its top-end Z series lineup. There are eight different displays, sized between 24 and 27 inches, that will ship early next year. They all have much thinner cabinets and are made of recycled materials. HP also claims to be able to limit blue-light output via its Eye Ease feature to protect your vision without affecting the perceived colors. It doesn’t seem like limiting blue light should even be possible, since blue is an important part of the light spectrum, but concern about blue light from electronic devices has really taken off in the past few years.
Four of the displays offer USB-C connections and are unique in having a built-in network interface card, giving laptop users a single connection to plug in for a high-res display, USB peripherals, power charging and, now, hard-wired LAN support. The NIC also allows for tracking various statistics about the displays themselves even when they’re not connected to a host system — a useful tool for larger enterprises.
Two of HP’s new displays will be in the premium DreamColor line, with true 10-bit color panels and high dynamic range. HDR has both video and VESA standards; from a video perspective, computer displays run HDR10 rather than HLG, but that doesn’t necessarily mean they are in the expected Rec. 2020 color space. VESA DisplayHDR has various brightness tiers, and these displays will be DisplayHDR 600-certified, a mid-level brightness for HDR. But they are listed as supporting the sRGB, DCI-P3 and Rec. 709 color spaces, so I am not clear on how that would work for people who are playing with HDR video, which lives in the Rec. 2020 or Rec. 2100 color spaces.
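For readers curious about the container-versus-gamut distinction behind that question, here is a minimal Python sketch. It uses the standard 3x3 matrix from ITU-R BT.2087 to map linear Rec. 709 RGB into Rec. 2020 primaries, showing that Rec. 709 content fits comfortably inside a Rec. 2020 container even when the panel itself can’t reproduce the full wider gamut. The function name is my own; only the matrix values come from the spec.

```python
# Mapping linear-light Rec. 709 RGB into the Rec. 2020 container,
# using the conversion matrix from ITU-R BT.2087 (rounded to 4 places).
BT709_TO_BT2020 = [
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
]

def rec709_to_rec2020(rgb):
    """Convert a linear Rec. 709 RGB triple to Rec. 2020 primaries."""
    return tuple(sum(row[i] * rgb[i] for i in range(3)) for row in BT709_TO_BT2020)

# Pure Rec. 709 red lands well inside the Rec. 2020 gamut,
# i.e. (0.6274, 0.0691, 0.0164) rather than (1, 0, 0):
print(rec709_to_rec2020((1.0, 0.0, 0.0)))
# White maps to white, since each matrix row sums to 1:
print(rec709_to_rec2020((1.0, 1.0, 1.0)))
```

Going the other direction is the open question: a display limited to sRGB/P3 primaries has to gamut-map Rec. 2020 values it cannot physically reproduce, which is presumably what these panels would do with HDR video.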
Either way, I am looking forward to trying them out when the time comes, and they should pair well with my ZBook x2, delivering 100 watts of power, connecting to my LAN and giving me 10-bit HDR color at UHD resolution to match my integrated DreamColor display.
Applications and Use
But Max isn’t just about seeing what new software and hardware updates are now available. There are also breakout sessions on application design and usage, creativity and a variety of other topics. Some of these were interactive live streams from teams of people who work with the software, while others were prerecorded tutorials about how to use various features.
Previously, these were all taught live in front of an audience of attendees. This year, the creator is available to chat with viewers during the scheduled playback, but something is missing compared to the live interaction that used to take place at the in-person conference, where we were all sitting in the room watching the presenter manipulate the software in front of us. The program is never going to crash in a prerecorded demonstration, and render and processing time can be cut out for efficiency, but it is a less authentic representation of the software usage experience.
Livestreaming has a greater risk of things going wrong, but that risk is what strengthens the viewer’s connection to whoever or whatever they are watching. Adobe seems to recognize the value of that realtime connection with artists, which is the basis of its application livestreaming tools, but it hasn’t applied that thinking to these breakout sessions. Prerecorded or not, there are all sorts of things to be learned from the sessions at Max. Most of the video-related sessions are fairly introductory, presumably by design given Max’s target audience.
Besides watching most of the mainstream creativity sessions, I focused on learning more about After Effects because there is always more to learn about AE, and Adobe keeps adding more to it. I use AE frequently, but based on the tasks I work on, I usually use it in the same way I have for years, rarely dabbling in the new features. But the 3D Camera Tracker, Content-Aware Fill and the new RotoBrush 2 are all things I could use some more instruction about.
Adobe put on an impressive event, considering it was the first time it has attempted Max online, but I am a bit online-conferenced out and looking forward to returning to doing these things in person, or at least in real time, instead of watching them like a TV channel. Still, I am eager to try out the software Adobe showed off, and I have a few new ideas I want to explore after listening to the various creative speakers.
Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.