
Review: Nvidia’s Founders Edition RTX 4090 — An Editor’s Perspective

By Brady Betzel

Nvidia has released its much-anticipated RTX 4090 GPU. It’s big and power-hungry, and I’ll get into those details later in this piece. When the product was released, I initially held off filing this review to see if Nvidia or Blackmagic (which showed a prerelease version of Resolve with AV1 encoding that only works on the new 40-series GPUs) would release any Easter eggs, but so far that hasn’t happened.

Whether they do or not, I plan on doing a more in-depth review once I’ve settled in and found the RTX 4090 sweet spots that will help editors and colorists. But for now, there are still some gems in the RTX 4090 that are worth checking out. (For a tech guru’s perspective, check out Mike McCarthy’s review.)

Founders Edition
The Nvidia GeForce RTX 4090 GPU comes in a few different flavors and iterations. I was sent the Founders Edition, which features the new Ada Lovelace architecture, streaming multiprocessors with double the FP32 throughput, the DLSS 3 platform, 16,384 CUDA cores, 24GB of GDDR6X memory, a 2.23GHz base clock with boost speeds up to 2.52GHz, and much more. You can find more in-depth technical specs on the Nvidia site, where you can also compare previous generations of Nvidia GPUs.

In this review, I am focusing on features that directly relate to video editors and colorists. For the most part, the RTX 4090 performs as expected, with a generational improvement over the RTX 3090. It’s faster and contains new updates, like DLSS 3 (an artificial intelligence-powered performance booster). Those features are typically gaming-focused and embrace technologies like optical flow to “create” higher-resolution images and interpolated frames that increase frame rates. That doesn’t typically mean much for us post nerds, unless you also play games, but with AI-adapted features becoming so prevalent, we are beginning to see speed increases in editing apps as well.

Resolve Prerelease
As editors, we need faster rendering, faster exporting and more efficient decoding of high-resolution media. We always hear about 8K or 4K, but you don’t always hear how much CPU and GPU power it takes to play that media back in real time, especially when you are editing with processor-hogging codecs like Red R3D, H.264 and more.

Inside DaVinci Resolve 18, I was able to play back all my standard testing files in real time without any effects on them. From UHD ProRes files to UHD Red R3D files, the RTX 4090 handled all of them. Even when I played back 8K UHD (7680×4320) ProRes files, I was pleasantly surprised at the smooth, real-time playback. All the files played back without using cache files, proxy files or prerendered media.

[Image: Magic Mask]

Keep in mind I was using a prerelease version of Blackmagic’s DaVinci Resolve (mentioned earlier) to harness the power of AV1 encoding. And AV1 is the real gem in the updated Nvidia RTX 4090 GPU architecture. This is why I mentioned “prerelease” in the last sentence. I’ve heard through the grapevine that a newly updated Resolve will be released sometime this fall and will include some of the features I’m about to go into. But for now, I’m sorry. That’s all I’ve got in terms of a release date.

So what is AV1? Think of the old tried-and-true H.264 and H.265 codecs but with roughly 30% smaller file sizes at equivalent quality. Without getting too far into the weeds on AV1, the codec came about when a group of big companies, including Intel, Nvidia and Google, wanted to create a royalty-free video codec with the same quality but more efficiency than existing codecs such as H.264 (AVC) and H.265 (HEVC). That is how the AV1 codec was born. AV1 is still on the ground floor, but with large companies like Nvidia adding new features such as AV1-capable dual encoders, and with nonlinear editing apps like Resolve adding encoding support, it will soon hit the mainstream.
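If you want to experiment with hardware AV1 encoding outside of Resolve, here is a minimal sketch that hands a ProRes master to FFmpeg’s NVENC AV1 encoder from Python. It assumes an FFmpeg build with the av1_nvenc encoder enabled and an RTX 40-series GPU in the machine; the filenames and bitrate are placeholders, not a recommended delivery spec.

```python
# Minimal sketch: hardware AV1 encode via FFmpeg's NVENC path.
# Assumes an FFmpeg build with av1_nvenc enabled and an RTX 40-series
# GPU; filenames and bitrate below are placeholders.
import subprocess

def encode_av1(src: str, dst: str, bitrate: str = "20M") -> None:
    """Transcode src to AV1 using the GPU's NVENC AV1 encoder."""
    cmd = [
        "ffmpeg",
        "-y",                 # overwrite output without asking
        "-i", src,            # e.g., a ProRes 422 HQ master
        "-c:v", "av1_nvenc",  # NVENC AV1 hardware encoder (40-series only)
        "-b:v", bitrate,      # target video bitrate
        "-c:a", "copy",       # pass the audio through untouched
        dst,
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    encode_av1("master_uhd_prores.mov", "delivery_uhd.mkv")
```

An MKV output container is used here because it happily carries AV1 video alongside pass-through PCM audio; if you re-encode the audio to AAC, MP4 works just as well.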

[Image: Face refinement]

Nvidia really took the bull by the horns on AV1, and Blackmagic followed along. In the prerelease version of Resolve that I used, I encoded the included 4K (UHD) 30fps ProRes 422 HQ clip provided by Nvidia, which has a run time of about 2 minutes and 7 seconds, to the new AV1 codec in about 17 seconds. And since no other card in my kit can export AV1 files, there is really no benchmark for me to compare it to. However, I did export the same sequence to an H.265-encoded file using an Nvidia RTX A6000 GPU, and that took about 39 seconds. I was kind of surprised, given that the A6000 contains double the memory and costs over double the price, but when I looked deeper into it, it made sense.
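For perspective, some quick back-of-the-napkin math on those numbers (the clip length and encode times are the ones quoted above):

```python
# Back-of-the-napkin math on the encode times quoted above.
clip_seconds = 2 * 60 + 7  # 4K ProRes source: 2 min 7 sec = 127 s

av1_encode = 17    # RTX 4090, AV1 export (seconds)
hevc_encode = 39   # RTX A6000, H.265 export (seconds)

print(f"AV1 on the 4090:    {clip_seconds / av1_encode:.1f}x real time")
print(f"H.265 on the A6000: {clip_seconds / hevc_encode:.1f}x real time")
# -> roughly 7.5x versus 3.3x faster than real time
```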

The RTX 4090 is a much newer card with much newer technology, including more than 5,600 additional CUDA cores. But for a pro who needs the larger 48GB frame buffer; a compact, two-slot design; and considerably lower power consumption, the RTX A6000 will fit better (literally). The RTX 4090 is physically large and takes up three slots.

[Image: Scene Detect]

AI, Editing and Color
Remember a bit earlier when I mentioned AI technology and how it’s creeping its way into the tools that video editors and colorists use? While the RTX 4090 is more of a gamer’s card, there are a few very specific updates that video editors and colorists will like. One of them, inside Resolve 18’s prerelease, is Magic Mask. I’ve used it before, and it is very good, but it’s also time-consuming, especially if you don’t have a very fast CPU/GPU. Lucky for us, the RTX 4090 has dramatically improved Magic Mask processing speeds. Nvidia reports that in its testing, the same Magic Mask job took 29 seconds on the RTX 3090, 17 seconds on the RTX 4090 and 34 seconds on the RTX A6000. Some other AI-improved features of Resolve 18 are Scene Detect, Super Scale and Optical Flow.

The Nvidia RTX 4090 has shown increased efficiency when compared to the RTX A6000. Besides the larger memory, frame lock and genlock are the standout A6000 features that will matter to users deciding between the two GPUs. For media creators, the RTX 4090 is a phenomenal GPU that will dramatically decrease export times, media processing times, effects render times and much more, which directly correlates to the “time is money” adage.

Power Needs, Cost
The RTX 4090 is a power-hungry beast, straight up. It takes up three slots’ worth of space, needs multiple power inputs and demands a beefy power supply. Nvidia’s specs put the RTX 4090 at 450W of board power versus 320W for the RTX 3090, and in terms of overall system power, the RTX 4090 requires at least an 850W supply, while the 3090 calls for 750W. Power delivery comes from either three PCIe eight-pin cables (via the included adapter) or a single 450W-or-greater PCIe Gen 5 (12VHPWR) cable. So you should probably aim for a power supply capable of producing at least 1,000 watts, keeping in mind that any other I/O cards you are running will also add to the power bill.
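To sanity-check your own build, it’s worth roughing out a power budget. Here’s a simple sketch; the component wattages besides the GPU are illustrative guesses, not measured numbers for any specific system:

```python
# Rough power-budget sketch; wattages other than the GPU's rated
# board power are illustrative placeholders, not measured values.
draws = {
    "RTX 4090": 450,              # Nvidia's rated board power
    "CPU": 250,                   # high-end desktop CPU under load
    "motherboard/RAM/fans": 75,
    "storage + I/O cards": 75,
}
total = sum(draws.values())
headroom = 1.2  # ~20% margin for transient power spikes

print(f"Estimated load: {total}W")
print(f"Suggested PSU:  {total * headroom:.0f}W or more")
# -> around 850W of load, pointing at a 1,000W-class supply
```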

Retail pricing for the RTX 4090 starts at $1,599. It’s not cheap, so if you have an RTX 3090 and don’t care about the AV1 encoding feature (whether for Resolve or for streaming apps like OBS), then you might be able to hold off on purchasing one. But if you are like me and want the latest and greatest, the Nvidia RTX 4090 will be the GPU to get. And if you are thinking about getting into other avenues of media — say, Unreal Engine or 3D modeling in apps like Maxon Cinema 4D with Otoy’s OctaneRender — you’ll find the RTX 4090 embraces those apps and even adds special features, such as denoising optimizations.

Watch this space for an Nvidia RTX 4090 follow-up review.


Brady Betzel is an Emmy-nominated online editor at Margarita Mix in Hollywood, working on shows like Life Below Zero and Uninterrupted: The Shop. He is also a member of the Producers Guild of America. You can email Brady at bradybetzel@gmail.com. Follow him on Twitter @allbetzroff.

 

