
Working in HDR: AJA’s Kona 5 I/O Card

By Mike McCarthy

AJA’s Kona 5 I/O card has been available for two years and is the most recent in a long line of PCI expansion cards offering professional video interfaces for a variety of applications. While the card isn’t new, capabilities continue to be added to it through various software developments and firmware releases. The most recent of these is Adobe’s support of the Kona 5 as one of the few ways to monitor HDR content in Premiere Pro, over either SDI or HDMI.

With this update, many Premiere editors are going to have the opportunity to edit and view HDR content directly from their timelines on HDR displays for the first time. This will require an understanding of various new technologies and settings to get the best results for HDR workflows, which I intend to cover thoroughly as I work my way through the Kona card’s options and settings.

HDR (high dynamic range) has been a big buzzword in television for years now, but there has been limited content available to highlight the impressive visuals that this technology offers. This is not due to a lack of matching camera technology, which has been available longer than the displays, but to all of the workflow steps in between. Those intermediate steps have taken longer to develop and mature because increasing the dynamic range of electronic images while maintaining a consistent viewing experience across different mediums is much harder than increasing the resolution.

Increasing the resolution of a display system requires increasing the sampling frequency and bandwidth and shrinking the individual pixel components, but not much else. Moore’s Law has made increasing resolution relatively easy as everything made of silicon continues to shrink and operate more efficiently.

Increasing the dynamic range of an image requires more sensitive camera sensors and brighter displays that can also dim much darker for higher contrast. These are both hardware developments that have been available for a while, but unlike increasing resolution, which just requires more processing power, wider color spaces require bigger changes to software and the development of methods to convert between the available color spaces. This is further complicated by the fact that increasing the brightness of a displayed color increases its perceived saturation. That effect impacts different color wavelengths to varying degrees, leading to shifts in certain hues when brightness is increased. These perceptual differences, as well as a few others, must be compensated for when converting images between high and standard dynamic range formats.

Colorspace
Technically, the jump from SD (standard definition) to HD (high definition) involved the switch from the Rec. 601 color-space standard to the Rec. 709 color-space standard, which was such a small change that most people — even those working in that business — couldn’t tell the difference. The jump from HD to UHD (ultra-high definition) or 4K was supposed to involve a change to the Rec. 2020 color space, which offers a noticeably wider potential palette of colors.

But in practice, much 4K content was edited and finished at Rec. 709 and converted to Rec. 2020 in a final step before display, thereby not taking advantage of the newer capabilities. Rec. 2100 is the next step, laying out the color-space standards for HDR content in HD, UHD and even 8K. This new color space has the same primary color values as Rec. 2020 but allows a much wider range of brightness possibilities. And Rec. 2100 includes two totally separate approaches to how this can be done, with the mutually exclusive transfer functions of perceptual quantization (PQ) and Hybrid Log Gamma (HLG).

AJA ControlPanel
In the HDR section of the AJA control panel, users can set the colorimetry to SDR (Rec. 709), Rec. 2020, P3 or custom settings. They can also set the transfer function to SDR (gamma), PQ or HLG. Transfer functions are methods of mapping the digital values in a video file or signal to the actual physical light levels that the viewer will see. For standard-dynamic-range video, the gamma curve is the transfer function, and it is designed to emulate human vision to a degree. Humans don’t see light in a linear fashion, which is why an 18% grey card looks half as bright as a white one. This effect continues into the higher brightness values, necessitating more advanced ways to quantify those light values efficiently. Enter the PQ and HLG transfer functions.
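
To make the idea of a transfer function concrete, here is a minimal sketch (my own illustration, not AJA or Adobe code) of the familiar Rec. 709 gamma-style encoding curve, which maps linear scene light to the signal values a gamma display expects.

```python
# Illustrative sketch of a conventional gamma-style transfer function
# (the Rec. 709 camera encoding curve). Not AJA or Adobe code; just an
# example of mapping linear light to encoded signal values.

def rec709_oetf(linear):
    """Map linear scene light (0.0-1.0) to a gamma-encoded signal value."""
    if linear < 0.018:
        return 4.5 * linear                     # linear segment near black
    return 1.099 * linear ** 0.45 - 0.099       # power-law (gamma) segment

# An 18% grey card encodes to roughly 0.4 of the signal range, which is why
# it reads as about half as bright as white to the eye.
print(round(rec709_oetf(0.18), 3))   # ~0.409
```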

PQ is a nonlinear electro-optical transfer function that allows for the display of HDR content with brightness values up to 10,000 nits using the Rec. 2020 color space. PQ was codified as SMPTE ST 2084 and is the underlying basis for the “HDR10” standard. HDR10 is an open television standard that combines the Rec. 2020 color space, 10-bit color and the PQ transfer function with other frame and scene metadata related to average and maximum light levels. An HDR10 display uses this metadata, combined with info on its own unique hardware brightness levels, to interpret the incoming image signal in a way that maximizes the dynamic range of the displayed image while remaining consistent to viewers across different types of displays.
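
As an illustration of how PQ ties signal values to absolute brightness, here is a short sketch of the ST 2084 EOTF using the constants from the published standard. It is a reference formula for context, not anything specific to the Kona 5 or Premiere Pro.

```python
# Illustrative sketch of the SMPTE ST 2084 (PQ) EOTF, mapping a normalized
# 0.0-1.0 signal value to absolute light output in nits (cd/m2).
# Constants come from the published standard.

M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875

def pq_eotf(signal):
    """Convert a PQ-encoded signal (0.0-1.0) to display light in nits."""
    e = signal ** (1 / M2)
    num = max(e - C1, 0.0)
    den = C2 - C3 * e
    return 10000.0 * (num / den) ** (1 / M1)

print(round(pq_eotf(1.0)))     # 10000 nits, the top of the PQ range
print(round(pq_eotf(0.508)))   # ~100 nits, a typical SDR reference white
```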

HLG is also a nonlinear electro-optical transfer function, but a simpler one that uses the standard gamma curve for the darker half of the spectrum and switches to a logarithmic curve for the brighter side of the spectrum. This allows HLG content to be viewed directly on SDR monitors, with a slight change in how the highlights look and with increased detail at the cost of slightly lower maximum brightness. But on an HLG-aware HDR display, the brighter portions of the image have their brightness values logarithmically stretched to take advantage of the display’s greater dynamic range.
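
The hybrid nature of HLG is easiest to see in the curve itself. The sketch below (my own illustration of the Rec. 2100 / ARIB STD-B67 curve in its scene-light-to-signal form, not AJA code) shows the conventional gamma-like segment for the darker half of the signal and the logarithmic segment for the highlights.

```python
import math

# Illustrative sketch of the HLG encoding curve (Rec. 2100 / ARIB STD-B67).
# The lower half of the signal follows a square-root, gamma-like curve;
# brighter values switch to a logarithmic curve.

A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(linear):
    """Map normalized scene light (0.0-1.0) to an HLG signal value (0.0-1.0)."""
    if linear <= 1 / 12:
        return math.sqrt(3 * linear)             # SDR-like portion of the curve
    return A * math.log(12 * linear - B) + C     # logarithmic highlights

print(round(hlg_oetf(1 / 12), 3))   # 0.5 -- the crossover point
print(round(hlg_oetf(1.0), 3))      # 1.0 -- peak signal
```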

Also, greater dynamic range means a greater visual difference between dark and light areas of the spectrum, with more values in between. Digitally reproducing that greater range of values requires greater bit depth. 8-bit gamma-encoded images can reproduce a range of about six stops without visible banding; trying to display an 8-bit image with higher contrast than that results in visible steps between the individual colors, usually in the form of bands around bright points on the screen. Increasing the bit depth to 10-bit gives four times as many levels per channel and 64 times as many possible colors, extending that range to about 10 stops. Using a more complex transfer function, those same 10 bits can be used to display over 17 stops of range without visible banding if the display is bright enough. So all HDR content is stored and processed with at least 10 bits of color depth, regardless of whether it is PQ or HLG.
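
The arithmetic behind those bit-depth figures is straightforward:

```python
# Quick arithmetic behind the bit-depth claims above: moving from 8-bit to
# 10-bit quadruples the code values per channel and multiplies the total
# number of representable colors by 4^3 = 64.

levels_8  = 2 ** 8      # 256 levels per channel
levels_10 = 2 ** 10     # 1024 levels per channel

print(levels_10 // levels_8)                  # 4x levels per channel
print((levels_10 ** 3) // (levels_8 ** 3))    # 64x possible colors
```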

HDR content can be transmitted over HDMI connections or SDI cables. HDMI 2.0 allowed 4Kp60 over a single cable. The 2.0a version added support for HDR video in the form of PQ-based HDR10. The 2.0b version added support for HLG-based HDR content, and the Kona 5 supports both of these options from its HDMI 2.0b port. Users can choose between RGB and YUV output and 8-, 10- or 12-bit color output. I have been testing primarily at YUV-10 HLG at 4Kp24 because that is what most of my source footage is.

The Kona card has four bidirectional 12G-SDI ports, supporting frame sizes from HD up to 8Kp60 with quad 12G output. 12G-SDI has the bandwidth to send a 4Kp24 RGB signal over a single cable, or that signal can be divided across four 3G-SDI channels. When using a single 12G-SDI cable to output 4K, the Kona 5 can also output a separate down-converted 2K 3G signal on the other SDI monitoring output, allowing the use of older HD displays.
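
A rough back-of-the-envelope calculation shows why a single 12G link has room for 4Kp24 RGB. The figures below are my own math, count only active pixel data and ignore blanking and SDI framing overhead, so treat them as an approximation rather than a spec.

```python
# Rough estimate of active pixel data for a UHD 3840x2160, 24fps, 10-bit RGB
# signal. Blanking and SDI framing overhead are ignored, so the real payload
# is somewhat higher, but it still fits within a 12 Gbps link.

width, height, fps = 3840, 2160, 24
bits_per_pixel = 3 * 10     # RGB, 10 bits per channel

gbps = width * height * fps * bits_per_pixel / 1e9
print(f"{gbps:.2f} Gbps of active pixel data")   # ~5.97 Gbps, well under 12G
```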

When using quad-channel SDI, two different approaches can be taken to divide the data between the channels. The first is to break the frame into quadrants and send each zone on a separate cable. The second is 2-sample interleave (2SI), which alternates streams every two pixels, resulting in each cable carrying a viewable quarter-resolution image. Regardless of which approach is taken, a 4K signal is divided into four 3G HD streams, and 8K is divided into four 12G 4K streams. The AJA control panel allows user control over these modes in the SDI output section when needed. My monitor allowed me to successfully test the Kona 5’s HDMI output, quad 3G-SDI output and 12G-SDI output at 4K, as well as quad 12G output at 8K.
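
To visualize the difference between the two link mappings, here is a small sketch (my own example following the description above, not AJA’s implementation) that splits a stand-in UHD frame into quadrants versus 2-sample-interleave phases.

```python
import numpy as np

# Illustrative comparison of square division (quadrants) versus 2-sample
# interleave (2SI) for carrying one UHD frame over four SDI links.

frame = np.arange(2160 * 3840).reshape(2160, 3840)   # stand-in for a UHD frame

# Square division: four 1920x1080 quadrants, one per cable.
quadrants = [
    frame[:1080, :1920], frame[:1080, 1920:],
    frame[1080:, :1920], frame[1080:, 1920:],
]

# 2-sample interleave: alternate pairs of pixels across links on alternating
# lines, so each cable carries a viewable quarter-resolution picture.
pair_phase = (np.arange(3840) // 2) % 2     # 0,0,1,1,0,0,1,1,...
first_pair  = frame[:, pair_phase == 0]     # columns 0,1,4,5,...
second_pair = frame[:, pair_phase == 1]     # columns 2,3,6,7,...

two_si = [
    first_pair[0::2, :],    # link 1: even lines, first pair of each group
    second_pair[0::2, :],   # link 2: even lines, second pair
    first_pair[1::2, :],    # link 3: odd lines, first pair
    second_pair[1::2, :],   # link 4: odd lines, second pair
]

print(quadrants[0].shape, two_si[0].shape)   # both (1080, 1920)
```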

8K output from the Kona 5 card involves a separate firmware or bit file installation, which replaces the 4K color space conversion and keying functionality with 8K image processing. This is easy to do in AJA’s control panel app but requires a full shutdown of the system to activate that feature. It is easy to imagine that future cards will support 8K output without reflashing the card.

AJA’s software package also includes a media capture and playback application called AJA Control Room. It offers a number of file and format options and now fully supports HDR workflows in both PQ and HLG. Since Control Room operates the Kona card in YUV mode, it supports 8K playback at up to 60fps over quad 12G-SDI, which my display is able to accept. I was able to get 8K output to my display from Premiere Pro, but it dropped frames during ProRes playback, even at 24p.

Using AJA’s Control Room software, I was able to get smooth 8K playback of ProRes HDR files at both 24p and 60p, which is quite an impressive accomplishment on my older-generation workstation. I plan to run more tests on a newer system in the near future to see if Premiere can perform better with newer hardware.

Currently, in order to monitor 50p or 60p HDR sequences in Premiere Pro over SDI, the output must be set to 10-bit HLG. Editing at 24p or 30p offers the additional options of 12-bit PQ and 12-bit HLG output if your HDR display supports those modes.

PQ or HLG
At this point, HDR content can be stored and transmitted in PQ or HLG formats, and each approach has varying levels of support in different devices and environments. Streaming services (Netflix, Amazon) lean toward PQ-based HDR10 because they can serve up a separate SDR data stream to non-HDR devices. It’s simpler to deal with metadata in on-demand scenarios than in live environments, where HLG may be favored because it doesn’t require metadata to describe the image. Ultra HD Blu-ray discs are also encoded using PQ-based formats. Broadcast and satellite TV providers (BBC, DirecTV) favor HLG-based content because it can still be viewed directly on older SDR displays without any further processing.

Working in HDR requires using files that support at least 10-bit color and HDR color spaces. The most popular format will likely be Apple’s ProRes, which supports both PQ and HLG content. Other options include Sony’s XAVC format for 10-bit HLG content and JPEG 2000 in 12-bit PQ format. Both H.264 and HEVC can store HDR content in either form, depending on the encoder. Adobe currently only supports encoding those two formats to PQ as HDR10 in Media Encoder, but other options should continue to become available in the future. I expect HLG to be more popular as a distribution format, especially in the short term, due to its inherent compatibility with a wider array of playback devices.


Mike McCarthy is an online editor/workflow consultant with over 10 years of experience on feature films and commercials. He has been involved in pioneering new solutions for tapeless workflows, DSLR filmmaking and multi-screen and surround video experiences. Check out his site.

