By Mike McCarthy
Back in 2016, NewTek announced its “Network Device Interface” (NDI) protocol as a way to transfer HD video signals in real time over IP networks. It was designed to replace SDI transmission and routing of high-fidelity video signals, usually within a facility. SDI was used to connect tape decks, computers and monitors to routers and switches, which would direct and process the signal. The vision was to replace all of that coaxial cabling infrastructure with IP-based Ethernet packets, running on the same networks and cabling that were already needed to support the data networks.
NDI is a compressed alternative to the SMPTE 2022 and 2110 standards, which require much more bandwidth for uncompressed video routing, usually over 10GbE. NDI uses its own codec to compress video around 15:1, allowing it to run over existing Gigabit networks. There is also an “HX” variant of NDI that uses a lower-bit-rate, H.264-based level of compression optimized for higher resolutions and wireless networks.
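To see why that roughly 15:1 ratio matters, here is a back-of-the-envelope bandwidth check. The 10-bit 4:2:2 1080p60 figures are my own illustrative assumptions, not official NDI specifications:

```python
def uncompressed_mbps(width, height, fps, bits_per_pixel):
    """Raw video bandwidth in megabits per second."""
    return width * height * fps * bits_per_pixel / 1e6

# 10-bit 4:2:2 video carries about 20 bits per pixel.
raw = uncompressed_mbps(1920, 1080, 60, 20)
ndi = raw / 15  # assumed ~15:1 compression

print(f"Uncompressed 1080p60: {raw:.0f} Mbps")  # ~2488 Mbps -- needs 10GbE
print(f"~15:1 compressed:     {ndi:.0f} Mbps")  # ~166 Mbps -- fits Gigabit
```

Uncompressed HD at roughly 2.5Gbps clearly won't fit down a Gigabit link, while the compressed stream at around 166Mbps leaves room for several channels on the same wire.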
There are now a variety of products, both hardware and software, that use NDI to communicate between different stages of a workflow, usually targeted toward live production. But the software toolset that has been developed to allow those workflows offers a number of other interesting options when used slightly out of its intended context.
NewTek offers the basic NDI software toolset as a free download on its website. Among other things, it includes:
NDI Studio Monitor — Monitors and records any available NDI stream on the network; can also overlay streams, apply other modifications and re-transmit the result to the network.
NDI Screen Capture — Captures the images from your desktop display(s) and outputs them as NDI channels on the network. Also supports KVM control over that channel on Windows.
NDI Screen Capture HX — Captures the images from your desktop display to a more compressed NDI HX stream via GPU-based NVENC for better performance, with KVM option.
NDI Virtual Input — Presents any NDI source to the system as a webcam, allowing that source to be used in video telecommunication apps.
NDI for Adobe Creative Cloud — Plugin that allows application output from Adobe’s Premiere Pro, After Effects or Character Animator to be transmitted as an NDI stream. Also allows import and editing of files from captured NDI streams.
Working Remotely
This software toolset offers a number of interesting workflow possibilities for remote groups or individuals working between multiple systems, or even on a single system. Basically, NDI sources take video data from an application, compress it into the NDI protocol and make it available to other resources on the network. NDI receivers, or viewers, take that data stream from the network or from another application on the local system and display it or make it available in another program. NDI Virtual Input is a receiver tool that lets nearly any application that supports webcam input use a live NDI feed. NDI Screen Capture turns your GPU output into an NDI feed available to any NDI receiver. The Creative Cloud plugin turns your Adobe application into an NDI source. Studio Monitor can process and display incoming NDI feeds while simultaneously acting as an NDI source for the resulting data.
This makes it much easier to generate high-end tutorials and other screen-capture content, combined in real time for streaming instead of edited together in post. You can easily switch between the application UI and the program output when demonstrating a software function, or use Character Animator to generate an avatar from your webcam input, as many gaming streamers do.
There are also applications that bring NDI to other devices, like HX Capture and HX Camera for iPhone, which stream a screen or camera feed as an NDI source over a Wi-Fi connection, and NDI Monitor TV, which lets you view NDI streams on an Apple TV 4K. One of the key benefits of NDI is its widespread support, allowing otherwise incompatible products to pass images between each other.
NewTek sells a variety of products that enable and leverage NDI workflows, from its Spark SDI interfaces to its TriCaster production switching solutions. There are also other vendors, like BirdDog and Magewell, that make their own NDI-based hardware products, and companies like Sienna and vMix have integrated NDI into their software. It is also supported in applications like OBS, which is what I have been using. OBS can bring in any NDI feed as a source for your program and can also output its own NDI feed of the mixed stream.
Realtime Recording
While I do very little work in real time, I use SDI to capture NLE output for quick encodes, and I could use NDI for realtime encodes of my Character Animator projects. Setting NDI as the output in Mercury Transmit and recording that stream in NDI Studio Monitor allows realtime recording of anything you can play back in real time in Premiere Pro, After Effects or Character Animator, with alpha-channel support. It can even overlay the output from one program over another, which allowed me to composite a Lego puppet I made over a screen capture or Premiere playback output in real time.
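That alpha-channel overlay is standard "over" compositing. As a minimal sketch of what the receiving tool does per pixel (normalized float values, straight alpha; the function name and values are my own for illustration):

```python
def over(fg_rgb, fg_alpha, bg_rgb):
    """Composite a foreground pixel with alpha over a background pixel."""
    return tuple(f * fg_alpha + b * (1 - fg_alpha)
                 for f, b in zip(fg_rgb, bg_rgb))

# A red puppet pixel at 50% opacity over a blue screen-capture pixel:
print(over((1.0, 0.0, 0.0), 0.5, (0.0, 0.0, 1.0)))  # (0.5, 0.0, 0.5)
```

Because the NDI stream carries the alpha channel alongside the RGB data, the receiver can perform this blend live, rather than requiring a keyer or a post-production composite.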
The Virtual Input function is especially interesting to me since so many people are working remotely now, which is something I've been doing for a while. Usually my boss and I, who are thousands of miles apart, just pass projects back and forth that link to duplicate media. I use VNC over a VPN to operate his system when needed, which offers limited collaboration. But combining the NDI output from Premiere with Virtual Input lets me stream the output of my timeline to my boss over Skype for instant feedback on changes I make to a clip or sequence. He sees a compressed version of the output, but I get easier access to the software UI and better responsiveness than when using VNC.
By combining a number of these functions with OBS, and possibly a second system, I could make some great software tutorials in real time. Scan Converter sends my display output to a second system on the network; combined with a Character Animator render of my mini-figure avatar and my Premiere or After Effects output (using the NDI plugin for Creative Cloud), a live stream can easily be created from those sources. This could theoretically be done on a single system with at least two displays (one for the demoed app, the other to control the switching and streaming), but NDI allows it to just as easily be run from a second system. I could dedicate a third system to Character Animator if needed, and NDI would let me seamlessly link it in as long as they are all on the same network. Gaming streamers use a similar setup on Twitch, but NDI makes the whole process much easier than it otherwise would be.
Live-Animated Podcast
An even more interesting application of a number of these utilities would be to combine them across locations. I am looking at producing a live-animated podcast with someone else who would be remote. He would run his puppet in Character Animator and output over NDI to a virtual camera, which would be his source in Skype or Zoom. This is how you animate yourself for video calls, but we can take it a step further. If I connect to a call with him, I should then be able to screen-capture his feed via Region of Interest to an NDI feed on my system.
Once I have his puppet and background as an NDI feed, I can do the same thing in Character Animator with a puppet of my own and output it with no background as a separate NDI feed. I can then composite those two sources in Studio Monitor (or OBS, for more flexibility) and stream or record the result. The outcome of all this is that he and I discuss something as if we were on a teleconference call with each other, and an animated video of both of us in the same frame is created live, all through the magic of Character Animator, the internet and NDI.
It is easy to see that the flexibility of NDI opens all sorts of interesting workflow possibilities to explore in a variety of different applications. I am super-excited to see where it leads as new developments are released and more applications and features are supported.
Mike McCarthy is a technology consultant with extensive experience in film post production. He started posting technology info and analysis at HD4PC in 2007. He broadened his focus with TechWithMikeFirst 10 years later.