Introduction to Video Latency

2022.01.20

In the field of video transmission, latency is an inevitable phenomenon that can have a significant impact on production workflows and the final viewing experience. This blog introduces the basic concept of latency, why it matters, the factors that affect it, and some effective ways to reduce it.

What is Video Latency?

Video latency is essentially the time required for live video to travel from its source to its destination. It is often measured as "end-to-end" latency – the time required for a single frame of video to travel from the camera to the final display. This time varies greatly depending on the transmission mechanisms used. For broadcast television, typical latency is between 5 and 10 seconds, while live streaming can have latencies of 30 seconds or higher. For real-time video production and more demanding or interactive streaming applications – such as gaming and sports betting – latency of less than one second is often essential.
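To make the "end-to-end" idea concrete, below is a minimal sketch of measuring one-way transport latency by stamping probe packets with the sender's clock and taking the difference on arrival. This is an illustrative sketch rather than Magewell code: the receiver address is a placeholder, it captures only the network leg (capture, encoding, decoding and display time would all add to the figure), and it assumes the sender's and receiver's clocks are synchronized, for example via NTP or PTP.

    import socket, struct, time

    RECEIVER = ("203.0.113.10", 9000)  # placeholder address of the receiving end

    def send_probe(sock: socket.socket) -> None:
        # On the sending machine: stamp the probe with the wall-clock time in nanoseconds.
        sock.sendto(struct.pack("!q", time.time_ns()), RECEIVER)

    def report_probe(sock: socket.socket) -> None:
        # On the receiving machine: compare the embedded timestamp against local time.
        data, _ = sock.recvfrom(8)
        (sent_ns,) = struct.unpack("!q", data)
        print(f"one-way transport latency: {(time.time_ns() - sent_ns) / 1e6:.1f} ms")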

Why Does Latency Matter?

The degree of latency that is acceptable to users will vary depending on the application scenario. While high video latency in some situations may create a bad user experience – such as the oft-cited example of hearing your neighbor cheering for a football touchdown they saw on TV 30 seconds before you see it on your stream – in other cases low latency is not very important. For example, when distributing non-time-sensitive video content in one direction to a large audience (such as streaming a music or artistic performance), higher latency may be an acceptable tradeoff in order to achieve better video quality. Similarly, intermediate processing between broadcast production and playout – such as adding live captions or applying content protection to prevent piracy – also adds to the end-to-end latency, but may be both necessary and acceptable.

For live production and time-sensitive streaming, however, the latency must be reduced as much as possible to ensure excellent interactivity and the best viewing experience. Common application scenarios where low latency is needed include:

  • Bi-directional interviews: To keep the conversation smooth and natural, the video streams between the two locations must maintain very low latency. Otherwise, the speakers may end up talking over each other because neither is aware that the other has already started talking.
  • Remote production: Since the on-site camera and remote production staff are in different locations, the production staff need to obtain the feed with low latency to ensure seamless collaboration.
  • Remote control: When operating video equipment remotely, such as controlling PTZ cameras or computer screens, the latency of the video stream must be as low as possible to ensure accurate operation.
  • Security and public safety: High latency could mean that by the time personnel see an incident on the video, they have lost many precious seconds in their response time.

What Impacts Video Latency?

In any transmission workflow, many components and factors have an impact on the end-to-end latency. While the latency of any individual component may be small on its own, the cumulative effect can be considerable, as the budget sketch after this list illustrates. Some of the main contributors to video latency include:

  • The transmission protocol being used, such as a streaming protocol. Protocols that require extensive “handshaking” and error checking as packets are transmitted typically incur higher latency. Protocols and technologies such as RTMP, NDI® and SRT all have different latency characteristics.
  • The network type (such as a LAN versus the public Internet) and how the transmitting and receiving devices are connected to the network (such as wired Ethernet, Wi-Fi or 4G/5G).
  • The total transmission distance. For example, streaming between two different continents will have higher latency than streaming within the same city, both because of the signal propagation time over the physical distance and because the streams must traverse more network hops.
  • Each device that processes the video. From the camera, switcher and encoder to the decoder and final display – and any intermediate processing steps in between – each device in the video transmission workflow adds latency. Notably, software-based processing (such as encoding and decoding) generally has considerably higher latency than hardware-based equivalents.
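To see how these individually small contributions add up, here is a toy latency budget. The per-stage figures are hypothetical placeholders rather than measurements; actual values depend on the specific equipment, settings and network.

    # Hypothetical per-stage latency figures, in milliseconds (illustrative only).
    stages_ms = {
        "camera capture":    17,   # roughly one frame period at 60 fps
        "encode":            35,
        "network transport": 40,
        "receive buffer":   120,   # protocol jitter buffer, e.g. an SRT latency window
        "decode":            25,
        "display":           17,
    }

    total = sum(stages_ms.values())
    print(f"end-to-end latency: {total} ms")
    for stage, ms in stages_ms.items():
        print(f"  {stage:<17} {ms:>4} ms  ({ms / total:.0%})")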

Ways to Reduce End-to-End Video Latency

There are many ways to minimize video latency without reducing video quality. While some of the above factors (such as distance) are out of users’ control, there are two effective measures that users can often implement:

  1. Choose a combination of a hardware encoder and hardware decoder. Compared with software solutions, hardware encoders and decoders have dedicated resources to perform encoding and decoding tasks more efficiently. They are also not subject to operating system scheduling or CPU contention from other programs running on a host computer. This greatly reduces latency while ensuring stable, reliable video transmission. For more information, please refer to our previous blog Latency Comparison Test of Software and Hardware Encoding & Decoding.
  2. Choose the appropriate delivery technology or protocol. For transmitting video signals across a LAN for video production or internal distribution, NDI® is recommended for its ability to transport high-quality video, audio, and metadata locally with very low latency. (The end-to-end transmission latency for NDI® using Magewell Pro Convert devices is only about 50ms.) If you need to deliver video signals over the Internet, SRT is recommended: it enables secure, reliable, low-latency delivery of high-quality video between multiple locations, even over unpredictable networks like the public Internet. A minimal configuration sketch follows this list.
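As a concrete illustration of tuning a protocol for latency, the sketch below launches FFmpeg to relay a local stream over SRT. This is a hedged example rather than a Magewell workflow: it assumes an FFmpeg build with libsrt support, the input file and destination address are placeholders, and FFmpeg expresses the SRT latency option in microseconds (120000 below corresponds to a 120ms buffer). A smaller buffer lowers delay; a larger one rides out more packet loss and jitter.

    import subprocess

    # Placeholder source and destination; assumes FFmpeg was built with libsrt.
    cmd = [
        "ffmpeg", "-re", "-i", "input.ts",   # read the source at its native frame rate
        "-c", "copy", "-f", "mpegts",        # pass through without re-encoding, as MPEG-TS
        # The 'latency' URL option sets SRT's receive/peer latency window; lower
        # values reduce delay at the cost of less time to recover lost packets.
        "srt://203.0.113.10:9000?mode=caller&latency=120000",
    ]
    subprocess.run(cmd, check=True)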