Video content has been getting a lot of love these past couple of years. Some of it is just for fun, but businesses are learning to rely on video to help them attract and engage an audience. That’s a sentiment 87% of marketers could agree with. It shouldn't surprise you that there’s a need for faster, better, and more immediate video.
In live streaming, low latency critically influences the sense of immediacy of the video. Some types of live streams might need it more than others, but it’s still one of the key technical characteristics you need to understand about live streaming. So let’s see what low latency is, when you need it, and how to achieve it.
What is latency in video streaming?
Let’s get straight to it: latency means “delay.” When you want to send information from point A to point B, the latency is the time it takes the information to appear at point B after leaving point A. That is, at least, the simplest way to understand it.
Video latency, a more specific application of the term, describes the time between a frame being captured and that frame appearing on the end user’s screen. So, if you’re streaming live video, latency means that your viewers will never see you exactly in real time — they will always see you with a delay.
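To make the definition concrete, here is a small Python sketch that treats end-to-end (“glass-to-glass”) latency as the sum of the delays introduced at each stage of a live streaming pipeline. The stage names and numbers below are illustrative assumptions, not measurements from any real setup:

```python
# Illustrative (made-up) per-stage delays in a live streaming pipeline, in seconds.
pipeline_delays = {
    "capture": 0.03,
    "encoding": 0.30,
    "first_mile_upload": 0.20,
    "cdn_delivery": 0.50,
    "player_buffering": 2.00,   # usually the biggest contributor
    "decoding_and_render": 0.05,
}

# "Glass-to-glass" latency is simply the sum of every stage's delay.
glass_to_glass = sum(pipeline_delays.values())
print(f"End-to-end latency: {glass_to_glass:.2f} s")  # End-to-end latency: 3.08 s
```

Note how the player’s buffer dwarfs everything else in this sketch — that is typically where the biggest latency savings are found.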
What is low latency?
We express latency in units of time. If the delay between you making the footage and it appearing on viewers’ screens is two seconds, we say that the streaming has a latency of two seconds. Whether that particular value is good or bad — low enough or too high — is a whole other question.
The thing with latency is that there are no standards that govern what is “high” and what is “low.” What we think of when we say “low” latency is usually “low when compared with the average in that field of broadcasting.”
Online video streaming has a wide latency range, with values at the high end resting between 30 and 60 seconds. To give you a reference for how high these values are, surveys show that more than half of video developers expect to achieve latencies of less than five seconds. That’s what’s considered “low latency” in live streaming.
When is low latency especially required?
Low latency comes with tradeoffs. If you’d like to broadcast your video live on YouTube, for example, you’d have to make some adjustments. To get what YouTube considers “low latency,” you’d have to give up streaming in 4K resolution. For YouTube’s version of ultra-low latency, you won’t be able to live stream in 1440p, either.
Still, there are a number of cases where the tradeoff would be worth it. Some live streams wouldn’t lose any of their charm with standard latency. Some, however, would be unimaginable at anything but low or ultra-low latency. Here are the situations where low latency is called for:
- Streams that require two-way communication: If you’re live streaming a Q&A session and you plan to take questions from the audience, you should aim for a low latency. Audiences will expect it, and it will facilitate better interaction.
- Online video games: Online video games must reflect the action in real time on the player's display. Any lag between the action and its display on the screen will compromise the gameplay and gaming experience.
- Online casinos and sports betting: A short transmission time or low latency enables the players to gamble in real time, or as close to it as possible. This reduces the chance that someone will have the upper hand thanks to a lower latency.
- Live auctions: Remote bidders can participate in a live auction right from their homes through video conferencing or video chat. Low-latency live streaming is imperative so that remote bidders can compete fairly with the people present at the physical location.
- Video chat: For video chatting solutions like Skype, a simple lag can cause temporary breakdowns in communication. Fast transmission of data and low latency are necessary so that the people on both sides can have a seamless and uninterrupted conversation.
For a regular stream where you don’t plan to interact with your audience, you can afford not to think much about latency. But the more interactive your content becomes, the more timeliness matters. Once monetary transactions are involved, as with live auctions or betting, ultra-low latency becomes paramount.
Important factors affecting latency
In your quest to achieve low latency for your live streams, you’ll come up against some limitations that are beyond your control. Latency is affected by several factors. You might be able to deal with some, but others could be too expensive or impractical to change.
Here are some of the most important things that can influence your live streaming latency:
- Bandwidth: Higher bandwidth means a bigger pipe and less congestion, so your data travels faster.
- Connection type: The type of connection affects data transmission rates. Fiber-optic connections, for example, transmit video faster than wireless internet.
- Encoding: A lot depends on the encoder, and it needs to be optimized to send a signal to the receiving device with as little delay as possible.
- Video format: Larger file sizes mean that it will take longer to transmit the file via the internet, increasing the streaming latency.
- Distance: Your videos can have an increased delay if you are located far away from your ISP, internet hub, or satellites.
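To illustrate the distance factor: light in optical fiber travels at roughly two-thirds of its vacuum speed, so physical distance alone puts a hard floor under your latency, no matter how good the rest of the pipeline is. A back-of-the-envelope calculation in Python (the distance figure and velocity factor are approximations):

```python
SPEED_OF_LIGHT_KM_S = 299_792   # speed of light in a vacuum, km/s
FIBER_VELOCITY_FACTOR = 0.67    # light moves ~2/3 as fast inside optical fiber

def propagation_delay_ms(distance_km: float) -> float:
    """One-way signal travel time over optical fiber, in milliseconds."""
    return distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_VELOCITY_FACTOR) * 1000

# New York to London is roughly 5,570 km as the crow flies:
print(f"{propagation_delay_ms(5570):.1f} ms one way")
```

Real routes are longer than the great-circle distance and pass through routers that add their own delays, so actual figures are higher — but the physics alone already accounts for tens of milliseconds on intercontinental streams.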
You can do a lot to reduce the latency of your live streams simply by changing encoder settings, internet service providers, or the type of connection. But the real elephants in the room here are the streaming protocols and the role they play in providing a good streaming experience.
Streaming protocols to deliver low-latency video streams
A streaming protocol is a set of rules that govern how data goes from the point of its origin to its destination. When you’re live streaming video, these protocols are there to ensure that your encoder and the streaming service are on the same page when it comes to exchanging information.
In some cases, you’ll be able to pick the streaming protocol you use. Usually, however, it depends on which protocols are supported by the encoders and the platforms to which you’re streaming. Let’s look at some of the more widely used protocols today.
WebRTC
Web Real-Time Communication (WebRTC) was developed by Google for sub-second latency data exchange between browsers. The open-source protocol, released in 2011, found use in peer-to-peer video chat solutions like Google Hangouts.
The protocol is ideal for real-time data transfer and video conferencing. But you may have to compromise a bit on video quality, as speed is the main focus. You also need a complex server setup to deploy WebRTC, which is why most CDNs are not compatible with it at present.
Some WebRTC streaming solutions use the cloud to convert live video streams to WebRTC.
RTMP
Real-Time Messaging Protocol (RTMP) was Macromedia’s solution for low-latency communication. The protocol breaks data into chunks to transmit audio and video signals consistently. There are several variations of RTMP that cater to different kinds of connections.
RTMP was initially difficult to scale, but the advent of cloud technologies has solved the problem. Now you can achieve low latencies using RTMP to deliver your videos with great speed.
Many CDNs have now dropped support for RTMP delivery since the demise of Flash Player, although RTMP remains widely used for ingest — sending the stream from your encoder to the platform.
HLS & DASH
HLS and DASH are alternatives to WebRTC and can achieve latencies of around five seconds. Both protocols cut the processed video into small streamable segments, which are packaged into a container format such as CMAF before being sent to the end user. The smaller the video segments, the lower your latency.
CDNs forward each viewing request to the origin server, which then sends the appropriate segments to the viewing device. The use of CMAF, or Common Media Application Format, can theoretically reduce the latency further, down to about one second.
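Because players typically buffer a few segments before starting playback, segment duration largely determines HLS/DASH latency. A simplified estimate in Python — the three-segment buffer is a common rule of thumb, not a fixed requirement:

```python
def segment_latency_estimate(segment_duration_s: float,
                             buffered_segments: int = 3) -> float:
    """Rough HLS/DASH latency: the player holds a few full segments
    before it starts playing, so latency scales with segment length."""
    return segment_duration_s * buffered_segments

# Classic 6-second segments vs. shorter 2-second segments:
print(segment_latency_estimate(6))  # 18.0 seconds behind live
print(segment_latency_estimate(2))  # 6.0 seconds behind live
```

This is why “make the segments smaller” is the first lever for lowering HLS and DASH latency, and why CMAF’s chunked transfer, which lets players consume partial segments, can push the estimate down further.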
FTL
Update: FTL was developed by the streaming platform Mixer, owned by Microsoft. Unfortunately, Mixer was shut down after failing to scale against its competitors. However, we believe an understanding of this protocol can still be useful, so we’re keeping the information about the technology.
FTL, or Faster Than Light, was Mixer’s own low-latency streaming protocol. It was developed specifically to support interactive videos, where users can interact with the content in real time. Broadcasters could enjoy almost zero delay when streaming to Mixer channels.
To use FTL, you needed a reliable network connection and Mixer compatibility. Otherwise, you had to stick with RTMP or other protocols.
Multistreaming with low latency
If you want to stream to multiple platforms at the same time, low latency becomes even more important. You are broadcasting to several channels at once and need something reliable to cut down the delays.
When you use a multistreaming service such as Restream, you’re technically not communicating with the platforms where your audiences are watching the stream. You are only communicating with the multistreaming service’s server and letting the server communicate with the platforms.
That’s why it’s important to use a streaming service that supports the fastest streaming protocols that are also supported by the platforms. Usually, this means communicating with the service’s server via RTMP and letting the service talk to each platform via the best protocol that platform supports. All of this requires minimal involvement from you.
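The fan-out idea can be sketched in a few lines of Python. The endpoint URLs and stream key below are hypothetical placeholders, not real ingest addresses:

```python
def fan_out(stream_key: str, platforms: dict[str, str]) -> list[str]:
    """Build the destination URLs a multistreaming server pushes to.

    You send a single stream to the service; the service re-broadcasts
    it to every configured platform endpoint in parallel.
    """
    return [f"{base_url}/{stream_key}" for base_url in platforms.values()]

# Hypothetical platform ingest endpoints and a made-up stream key:
endpoints = {
    "platform_a": "rtmp://ingest.platform-a.example/live",
    "platform_b": "rtmp://ingest.platform-b.example/live",
}
print(fan_out("my-secret-key", endpoints))
```

The key point is that your upload happens once; the service’s servers, which usually sit on much faster links than your home connection, handle the duplication.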
Video latency is a crucial factor for anyone who wants to do high-performance live streaming, whether they are marketers, entrepreneurs, or gamblers and gamers. If you stream, you need to achieve a low latency so that your viewers have a pleasing viewing experience without any interruptions.
The best way to cut your video delays is to use a fitting protocol like WebRTC. If you’re multistreaming, you should ensure that you’re using the fastest possible protocol to stream to the multistreaming service. From there, you can let Restream do the rest.