Latency
What is latency?
Latency is the time it takes for information to move from point A to point B. It’s expressed in units of time, usually milliseconds. Video latency, more specifically, describes the time between a frame being captured and that frame appearing on the end user’s screen.
So, if you’re streaming live video, your viewers will never see you exactly in real time; they will always see you with some delay. When it comes to latency, there are no standards that govern what counts as “high” and what counts as “low.” Latency that’s “low” is simply lower than typical latency values.
Online video streaming has a wide latency range, with standard latency typically falling anywhere between 1 and 18 seconds. Most online hosting platforms and video streaming services consider 2-6 seconds low latency and 0.2-2 seconds ultra-low latency.
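As a rough illustration of those ranges, here is a minimal Python sketch that computes glass-to-glass latency as display time minus capture time and buckets it using the thresholds above. The timestamps and cutoffs are illustrative assumptions, not values from any particular platform.

```python
import time

# Thresholds taken from the ranges above, in seconds (illustrative cutoffs).
ULTRA_LOW_MAX = 2.0   # 0.2-2 s is considered ultra-low latency
LOW_MAX = 6.0         # 2-6 s is considered low latency

def classify_latency(capture_ts: float, display_ts: float) -> str:
    """Classify glass-to-glass latency: time from frame capture to display."""
    latency = display_ts - capture_ts
    if latency <= ULTRA_LOW_MAX:
        return f"{latency:.1f} s - ultra-low latency"
    if latency <= LOW_MAX:
        return f"{latency:.1f} s - low latency"
    return f"{latency:.1f} s - standard latency"

# Hypothetical example: a frame captured now reaches the viewer 4 seconds later.
captured_at = time.time()
displayed_at = captured_at + 4.0
print(classify_latency(captured_at, displayed_at))  # "4.0 s - low latency"
```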
Factors affecting latency
Latency is affected by several factors. You might be able to deal with some, but others could be too expensive or impractical to change.
Here are some of the most important things that can influence your live streaming latency:
- Bandwidth: Higher bandwidth means a faster connection and less congestion. Your data has a bigger pipe to travel through, so each chunk of video spends less time waiting to be sent.
- Connection type: The type of connection affects data transmission rates and speeds. Fiber-optic connections, for example, transmit video faster than wireless internet.
- Encoding: Your encoder needs to be optimized to compress and send the signal to the receiving device with as little delay as possible.
- Video format: Larger file sizes take longer to transmit over the internet, which increases streaming latency.
- Distance: Your videos can have an increased delay if you are located far away from your ISP, internet hub, or satellites (see the rough arithmetic sketch after this list).
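To make the bandwidth and distance factors concrete, here is a back-of-the-envelope sketch. The segment size, uplink bandwidth, and distance are made-up numbers chosen only to show how the two delays are calculated.

```python
# Rough, illustrative arithmetic: two of the delays that add up to overall latency.

segment_size_bits = 2 * 8_000_000      # a hypothetical 2 MB video segment, in bits
bandwidth_bps = 10_000_000             # a hypothetical 10 Mbps uplink

# Transmission delay: time needed to push the whole segment onto the wire.
transmission_delay = segment_size_bits / bandwidth_bps   # 1.6 s

distance_km = 2_000                    # hypothetical distance to the ingest server
signal_speed_km_s = 200_000            # roughly 2/3 the speed of light, typical for fiber

# Propagation delay: time for the signal to physically travel that distance.
propagation_delay = distance_km / signal_speed_km_s      # 0.01 s

print(f"Transmission: {transmission_delay:.2f} s, propagation: {propagation_delay:.3f} s")
```

With these made-up numbers, the size of the pipe matters far more than the physical distance, which is why bandwidth usually tops the list above.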
You can reduce the latency of your live streams simply by changing encoder settings, internet service providers, or the type of connection. But streaming protocols also play a key role in providing a good streaming experience.
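As one hedged example of what “changing encoder settings” can look like, the sketch below assembles an FFmpeg command line with lower-latency x264 options. The input file, ingest URL, and stream key are placeholders, and the right values depend on your encoder, platform, and protocol.

```python
import subprocess

# Hypothetical RTMP ingest URL and stream key; replace with your platform's values.
INGEST_URL = "rtmp://live.example.com/app/STREAM_KEY"

ffmpeg_cmd = [
    "ffmpeg",
    "-i", "input.mp4",        # placeholder input; usually a capture device for live streams
    "-c:v", "libx264",
    "-preset", "veryfast",    # faster encoding means less per-frame processing delay
    "-tune", "zerolatency",   # disables lookahead and frame buffering in x264
    "-g", "60",               # shorter keyframe interval so playback can start sooner
    "-c:a", "aac",
    "-f", "flv",              # RTMP ingest expects an FLV container
    INGEST_URL,
]

subprocess.run(ffmpeg_cmd, check=True)
```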
Use cases for low latency
Low latency comes with tradeoffs. To get low latency on YouTube, for example, you’d have to give up streaming in 4K resolution. For YouTube’s version of ultra-low latency, you won’t be able to live stream in 1440p, either.
For some types of live streams, this tradeoff is worth it, though. The following types of broadcasts are better off with low or ultra-low latency:
- Streams that require two-way communication: If you’re live streaming a Q&A session and you plan to take questions from the audience, you should aim for low latency. Audiences will expect it, and it will facilitate better interaction.
- Online video games: The action must be reflected on the player's display in real time. Any lag between an action and its appearance on screen will compromise the gameplay and gaming experience.
- Online casinos and sports betting: Low latency lets players gamble in real time, or as close to it as possible. It also reduces the chance that someone gains the upper hand simply because their feed arrives sooner.
- Live auctions: Remote bidders can participate in a live auction right from their homes through video conferencing or video chatting. Low latency live streaming is imperative so that remote bidders can compete on equal footing with the people present at the physical location.
- Video chat: For video chatting solutions like Skype, even a small lag can cause temporary breakdowns in communication. Fast data transmission and low latency are necessary so that the people on both ends can hold a seamless, uninterrupted conversation.
The more interactive your content becomes, the more important timeliness becomes.