Reduce live stream latency
This guide covers types of latency, causes of latency, reconnect windows, and reduced latency.
Mux Video live streaming is built with RTMP ingest and HLS delivery. Standard glass-to-glass latency for RTMP ingest is around 30 seconds.
To clarify some terminology and industry jargon:
- Glass-to-glass latency: This is the latency we are talking about, and it's usually what people mean when they casually discuss "latency". It is the total time it takes for content to travel from the lens of the camera to the viewer's screen.
- Wall-clock time: Also referred to as "realtime". If you have a clock on the wall where you are capturing video content, this is the time on that clock.
The nature of HLS delivery means that clients are not necessarily synchronized. Some clients might be 20 seconds behind wall-clock time and others might be 30 seconds behind.
Where does the latency come from?
You don't have to worry about these gritty details when using Mux live streams, but to give you an idea of how a live stream works:

- Captured by a camera
- Processed by an encoder - If the computer running the encoder is running out of CPU, this process can fall behind and start lagging.
- Sent to an RTMP ingest server - This server is ingesting the video content in real time. This part is called the "first mile"; it happens over the internet, often on consumer or cellular network connections, so things like TCP packet loss and random network disconnects are always in play.
- Ingest server decodes and encodes - Assuming all the content is traveling over the internet fast enough, the encoder on the other end needs to keep up and have enough CPU available to package up segments of video as they come in. The encoder has to ingest video, build up a buffer of content, and then start decoding, processing, and encoding for HLS delivery.
- Manifest files and segments of video delivered - After all of that, files are created and delivered over HTTP through multiple CDNs to reach end users. Each file becomes available only after the entire segment's worth of data is ready. This part also happens over the internet, where the same risks around packet loss and network congestion are factors, especially for the last mile of delivery to the end user.
- Decoded and played on the client - When video makes it all the way to the client, the player has to decode and play back the video. Players do not play each segment on the screen as they receive it; they keep a buffer of playable video in memory, which also contributes to the glass-to-glass latency experienced by the end user.
When you consider each of the steps above, any point of that pipeline has the potential to slow down or get backed up. The more latency you can tolerate, the safer the system is and the lower the probability of an unhappy viewer. If any single step gets backed up momentarily, the whole system has a chance to catch up before an interruption in playback. And when everything is running smoothly, the player has extra time that it can spend downloading a higher-quality version of your content.
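To make the delivery and buffering steps concrete, here is a simplified sketch of a live HLS media playlist. The segment durations and filenames are illustrative, not copied from an actual Mux manifest:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:120
#EXTINF:6.000,
segment-120.ts
#EXTINF:6.000,
segment-121.ts
#EXTINF:6.000,
segment-122.ts
```

Notice there is no #EXT-X-ENDLIST tag while the stream is live; the player keeps re-fetching the playlist to discover new segments. With 6-second segments, a player that holds the commonly recommended buffer of about three segments is already 15-20 seconds behind the live edge before encoding, ingest, and CDN delivery are even counted.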
Reconnect windows
When an end-user is streaming from their encoder to Mux, we need to know how to handle situations when the client disconnects unexpectedly.
There are situations when a client disconnects on purpose: for example, hitting "Stop streaming" in OBS. Those are intentional disconnects; here we're talking about times when the client just stops sending video. In order to handle this, live streams have a reconnect_window. After an unexpected disconnect, Mux will keep the live stream "active" for the given period of time and wait for the client to reconnect and start streaming again.
When the reconnect_window expires, the live stream will transition back into the idle state. In HLS terminology, Mux will write the #EXT-X-ENDLIST tag to the HLS manifest. At this point, your player will consider the live stream to be over.
By default, reconnect_window is 60 (seconds). You can set it as high as 300 (5 minutes).
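If the default doesn't fit your use case, you can set the window when you create the live stream. Here is a minimal sketch, assuming reconnect_window is passed (in seconds) alongside the other creation parameters shown later in this guide:

```
POST https://api.mux.com/video/v1/live-streams
{
  "reconnect_window": 300,
  "playback_policy": ["public"],
  "new_asset_settings": {
    "playback_policy": ["public"]
  }
}
```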
Reduced latency
Mux live streams have an option for "reduced latency". When reduced_latency is enabled, we will treat your stream a little differently in order to minimize latency. This will get your latency down to about 15 seconds. Again, the resulting glass-to-glass latency is dependent on the player, so you might see some variance there.
There are two big tradeoffs to consider when using the reduced latency feature:
- No reconnect windows - Without reconnect windows, if your encoder disconnects for any reason, the live stream will transition to the idle state. A corresponding asset will be created as the recording of the live stream segment, and your player will see that the HLS stream has ended with the #EXT-X-ENDLIST tag (see the playlist sketch after this list). If the encoder later reconnects, you have to make sure your player properly reloads and starts streaming again from your stream URL.
- Less stability - Because of the tradeoffs mentioned above with regards to latency, reduced_latency mode is inherently less stable than standard latency.
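For illustration, when a reduced-latency stream goes idle, the live playlist sketched earlier would simply end with the end-of-list tag, which tells the player there is nothing more to fetch:

```
#EXTINF:6.000,
segment-122.ts
#EXT-X-ENDLIST
```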
Requirements for using the reduced_latency flag
You should only use this flag if you have control over the following:
- the encoder software
- the hardware the encoder software is running on
- the network the encoder software is connected to
Typically, home networks and mobile connections are not stable enough to reliably use the reduced_latency flag.
How to create a live stream with the reduced_latency option
Check out our Live Stream API Reference, find the reduced_latency flag, and include "reduced_latency": true in the request to create a live stream.
Here is how to add the flag to a live stream by making this POST request:
```
POST https://api.mux.com/video/v1/live-streams
{
  "reduced_latency": true,
  "playback_policy": ["public"],
  "new_asset_settings": {
    "playback_policy": ["public"]
  }
}
```
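If you're making the request with curl, the same call might look like the sketch below. It assumes your Mux access token credentials are available in MUX_TOKEN_ID and MUX_TOKEN_SECRET environment variables; substitute your own values.

```
# Sketch: create a live stream with reduced latency enabled.
# Assumes MUX_TOKEN_ID / MUX_TOKEN_SECRET hold a Mux access token (basic auth).
curl https://api.mux.com/video/v1/live-streams \
  -X POST \
  -H "Content-Type: application/json" \
  -u "${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}" \
  -d '{
    "reduced_latency": true,
    "playback_policy": ["public"],
    "new_asset_settings": { "playback_policy": ["public"] }
  }'
```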
You can read more about the reduced latency feature in this blog post about Reduced Latency.
What about Low-Latency HLS?
Apple has announced a preliminary low-latency HLS spec that will allow us to deliver sub-5-second latency with HLS delivery. The details are still being figured out, and support for this updated spec will require both server-side and player-side changes.
We have written extensively about Low-Latency HLS and its development on our blog, in June 2019 and again in January 2020.