
Add subtitles/captions to videos

Learn how to add subtitles or captions to your videos for accessibility and multi-language support.

Introduction to subtitles and captions

Subtitles and captions allow text overlays on a video to be shown at specified times. First, let's clarify these two terms, which are often used interchangeably.

  • Subtitles refer to on-screen text used for translation purposes.
  • Captions refer to on-screen text intended for deaf and hard-of-hearing audiences. If you see text like [crowd cheers], you are looking at captions.

In either case, Mux supports both in the form of WebVTT or SRT files, which can be human- or computer-generated. From Mux's perspective, these files are converted into "text tracks" associated with the asset. If the text track you provide contains captions, supply the attribute closed_captions: true when creating the text track.

The rest of this guide will use the term "subtitles" to refer to adding text tracks that can be either subtitles or captions.

How to add subtitles to your video

You can add subtitles to any video asset in Mux. To add subtitles, you will need to provide either an SRT or WebVTT file containing the subtitle information to the Mux API.

Here's an example of what a WebVTT file looks like:

WEBVTT

00:28.000 --> 00:30.000 position:90% align:right size:35%
...you have your robotics, and I
just want to be awesome in space.

00:31.000 --> 00:33.000 position:90% align:right size:35%
Why don't you just admit that
you're freaked out by my robot hand?

Create a subtitle track

When you create an asset in Mux using the create asset API, you can also include text tracks as part of the input. There's no limit on the number of text tracks you can include in the request.

The first input in your array of inputs must be the video file. After that, the caption tracks should be appended to the list, each including the source URL of the caption track plus additional metadata. Here's an example of that ordering:

{
  "input": [
    {
      "url": "{VIDEO_INPUT_URL}"
    },
    {
      "url": "https://tears-of-steel-subtitles.s3.amazonaws.com/tears-en.vtt",
      "type": "text",
      "text_type": "subtitles",
      "closed_captions": false,
      "language_code": "en",
      "name": "English"
    },
    {
      "url": "https://tears-of-steel-subtitles.s3.amazonaws.com/tears-fr.vtt",
      "type": "text",
      "text_type": "subtitles",
      "closed_captions": false,
      "language_code": "fr",
      "name": "Française"
    }
  ],
  "playback_policy": [
    "public"
  ],
  "encoding_tier": "baseline"
}

This will enable WebVTT subtitles in the HLS stream, which can then be used by many different players.
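If you're calling the API directly rather than using an SDK, a minimal sketch in Python (assuming the requests library and MUX_TOKEN_ID / MUX_TOKEN_SECRET environment variables for Basic auth) might look like this:

import os
import requests

# Mux API access token credentials, assumed to be set in the environment.
auth = (os.environ["MUX_TOKEN_ID"], os.environ["MUX_TOKEN_SECRET"])

payload = {
    "input": [
        {"url": "{VIDEO_INPUT_URL}"},
        {
            "url": "https://tears-of-steel-subtitles.s3.amazonaws.com/tears-en.vtt",
            "type": "text",
            "text_type": "subtitles",
            "closed_captions": False,
            "language_code": "en",
            "name": "English",
        },
    ],
    "playback_policy": ["public"],
    "encoding_tier": "baseline",
}

# Create the asset; the response includes the new asset's ID and playback IDs.
resp = requests.post("https://api.mux.com/video/v1/assets", json=payload, auth=auth)
resp.raise_for_status()
print(resp.json()["data"]["id"])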

You can also add text tracks to an existing asset using the create asset track API. This can be helpful for adding captions to live stream recordings once they have finished, or if you need to add, update, or remove languages for a video after it has been added to Mux.
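As a rough sketch under the same assumptions as above (requests plus API token credentials, with {ASSET_ID} standing in for the existing asset's ID), adding a French subtitle track might look like this:

import os
import requests

auth = (os.environ["MUX_TOKEN_ID"], os.environ["MUX_TOKEN_SECRET"])
asset_id = "{ASSET_ID}"  # ID of the asset to add the track to

track = {
    "url": "https://tears-of-steel-subtitles.s3.amazonaws.com/tears-fr.vtt",
    "type": "text",
    "text_type": "subtitles",
    "closed_captions": False,
    "language_code": "fr",
    "name": "Française",
}

# Attach the subtitle file to the existing asset as a new text track.
resp = requests.post(
    f"https://api.mux.com/video/v1/assets/{asset_id}/tracks",
    json=track,
    auth=auth,
)
resp.raise_for_status()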

Showing subtitles by default

To show subtitles by default, you can include an additional playback modifier with the HLS stream request like this:

https://stream.mux.com/{PLAYBACK_ID}.m3u8?default_subtitles_lang=en

The default_subtitles_lang playback modifier takes a valid BCP-47 language value and sets the DEFAULT attribute to YES for that language's text track in the HLS manifest. If there's no exact language match, the closest match of the same language is selected.

For instance, a subtitle text track with language en-US is selected for default_subtitles_lang=en. This accommodates regional variations and gives you more flexibility.

Video players will display the default text track for autoplaying videos, even when they are muted.

Using default_subtitles_lang with signed URLs

If you are using signed playback URLs, make sure you include the default_subtitles_lang parameter in your signed token.
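Here's a rough sketch using PyJWT, assuming you already have a Mux signing key ID and its RSA private key; playback modifiers like default_subtitles_lang are carried inside the token as claims rather than as query parameters:

import time
import jwt  # PyJWT

playback_id = "{PLAYBACK_ID}"
signing_key_id = "{SIGNING_KEY_ID}"
with open("mux_signing_key.pem") as f:
    private_key = f.read()  # RSA private key downloaded when the signing key was created

claims = {
    "sub": playback_id,               # the playback ID being signed
    "aud": "v",                       # "v" = video playback
    "exp": int(time.time()) + 600,    # token expiry (10 minutes here)
    "kid": signing_key_id,
    "default_subtitles_lang": "en",   # the playback modifier, included in the token
}

token = jwt.encode(claims, private_key, algorithm="RS256")
url = f"https://stream.mux.com/{playback_id}.m3u8?token={token}"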

Accessibility

The A11Y Project is a community-driven effort to make digital accessibility easier, and its checklist includes checking videos for accessibility.

With Mux videos, the jsx-a11y/media-has-caption rule fails because it looks for a <track> element inside the player. However, Mux videos include subtitles in the HLS manifest when you request the stream. If you have added text tracks to your Mux videos, you can safely disable this linting rule and still provide accessible video.

Workflow for generating subtitles

You may want to generate subtitle tracks for your Mux assets. These might be machine-generated, or human-generated by you or a third party. Some example third-party services you might use are Rev.com and Simon Says.

Using static renditions and webhooks from Mux, your automated flow might look like this:

  1. Create a Mux asset (either with a Direct Upload, an input parameter, or the recording of a live stream).
  2. Enable mp4_support on your asset, either at asset creation time or by updating the asset if it has already been created. See the Download your videos guide for details about how to do this.
  3. Wait for the video.asset.static_renditions.ready webhook. This lets you know that the mp4 rendition(s) are now available.
  4. Fire off a request to the third party you are using to create subtitles. Pass along the MP4 file; you'll get back either an SRT or WebVTT file when the subtitle track is ready.
  5. Wait for the subtitle track to be ready. When it is, make an API request to add it as a text track on your asset, as described above and sketched below.
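Putting steps 3–5 together, here's a rough sketch of a webhook handler, assuming Flask, the requests library, and a hypothetical request_transcription() helper that wraps whichever third-party transcription service you use. In practice the transcription step is usually asynchronous; this sketch waits synchronously for simplicity.

import os
import requests
from flask import Flask, request

app = Flask(__name__)
auth = (os.environ["MUX_TOKEN_ID"], os.environ["MUX_TOKEN_SECRET"])


def request_transcription(mp4_url):
    """Hypothetical helper: send the MP4 to your transcription service and
    return the URL of the finished SRT or WebVTT file."""
    raise NotImplementedError


@app.post("/mux-webhooks")
def handle_mux_webhook():
    event = request.get_json()

    # Step 3: the MP4 rendition(s) for the asset are now available.
    if event.get("type") == "video.asset.static_renditions.ready":
        asset_id = event["data"]["id"]
        playback_id = event["data"]["playback_ids"][0]["id"]
        mp4_url = f"https://stream.mux.com/{playback_id}/low.mp4"  # rendition name depends on your mp4_support setting

        # Step 4: hand the MP4 off to the transcription service.
        subtitle_url = request_transcription(mp4_url)

        # Step 5: add the finished subtitle file to the asset as a text track.
        requests.post(
            f"https://api.mux.com/video/v1/assets/{asset_id}/tracks",
            json={
                "url": subtitle_url,
                "type": "text",
                "text_type": "subtitles",
                "language_code": "en",
                "name": "English",
            },
            auth=auth,
        ).raise_for_status()

    return "", 200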
