The Mux Developer Hub

Welcome to the Mux developer hub. You'll find comprehensive guides and documentation to help you start working with Mux as quickly as possible, as well as support if you get stuck. Let's jump right in!

Get Started

Direct Upload

Allow your users to upload content directly to Mux.

Direct Uploads allow you to provide an authenticated upload URL to your client applications so content can be uploaded directly to Mux without needing any intermediary steps. You still get to control who gets an authenticated URL, how long it's valid, and, of course, the Asset settings used when the upload is complete.

The most common use case for Direct Uploads is in client applications, such as native mobile apps and the browser, but you could also use them to upload directly from your server or a command-line tool. Any time you don't need to store the original file yourself, just generate a signed URL and push the content directly to Mux.

Let's start by walking through the simplest use case of getting a file directly into Mux.

1. Create an authenticated Mux URL

The first step is creating a new Direct Upload with the Mux Asset settings you want. The Mux API will return an authenticated URL that you can use directly in your client apps, as well as an ID specific to that Direct Upload so you can check the status later via the API.

Using cURL:

curl https://api.mux.com/video/v1/uploads \
  -X POST \
  -H "Content-Type: application/json" \
  -u "${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}" \
  -d '{ "new_asset_settings": { "playback_policy": ["public"] } }'

Or with the Mux Node SDK (@mux/mux-node):

const Mux = require('@mux/mux-node');

// This assumes you have MUX_TOKEN_ID and MUX_TOKEN_SECRET
// environment variables.
const { Video } = new Mux();

Video.Uploads.create({
  cors_origin: '', // set this to your application's origin
  new_asset_settings: {
    playback_policy: 'public'
  }
}).then(upload => {
  // upload.url is what you'll want to return to your client.
});

2. Use that URL to upload in your client

Once you've got an upload object, you'll use the authenticated URL it includes to make a PUT request that includes the file in the body. The URL is resumable, which means if it's a really large file you can send your file in pieces and pause/resume at will.

Using cURL:

curl -v -X PUT -T myawesomevideo.mp4 "$URL_FROM_STEP_ONE"

From a Node server, streaming the file from disk:

const fs = require('fs');
const request = require('request');

const uploadUrl = /* Authenticated URL from step 1 */;

fs.createReadStream('./myawesomevideo.mp4').pipe(request.put(uploadUrl));

In the browser, using UpChunk (@mux/upchunk):

const UpChunk = require('@mux/upchunk');

// This assumes there's an <input id="file-picker" type="file" /> on the page.
const filePicker = document.getElementById('file-picker');

const url = /* the URL from step 1. */;

filePicker.onchange = function () {
  const file = filePicker.files[0];

  const upload = UpChunk.createUpload({
    // Normally this would be retrieved via an API request to an endpoint
    // you control that would return an authenticated URL.
    endpoint: url,
    file
  });

  upload.on('success', () => console.log('We did it, everyone!'));
};

If you were following along with these examples, you should find new Assets in the Mux Dashboard with the settings you specified in the original upload create request, containing the video you uploaded in the second step!

Using Direct Uploads in your application

The examples above are a great way to upload a one-off file into Mux, but let's talk about how this workflow looks in your actual application. Typically you're going to want to do a few things:

  • Authenticate the request that gives the user a signed URL so random people don't start ingesting Assets into your Mux account.
  • Save information in your application about the file when the user creates the upload, such as who uploaded it and when, details about the video like title, tags, etc.
  • Make sure the Asset that's ultimately created from that upload is associated with that information.

Just like Assets, Direct Uploads have their own events, and then the Asset created off the upload has the usual events as well. When you receive the video.upload.asset_created event you'll find an asset_id key that you could use in your application to tie the Asset back to the upload, but that gets tricky if your application misses events or they come out of order. To keep things simple, we like to use the passthrough key when creating an Asset. Let's look at how the passthrough workflow would work in a real application.

Creating an /upload route in the application

In the route we build to create and return a new Direct Upload, we'll first create a new object in our application that includes a generated ID and all the additional information we want about that Asset. Then we'll create the Direct Upload and include that generated ID in the passthrough field.

const { json, send } = require('micro');
const uuid = require('uuid/v1');
const Mux = require('@mux/mux-node');

// This assumes you have MUX_TOKEN_ID and MUX_TOKEN_SECRET 
// environment variables.
const { Video } = new Mux();

// All the `db` references here are going to be total pseudocode.
const db = yourDatabase();

module.exports = async (req, res) => {
  const id = uuid();
  // Go ahead and grab any info you want from the request body.
  const assetInfo = await json(req);
  // Create a new upload using the Mux SDK.
  const upload = await Video.Uploads.create({
    // Set the CORS origin to your application.
    cors_origin: '',

    // Specify the settings used to create the new Asset after
    // the upload is complete.
    new_asset_settings: {
      passthrough: id,
      playback_policy: 'public',
    },
  });

  await db.put(id, {
    // Save the upload ID in case we need to update this based on
    // `video.upload` webhook events.
    uploadId: upload.id,
    metadata: assetInfo,
    status: 'waiting_for_upload',
  });

  // Now send back that ID and the upload URL so the client can use it!
  send(res, 201, { id, url: upload.url });
};

Excellent! Now we've got a working endpoint to create new Mux uploads that we can use in our Node app or deploy as a serverless function. Next we need to make sure we have an endpoint that handles the Mux webhooks when they come back.

const { json, send } = require('micro');

// More db pseudocode.
const db = yourDatabase();

module.exports = async (req, res) => {
  // We'll grab the request body again, this time grabbing the event
  // type and event data so we can easily use it.
  const { type: eventType, data: eventData } = await json(req);

  switch (eventType) {
    case 'video.asset.created': {
      // This means an Asset was successfully created! We'll get
      // the existing item from the DB first, then update it with the
      // new Asset details.
      const item = await db.get(eventData.passthrough);
      // Just in case the events got here out of order, make sure the
      // Asset isn't already set to ready before blindly updating it!
      if (!item.asset || item.asset.status !== 'ready') {
        await db.put(eventData.passthrough, { ...item, asset: eventData });
      }
      break;
    }
    case 'video.asset.ready': {
      // This means the Asset is ready for playback! This is the final
      // state of an Asset in this stage of its lifecycle, so we don't need
      // to check anything first.
      const item = await db.get(eventData.passthrough);
      await db.put(eventData.passthrough, { ...item, asset: eventData });
      break;
    }
    case 'video.upload.cancelled': {
      // This fires when you decide you want to cancel an upload, so you
      // may want to update your internal state to reflect that it's no longer
      // active.
      const item = await db.findByUploadId(eventData.id);
      await db.put(item.id, { ...item, status: 'cancelled_upload' });
      break;
    }
    default:
      // Mux sends webhooks for *lots* of things, but we'll ignore those for now.
      console.log('some other event!', eventType, eventData);
  }

  // Acknowledge the webhook so Mux knows we received it.
  send(res, 200, 'Thanks for the webhook, Mux!');
};

Great! Now we've got our application listening for events from Mux, then updating our DB to reflect the relevant changes. You could also do cool things in the webhook handler like send your customers events via Server-Sent Events or WebSockets.

Handling really large files

In general, just making a PUT request with the file in the body is going to work fine for most client applications and content. When the files start getting a little bigger, you can stretch that by making sure to stream the file from the disk into the request. With a reliable connection, that can take you to gigabytes worth of video, but if that request fails, you or your customer are going to have to start the whole thing over again.

In those scenarios where you have really big files and potentially need to pause/restart a transfer, you can chunk up the file and use the resumable features of the upload endpoint! If you're uploading from a browser, we wrote UpChunk to help, but the process isn't nearly as scary as it sounds.

  • Split the file into chunks that are a multiple of 256KB (256 × 1024 bytes). For example, if you wanted 20MB chunks, each one should be 20,971,520 bytes (20 × 1024 × 1024). The exception is the final chunk, which can just be the remainder of the file. Bigger chunks mean fewer requests, but think of each chunk as its own upload: if one fails, only that chunk needs to be restarted.
  • Set a couple of headers:
    • Content-Length: the size of the current chunk you're uploading.
    • Content-Range: which bytes you're currently uploading. For example, if you've got a 1,000,000,000 byte file and you're uploading in 10MB chunks, this header would look like Content-Range: bytes 0-10485759/1000000000 for the first chunk.
  • Now use a PUT request like we were for "normal" uploads, just with those additional headers and each individual chunk as the body.
  • If the server responds with a 308, you're good to continue uploading! It will respond with a 200 OK or 201 Created when the upload is completed.
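The chunk math above can be sketched as a small helper (our own illustration, not part of any Mux library) that computes each chunk's byte range and the headers to send with it:

```javascript
// Chunk sizes must be a multiple of 256KB, except for the final
// chunk, which is just whatever bytes remain.
const CHUNK_MULTIPLE = 256 * 1024;

// Given a total file size and desired chunk size, return the byte
// range and headers for each PUT request. Note that Content-Range
// end offsets are inclusive.
function planChunks(fileSize, chunkSize) {
  if (chunkSize % CHUNK_MULTIPLE !== 0) {
    throw new Error('chunkSize must be a multiple of 256KB');
  }

  const chunks = [];
  for (let start = 0; start < fileSize; start += chunkSize) {
    const end = Math.min(start + chunkSize, fileSize) - 1; // inclusive
    chunks.push({
      start,
      end,
      headers: {
        'Content-Length': String(end - start + 1),
        'Content-Range': `bytes ${start}-${end}/${fileSize}`,
      },
    });
  }
  return chunks;
}
```

Each entry then becomes its own PUT request with those headers and that slice of the file as the body: a 308 response means keep sending chunks, and a 200 or 201 means you're done.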

If that sounds daunting, just reach out! If we don't have a library we can point you to for your client environment, we can help you write one.