Allow your users to upload content directly to Mux.
Direct Uploads allow you to provide an authenticated upload URL to your client applications so content can be uploaded directly to Mux without needing any intermediary steps. You still get to control who gets an authenticated URL, how long it's viable, and, of course, the Asset settings used when the upload is complete.
The most common use-case for Direct Uploads is in client applications, such as native mobile apps and the browser, but you could also use them to upload directly from your server or in a command line tool. Any time you don't feel the need to store the original on your own, just generate a signed URL and push the content directly.
Let's start by walking through the simplest use case of getting a file directly into Mux.
The first step is creating a new Direct Upload with the Mux Asset settings you want. The Mux API will return an authenticated URL that you can use directly in your client apps, as well as an ID specific to that Direct Upload so you can check the status later via the API.
curl https://api.mux.com/video/v1/uploads \
-X POST \
-H "Content-Type: application/json" \
-u MUX_TOKEN_ID:MUX_TOKEN_SECRET \
-d '{ "new_asset_settings": { "playback_policy": ["public"], "video_quality": "basic" }, "cors_origin": "*" }'
Once you've got an upload object, you'll use the authenticated URL it includes to make a PUT request that includes the file in the body. The URL is resumable, which means if it's a really large file you can send your file in pieces and pause/resume at will.
async function uploadVideo() {
  // videoUri here is the local URI to the video file on the device.
  // This can be obtained with an ImagePicker library like expo-image-picker.
  const videoResponse = await fetch(videoUri);
  const blob = await videoResponse.blob();

  // Create an authenticated Mux URL. This request should hit your
  // backend and return a "url" in the response body.
  const uploadResponse = await fetch('/backend-api');
  const uploadUrl = (await uploadResponse.json()).url;

  try {
    await fetch(uploadUrl, {
      method: 'PUT',
      body: blob,
      headers: { 'content-type': blob.type },
    });
    console.log('Upload is complete');
  } catch (error) {
    console.error(error);
  }
}
If you were following along with these examples, you should find new Assets in the Mux Dashboard with the settings you specified in the original upload create request, and the video you uploaded in the second step!
If the upload doesn't work via cURL, make sure you've put quotes around the upload URL; it contains characters that most shells will otherwise try to interpret.
The examples above are a great way to upload a one-off file into Mux, but let's talk about how this workflow looks in your actual application. Typically you're going to want to do a few things: create the Direct Upload from your server, hand the authenticated URL to your client, and track webhook events so you can tie the resulting Asset back to data in your own system.
Just like Assets, Direct Uploads have their own events, and the Asset created from the upload has the usual events as well. When you receive the video.upload.asset_created event you'll find an asset_id key that you could use in your application to tie the Asset back to the upload, but that gets tricky if your application misses events or they arrive out of order. To keep things simple, we like to use the passthrough key when creating an Asset. Let's look at how the passthrough workflow would work in a real application.
Upload reliably with our Upload SDKs
We provide SDKs for Android, iOS, iPadOS, and web frontends that handle the difficult parts of the upload process, such as handling large files and preprocessing video for size and cost. Once your backend has created an authenticated URL for the upload, you can give it to one of our Upload SDKs to reliably process and upload the video.
For more information, check out our upload SDK guides.
Next.js React example
with-mux-video is a full open-source example application that uses Direct Uploads:
npx create-next-app --example with-mux-video with-mux-video-app
Another open-source example application is stream.new (GitHub repo: muxinc/stream.new):
git clone git@github.com:muxinc/stream.new.git
Both of these example applications use Next.js, UpChunk, Mux Direct Uploads, and Mux playback.
Create the /upload route in the application
In the route we build to create and return a new Direct Upload, we'll first create a new object in our application that includes a generated ID and all the additional information we want about that Asset. Then we'll create the Direct Upload and include that generated ID in the passthrough field.
const { json, send } = require('micro');
const Mux = require('@mux/mux-node');
const { v1: uuid } = require('uuid');

// This assumes you have MUX_TOKEN_ID and MUX_TOKEN_SECRET
// environment variables.
const mux = new Mux();

// All the 'db' references here are going to be total pseudocode.
const db = yourDatabase();

module.exports = async (req, res) => {
  const id = uuid();
  // Go ahead and grab any info you want from the request body.
  const assetInfo = await json(req);

  // Create a new upload using the Mux SDK.
  const upload = await mux.video.uploads.create({
    // Set the CORS origin to your application.
    cors_origin: 'https://your-app.com',
    // Specify the settings used to create the new Asset after
    // the upload is complete.
    new_asset_settings: {
      passthrough: id,
      playback_policy: ['public'],
      video_quality: 'basic',
    },
  });

  db.put(id, {
    // Save the upload ID in case we need to update this based on
    // 'video.upload' webhook events.
    uploadId: upload.id,
    metadata: assetInfo,
    status: 'waiting_for_upload',
  });

  // Now send back that ID and the upload URL so the client can use it!
  send(res, 201, { id, url: upload.url });
};
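From the client's side, calling that route might look something like the following sketch; the endpoint path and the body fields are placeholders from this example, not a fixed API:
// Hypothetical client-side usage of the /upload route above.
const response = await fetch('/upload', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  // Whatever asset info you want stored alongside the upload.
  body: JSON.stringify({ title: 'My first video' }),
});

// `id` is your application's ID for this upload; `url` is the authenticated
// Mux URL you can PUT the file to (or hand off to UpChunk, covered below).
const { id, url } = await response.json();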
Excellent! Now we've got a working endpoint to create new Mux uploads that we can use in our Node app or deploy as a serverless function. Next we need to make sure we have an endpoint that handles the Mux webhooks when they come back.
const { json, send } = require('micro');

// More db pseudocode.
const db = yourDatabase();

module.exports = async (req, res) => {
  // We'll grab the request body again, this time grabbing the event
  // type and event data so we can easily use it.
  const { type: eventType, data: eventData } = await json(req);

  switch (eventType) {
    case 'video.asset.created': {
      // This means an Asset was successfully created! We'll get
      // the existing item from the DB first, then update it with the
      // new Asset details.
      const item = await db.get(eventData.passthrough);
      // Just in case the events got here out of order, make sure the
      // Asset isn't already set to ready before blindly updating it!
      if (!item.asset || item.asset.status !== 'ready') {
        await db.put(item.id, {
          ...item,
          asset: eventData,
        });
      }
      break;
    }
    case 'video.asset.ready': {
      // This means the Asset is ready! This is the final state of an
      // Asset in this stage of its lifecycle, so we don't need to
      // check anything first.
      const item = await db.get(eventData.passthrough);
      await db.put(item.id, {
        ...item,
        asset: eventData,
      });
      break;
    }
    case 'video.upload.cancelled': {
      // This fires when you decide you want to cancel an upload, so you
      // may want to update your internal state to reflect that it's no
      // longer active. For upload events, eventData is the upload itself,
      // so its ID matches the uploadId we saved earlier.
      const item = await db.findByUploadId(eventData.id);
      await db.put(item.id, { ...item, status: 'cancelled_upload' });
      break;
    }
    default:
      // Mux sends webhooks for *lots* of things, but we'll ignore those for now.
      console.log('some other event!', eventType, eventData);
  }

  // Respond so Mux knows we received the event.
  send(res, 200, 'received');
};
Great! Now we've got our application listening for events from Mux, then updating our DB to reflect the relevant changes. You could also do cool things in the webhook handler, like pushing events to your client applications via Server-Sent Events or WebSockets, as sketched below.
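As one illustration of that idea, here's a minimal (entirely hypothetical) Server-Sent Events setup; the broadcast helper is a placeholder you'd wire into the webhook handler above:
// A hypothetical SSE endpoint: clients connect here, and the webhook
// handler above could call broadcast() to push updates in real time.
const clients = new Set();

module.exports = (req, res) => {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  });
  clients.add(res);
  req.on('close', () => clients.delete(res));
};

// e.g. broadcast('asset.ready', item) from the 'video.asset.ready' case.
function broadcast(event, data) {
  for (const res of clients) {
    res.write(`event: ${event}\ndata: ${JSON.stringify(data)}\n\n`);
  }
}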
In general, just making a PUT request with the file in the body is going to work fine for most client applications and content. When the files start getting a little bigger, you can stretch that by streaming the file from disk into the request, as sketched below. With a reliable connection, that can take you to gigabytes' worth of video, but if that request fails, you or your customer are going to have to start the whole thing over again.
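Here's a rough sketch of that streaming approach in Node, assuming uploadUrl is the authenticated URL your backend handed back; the file path is a placeholder:
const fs = require('fs');
const https = require('https');

const filePath = './my-video.mp4';
const { size } = fs.statSync(filePath);

const req = https.request(uploadUrl, {
  method: 'PUT',
  headers: { 'Content-Length': size },
});

req.on('response', (res) => console.log('Upload finished with status', res.statusCode));
req.on('error', console.error);

// Stream the file from disk instead of buffering it all into memory.
fs.createReadStream(filePath).pipe(req);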
In those scenarios where you have really big files and potentially need to pause/restart a transfer, you can chunk up the file and use the resumable features of the upload endpoint! If you're doing it in a browser we wrote UpChunk to help, but the process isn't nearly as scary as it sounds.
With NPM
npm install --save @mux/upchunk
With yarn
yarn add @mux/upchunk
With CDN
<script src="https://cdn.jsdelivr.net/npm/@mux/upchunk@2"></script>
import * as UpChunk from '@mux/upchunk';

// Pretend you have an HTML page with an input like:
// <input id="picker" type="file" />
const picker = document.getElementById('picker');

picker.onchange = () => {
  const getUploadUrl = () =>
    fetch('/the-backend-endpoint').then((res) => {
      if (!res.ok) throw new Error('Error getting an upload URL :(');
      return res.text();
    });

  const upload = UpChunk.createUpload({
    endpoint: getUploadUrl,
    file: picker.files[0],
    chunkSize: 5120, // Uploads the file in ~5MB chunks (value is in KB)
  });

  // Subscribe to events
  upload.on('error', (err) => {
    console.error('💥 🙀', err.detail);
  });

  upload.on('progress', (progress) => {
    console.log(`So far we've uploaded ${progress.detail}% of this file.`);
  });

  upload.on('success', () => {
    console.log("Wrap it up, we're done here. 👋");
  });
};
If you'd rather implement the resumable flow yourself, here's how it works:
- Each chunk needs to be a multiple of 256KB (256 * 1024 bytes). For example, if you wanted to have 20MB chunks, you'd want each one to be 20,971,520 bytes (20 * 1024 * 1024). The exception is the final chunk, which can just be the remainder of the file. Bigger chunks mean a faster upload, but think about each one as its own upload in the sense that if it fails, you only restart that chunk.
- Make a PUT request like we did for "normal" uploads, just with each individual chunk as the body and two additional headers:
  - Content-Length: the size of the current chunk you're uploading.
  - Content-Range: which bytes you're currently uploading. For example, if you've got a 10,000,000 byte file and you're uploading in ~1MB chunks, this header would look like Content-Range: bytes 0-1048575/10000000 for the first chunk.
- If the response is a 308, you're good to continue uploading! The endpoint will respond with a 200 OK or 201 Created when the upload is completed.
If that sounds daunting, just reach out! If we don't have a library we can point you to for your client environment, we can help you write one.
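To make those rules concrete, here's a rough Node sketch of the chunked flow; uploadInChunks, the chunk size, and the lack of per-chunk retries are all illustrative choices, not a canonical implementation:
const fs = require('fs/promises');

async function uploadInChunks(uploadUrl, filePath) {
  // Chunk size must be a multiple of 256KB; this is ~16MB.
  const chunkSize = 64 * 256 * 1024;
  const file = await fs.open(filePath, 'r');
  const { size } = await file.stat();

  try {
    for (let start = 0; start < size; start += chunkSize) {
      // The final chunk can just be the remainder of the file.
      const length = Math.min(chunkSize, size - start);
      const chunk = Buffer.alloc(length);
      await file.read(chunk, 0, length, start);

      const res = await fetch(uploadUrl, {
        method: 'PUT',
        // fetch sets Content-Length for us based on the body.
        headers: { 'Content-Range': `bytes ${start}-${start + length - 1}/${size}` },
        body: chunk,
      });

      // A 308 means that chunk landed and we should keep going;
      // 200 or 201 means the whole upload is complete.
      if (![200, 201, 308].includes(res.status)) {
        throw new Error(`Chunk failed (${res.status}), retry from byte ${start}`);
      }
    }
  } finally {
    await file.close();
  }
}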