
Mux Real-Time Video has been sunset and is unavailable for new usage. Existing access will end on December 31, 2023. We recommend migrating your application to our partner, LiveKit. Please reach out if you need more help or details.

Send and receive real-time video from an Android application

This guide contains instructions for setting up the Mux Spaces Android SDK. By the end of the guide you'll have a working app that will be able to connect to a Space, as well as send and receive media from the Space's participants.

1. Understand core abstractions


A Space is the basic abstraction for creating real-time communications with Mux. In order for clients to authenticate to a Space, they need to provide a signed JSON Web Token (JWT). For more information about signing JWTs, refer to this guide.


A participant is an abstraction of a single user in a space. A participant can be subscriber-only, or a publisher who sends one or more streams of audio or video media in the form of tracks.


A track is a single stream of media (audio or video). A participant can publish one or more tracks of media.

Creating a space

A space must be created either through the Mux dashboard or via the Mux API. See the Create a space section of the Real-Time Video guide for more details about creating a space.

If you already have Mux Access Tokens set up and just want to create a space from the terminal, use this command.

curl https://api.mux.com/video/v1/spaces \
  -X POST \
  -H "Content-Type: application/json" \
  -u ${MUX_TOKEN_ID}:${MUX_TOKEN_SECRET}

Authenticating into a space

To join a space from the Mux Real-Time Android SDK, you must sign a JWT using your signing key-pair. See the Sign a JWT section of the Real-Time Video guide.

Prerequisites for this example

In order to complete this example you should have experience with Android development and its tooling (Android Studio, Gradle, etc.), and a device to test on.


Here you can view the API documentation as Javadoc.

2. Application setup

Create the Android project:

  1. In Android Studio select File → New → New Project
  2. Select “Empty Activity”
  3. Fill out the Name and Package Name as you want
  4. The Minimum SDK must be at least 28 (Android 9)
  5. Our example will be in Java, but the SDK is also easy to use from Kotlin

Configure Maven imports and add a dependency on the MuxSpaces SDK:

  1. Our SDK is hosted in the Mux release Maven repository. We also include libwebrtc, which is hosted on JitPack. JavaDoc is included so you can explore the classes and methods in Android Studio.
  2. In the project settings.gradle, add the following two items to the repositories block inside dependencyResolutionManagement (typically under google() and mavenCentral()):
    maven { url '' }
    maven { url '' }
  3. Add the MuxSpaces SDK dependency in the dependencies section of the app module build.gradle script:
    implementation "com.mux.spaces:sdk:1.0.0"
  4. Add the AndroidX Activity dependency in the same dependencies section (only needed in this example for requesting permissions):
    implementation "androidx.activity:activity:1.5.1"
  5. Sync Gradle to download the dependencies
  6. Make sure your activity is a subclass of AppCompatActivity (this is the default used for the purposes of the example - plain Activity subclasses can also work).
  7. In your application manifest (AndroidManifest.xml) ensure android:allowBackup="false" in the application element, e.g.:
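For example, a minimal application element - only android:allowBackup matters here; the other attributes come from the default project template and will vary:

<application
    android:allowBackup="false"
    android:label="@string/app_name"
    android:theme="@style/Theme.AppCompat.Light">
    <!-- activity declarations as generated by the template -->
</application>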

Create the layouts

  1. We’re going to need to display our local video and any remote video, so we’re going to modify the layout in layout/activity_main.xml.
  2. Our layout doesn’t need to be clever, so let’s replace the default with a LinearLayout. See below for an example with a white background that makes a good backdrop for what you'll add next:
<?xml version="1.0" encoding="utf-8"?>
<!-- The id name is illustrative; later code uses it to add remote views -->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/activity_main_layout"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:background="#FFFFFF"
    android:orientation="vertical">
</LinearLayout>
  3. The Spaces Android SDK provides a view class, com.mux.sdk.webrtc.spaces.views.TrackRendererSurfaceView, which we will use for displaying local or remote video tracks. (Unsurprisingly, this is a SurfaceView subclass.) At this time audio tracks are automatically played and mixed when you are subscribed to them.
  4. Add an instance of that class to the LinearLayout:
<!-- The id name is illustrative -->
<com.mux.sdk.webrtc.spaces.views.TrackRendererSurfaceView
    android:id="@+id/activity_main_local_video"
    android:layout_width="match_parent"
    android:layout_height="360dp" />

Build and run the app. You should see an empty screen, with a space for your own video when it loads up.

3. Join a space

To join a Space we will need a JSON Web Token (JWT) that is generated by a server. Check out the Authentication section in the main getting started guide for more details. We will assemble the JWT into a SpaceConfiguration in the MainActivity onCreate method. (You will need to import SpaceConfiguration; if you have set up the dependency correctly, Android Studio should suggest the correct import.)
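A minimal sketch of building the configuration, assuming a builder-style API (newBuilder and setJWT are assumed method names - check the SDK Javadoc):

SpaceConfiguration spaceConfiguration = null;
try {
    // Builder method names are assumptions - consult the Javadoc
    spaceConfiguration = SpaceConfiguration.newBuilder()
            .setJWT("YOUR JWT")
            .build();
} catch (Exception e) {
    Toast.makeText(this, "Invalid configuration: " + e, Toast.LENGTH_LONG).show();
    return;
}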

Replace "YOUR JWT" in this example with the JWT that you create server-side. In a production application, your application should make a request to fetch the JWT from a server.

Next, create a class level variable for the Space, and when importing Space make sure to import "com.mux.sdk.webrtc.spaces.Space".

private Space space;

Inside the try/catch, after building the configuration, we obtain the Spaces singleton, which provides the core services for the SDK. (Rather like the services the browser provides to our web SDK.)

Spaces spaces = Spaces.getInstance(this);

We then get Spaces to retrieve the Space in onCreate:

space = spaces.getSpace(spaceConfiguration);

If an instance with the same configuration already exists, that instance will be returned. Simply having a Space doesn’t do anything: you must join it, and the Android SDK requires you to add a Space.Listener when you join. So we will create a class-level variable holding our Space.Listener:

private final Space.Listener spaceListener = new Space.Listener() {};

This is where all events relevant to the Space will occur, and empty default implementations are provided. All callbacks occur on the UI thread, so you can safely access UI components directly. (The SDK itself is multithreaded and should never block the UI thread. This is also why Participant and Track events arrive on Space.Listener on Android: the events arrive in exact order and are harder to lose while you’re doing something else.)

To test this out, let's trigger a Toast when the Space is joined, and let's add an onError override within the Listener:

private final Space.Listener spaceListener = new Space.Listener() {
    @Override
    public void onJoined(Space space, LocalParticipant localParticipant) {
        Toast.makeText(MainActivity.this, "Joined space "+space.getId()+" as "+localParticipant.getId(), Toast.LENGTH_LONG).show();
    }

    @Override
    public void onError(Space space, MuxError muxError) {
        Toast.makeText(MainActivity.this, "Error! "+muxError.toString(), Toast.LENGTH_LONG).show();
    }
};

Lastly, join the space. Add this line at the bottom of the try/catch in the onCreate function:

space.join(spaceListener);

4. Subscribe to participants

Now that our app is running and we're connected to a space, we can subscribe to remote participants. There are events in Space.Listener that deal with this:

  1. onParticipantTrackPublished: This will fire when a remote participant publishes a track.
  2. onParticipantTrackSubscribed: This will fire when we have chosen to actively subscribe to a track and our subscription has been accepted. As of now the SDK will automatically subscribe to all tracks that are added.

Update the layout to add views for the remote participants:

<?xml version="1.0" encoding="utf-8"?>
<!-- Reconstructed layout: the local video view shrinks to 180dp to leave room
     for remote views; id names are illustrative -->
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/activity_main_layout"
    android:layout_width="match_parent"
    android:layout_height="match_parent"
    android:background="#FFFFFF"
    android:orientation="vertical">

    <com.mux.sdk.webrtc.spaces.views.TrackRendererSurfaceView
        android:id="@+id/activity_main_local_video"
        android:layout_width="match_parent"
        android:layout_height="180dp" />
</LinearLayout>

We now need to create TrackRendererSurfaceViews and set their tracks as things appear, so let’s override onParticipantTrackSubscribed in our spaceListener.

@Override
public void onParticipantTrackSubscribed(Space space, Participant participant, Track track) {
    // We can only add video type tracks to views or we'll get an IllegalArgumentException
    if(track.trackType == Track.TrackType.Video) {
        TrackRendererSurfaceView trackRendererSurfaceView = new TrackRendererSurfaceView(MainActivity.this);
        // Evil sizing hard coding to keep things on point
        trackRendererSurfaceView.setLayoutParams(new LinearLayout.LayoutParams(320, 240));
        trackRendererSurfaceView.setTrack(track);
        // activity_main_layout is the illustrative id of the root LinearLayout
        ((ViewGroup) findViewById(R.id.activity_main_layout)).addView(trackRendererSurfaceView);
    }
}

At this point, you should be seeing the media of any remote participants who are publishing.

Note about audio: You will receive events when audio track subscriptions start and stop, however, right now all subscribed audio is automatically mixed and played without you needing to do anything.

Remove remote streams as they disconnect

Right now in your app, when remote participants disconnect you will be left with a frozen view. In order to handle that we will add a HashMap class member variable to keep track of the View associated with each Track:

private HashMap<Track, TrackRendererSurfaceView> remoteViews;

Create it in onCreate immediately after setting the content view:

remoteViews = new HashMap<>();

Now when we add views in onParticipantTrackSubscribed we also put them in the map immediately following trackRendererSurfaceView.setTrack(track):

remoteViews.put(track, trackRendererSurfaceView);

And now we can remove the view when the track unsubscribes:

@Override
public void onParticipantTrackUnsubscribed(Space space, Participant participant, Track track) {
    TrackRendererSurfaceView view = remoteViews.get(track);
    if(view != null) {
        // activity_main_layout is the illustrative id of the root LinearLayout
        ((ViewGroup) findViewById(R.id.activity_main_layout)).removeView(view);
        remoteViews.remove(track);
    }
}

5. Publish audio and video

In addition to subscribing to remote participants, we can publish our audio and video to the space.

Before doing this we have to request permission to access the camera and record audio, so we have to follow the patterns described in the Android developer documentation and add them to MainActivity:
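A minimal sketch using the AndroidX Activity Result API; joinSpace() refers to the method this guide builds up, and this is one possible pattern per the Android permissions documentation rather than the only one:

private final ActivityResultLauncher<String[]> permissionsLauncher =
        registerForActivityResult(new ActivityResultContracts.RequestMultiplePermissions(), grants -> {
            // Only join once both permissions have been granted
            if (Boolean.TRUE.equals(grants.get(Manifest.permission.CAMERA))
                    && Boolean.TRUE.equals(grants.get(Manifest.permission.RECORD_AUDIO))) {
                joinSpace();
            }
        });

// Call this instead of joining directly, e.g. from onCreate:
private void requestPermissionsAndJoin() {
    permissionsLauncher.launch(new String[]{
            Manifest.permission.CAMERA,
            Manifest.permission.RECORD_AUDIO
    });
}

You will also need the matching uses-permission entries for CAMERA and RECORD_AUDIO in AndroidManifest.xml.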

The Spaces SDK doesn’t help you request permissions - this is handled by your own application as shown above. If you don’t request them, you will get errors in Space.Listener.onError when the SDK attempts to access the camera or microphone.

The camera and microphone are accessed as Tracks on the LocalParticipant. (The camera can be switched between different cameras on the device by calling setCamera on the camera track).

When the user grants permission, we can access the camera Track on the LocalParticipant after creating a Space but before joining it, for example to show a preview. Just before calling space.join(spaceListener); in joinSpace, add:
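A sketch of wiring the local camera track to the preview view; getLocalParticipant and getCameraTrack are assumed accessor names, and the view id is illustrative - check the SDK Javadoc:

// Accessor names below are assumptions - consult the Javadoc
TrackRendererSurfaceView localView = findViewById(R.id.activity_main_local_video);
localView.setTrack(space.getLocalParticipant().getCameraTrack());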


Then, in onJoined, we can actually publish the LocalParticipant's camera and microphone tracks:

@Override
public void onJoined(Space space, LocalParticipant localParticipant) {
    Toast.makeText(MainActivity.this, "Joined space "+space.getId()+" as "+localParticipant.getId(), Toast.LENGTH_SHORT).show();
    // Publish the LocalParticipant's camera and microphone tracks here;
    // see the SDK Javadoc for the publish methods on LocalParticipant
}

If everything is working, you should now have local and remote video displaying correctly.

6. Other considerations

Handle screen rotation

If you rotate the screen in this demo, you'll notice bad things happen because the Activity is restarted. We need to add a few bits of code to fix this by keeping the Space active during rotation:

In the beginning of the joinSpace method, add:

if(space != null) {
    // We already have a Space (for example after a rotation re-attach), so don't join again
    return;
}

And then override MainActivity.onDestroy:

@Override
protected void onDestroy() {
    if (space != null) {
        space.leave(spaceListener, isChangingConfigurations());
        space = null;
    }

    super.onDestroy();
}
The isChangingConfigurations() call passes a boolean to space.leave. When the configuration is changing due to screen rotation, the Spaces SDK will remain active in the background, and calling space = spaces.getSpace(spaceConfiguration); will re-attach the Space and replay all the events to reconstruct its state. The onJoined event will re-fire. In this way the Activity can get back to the correct state.

Selecting a different camera

Today the Mux Spaces SDK for Android supports only one camera at a time, but you can change which camera is in use whenever the LocalParticipant for a Space is valid.

First you need to pick which camera you want to use, then set that in the SDK. We’ll do this by inserting code in joinSpace, just before the line where we set the camera track on the preview view.

Let’s assume we want to use a rear-facing camera, if one exists. The Spaces SDK API accepts camera names as returned by the libwebrtc Camera2Enumerator. (If your IDE doesn't suggest it automatically, you will need to import org.webrtc.Camera2Enumerator.)

Camera2Enumerator camera2Enumerator = new Camera2Enumerator(this);
String [] cameraNames = camera2Enumerator.getDeviceNames();

Then we need to iterate over them to find a rear facing camera:

String rearFacingCamera = null;
for(String camera: cameraNames) {
    if (rearFacingCamera == null && camera2Enumerator.isBackFacing(camera)) {
        rearFacingCamera = camera;
    }
}

After that it’s a question of setting the camera by calling:

if(rearFacingCamera != null) {
    // setCamera is the method mentioned above; getCameraTrack is an assumed
    // accessor name - check the SDK Javadoc
    space.getLocalParticipant().getCameraTrack().setCamera(rearFacingCamera);
}
You could also display the list of cameras and allow the user to pick one. If you attempt to select a camera that does not exist, you will receive a LocalParticipant.Errors.InvalidCamera error.

Suppress Outgoing Video and Audio

Outgoing audio and video can be disabled separately using the mute() methods on LocalAudioTrack and LocalVideoTrack. These silence outgoing audio and disable local camera output, respectively.

// The track accessor and unMute() names below are assumptions - check the SDK Javadoc
// Mute outgoing audio
localParticipant.getMicrophoneTrack().mute();
// Mute outgoing video
localParticipant.getCameraTrack().mute();
// Un-mute outgoing audio
localParticipant.getMicrophoneTrack().unMute();
// Un-mute outgoing video
localParticipant.getCameraTrack().unMute();
