
Best practices for Real-Time Video in Android applications

Best practices that will help your applications function better and be more maintainable.

1. Permissions handling

Permissions handling is one of the more challenging aspects of integrating real-time video into an Android application. You should only ask for the permissions your application actually needs.

  • If you are only acting as a subscriber, you can skip asking for camera and microphone permissions altogether.
  • If you need access to the camera and microphone, you will need to follow the steps for requesting permissions in the guide.
  • If you need to share the screen, you will need to use the different mechanism detailed in the screen sharing guide.

You can freely combine these approaches in the same Activity.
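
As a rough illustration (nothing here is SDK-specific, and it assumes the AndroidX Activity Result API is available), requesting the camera and microphone permissions from an Activity might look like this:

// Sketch: request camera and microphone permissions with the AndroidX
// Activity Result API. Nothing here comes from the Mux SDK; the callback
// bodies are placeholders for your own logic.
private final ActivityResultLauncher<String[]> mediaPermissionLauncher =
        registerForActivityResult(new ActivityResultContracts.RequestMultiplePermissions(), results -> {
            boolean granted = Boolean.TRUE.equals(results.get(Manifest.permission.CAMERA))
                    && Boolean.TRUE.equals(results.get(Manifest.permission.RECORD_AUDIO));

            if (granted) {
                // Safe to publish camera and microphone tracks
            } else {
                // Continue as a subscriber only, or explain why the permissions are needed
            }
        });

private void requestMediaPermissions() {
    mediaPermissionLauncher.launch(new String[]{
            Manifest.permission.CAMERA,
            Manifest.permission.RECORD_AUDIO
    });
}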

2. Activity lifecycle

Handling an incoming phone call and screen rotation

The SDK will not automatically leave a Space when a phone call occurs. You will need to decide what is right for your application and handle it appropriately.

For most applications the correct pattern is to make joining a Space its own Activity, and to override onStop so that it leaves the Space and finishes the Activity (returning to the previous Activity) when a phone call arrives.

An example onStop would be something like:

@Override
protected void onStop() {
    if (space != null) {
        space.leave(this, isChangingConfigurations());
        space = null;
    }

    if (!isChangingConfigurations()) {
        finish();
    }
    super.onStop();
}

Note that the call to super.onStop is at the end so we do not return control to the operating system until the work is complete. space.leave performs any necessary cleanup before returning.

The example above mentions isChangingConfigurations a couple of times. This is an Activity method that returns whether the current lifecycle event is the result of a configuration change (orientation, keyboard, etc.) or something more fundamental. If it is not a configuration change, it is appropriate for us to terminate.

The boolean passed as the second parameter to space.leave is treated as a hint as to whether the Space should be kept alive. In the event of a configuration change it is assumed that shortly after leaving we will want to rejoin the exact same Space, and we do not want to interrupt the media streams going in and out or disconnect the network. This way we can handle rotations and similar events without dropping access to the actual Space. The Space instance is kept in the Application and will survive interruptions to individual Activity instances.
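
One simple way to hold the Space at Application scope is a field on a custom Application subclass. This is only a sketch with our own names (MyApplication, getActiveSpace, setActiveSpace), not part of the SDK:

// Sketch: hold the Space at Application scope so it outlives individual
// Activity instances across configuration changes
public class MyApplication extends Application {
    private Space activeSpace;

    public Space getActiveSpace() {
        return activeSpace;
    }

    public void setActiveSpace(Space space) {
        activeSpace = space;
    }
}

An Activity re-created after a rotation can then retrieve the Space again with ((MyApplication) getApplication()).getActiveSpace().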

3. Audio options

By default, the Android OS treats the audio output of a Space the same as a phone call, and streams of that type cannot be muted using the volume controls the user is familiar with.

To work around this, we provide an additional hook into the SDK to set global options such as the AudioAttributes to use for the output channel. In our sample video conferencing application this ends up looking like:

try {
    Spaces.SdkOptions.Builder options = new Spaces.SdkOptions.Builder();

    options.setAudioSource(MediaRecorder.AudioSource.UNPROCESSED);
    options.setAttemptDisableLegacyAudioProcessing(true);

    options.setAudioAttributes(new AudioAttributes.Builder()
            .setLegacyStreamType(AudioManager.STREAM_MUSIC)
            .setUsage(AudioAttributes.USAGE_MEDIA)
            .setContentType(AudioAttributes.CONTENT_TYPE_UNKNOWN)
            .build());

    options.setUseHardwareAcousticEchoCanceler(true);
    options.setUseHardwareNoiseSuppressor(true);

    Spaces.setSdkOptions(options.build());
} catch (Exception e) {
    e.printStackTrace();
}

The try/catch is used because not all devices support these options and setting them may fail. In this case we set the stream to play back as if it were music, so the track appears to the system under the media volume controls and can be completely muted by the user, as they would expect.

If you wish for the audio to be treated like a voice call it is best to call setVolumeControlStream in your Activity when joining a Space:

setVolumeControlStream(AudioManager.STREAM_VOICE_CALL);

By doing this, the hardware volume controls will adjust the correct audio stream by default in your Activity; however, the stream cannot be muted.

None of this has mentioned audio focus, which is a whole subject by itself. At this time the SDK makes no attempt to handle audio focus for you; if you need it, you should handle it yourself.
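
If you decide to manage audio focus yourself, a minimal sketch using only the standard AudioManager APIs (API 26+; how you react to focus changes is up to your application) could look like:

AudioManager audioManager = (AudioManager) getSystemService(Context.AUDIO_SERVICE);

AudioFocusRequest focusRequest = new AudioFocusRequest.Builder(AudioManager.AUDIOFOCUS_GAIN)
        .setAudioAttributes(new AudioAttributes.Builder()
                .setUsage(AudioAttributes.USAGE_MEDIA)
                .setContentType(AudioAttributes.CONTENT_TYPE_UNKNOWN)
                .build())
        .setOnAudioFocusChangeListener(focusChange -> {
            if (focusChange == AudioManager.AUDIOFOCUS_LOSS) {
                // For example, mute the local microphone or leave the Space
            }
        })
        .build();

// Request focus when joining the Space...
int result = audioManager.requestAudioFocus(focusRequest);

// ...and abandon it when leaving
audioManager.abandonAudioFocusRequest(focusRequest);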

4. Screen sharing

If you are implementing screen sharing, it is strongly recommended that you work through the screen sharing guide, as it calls out many of the common problems.

Handling navigating away from the application but still sharing the screen

When you start development, you tend to focus on whether you can join a Space and share video in it. This can lead to a situation where screen sharing appears to work, but then breaks once you implement leaving (for example in onPause) and find that this prevents you from screen sharing while other applications are in the foreground.

We have found that the best solution is the onStop implementation above but extended with an awareness of screen sharing:

@Override
protected void onStop() {
    if (space != null && space.getLocalParticipant().getScreenCaptureTrack().isPublished()) {
        // Don't do anything!
        // We want to keep the app running with the space connected when in the background
    } else {
        if (space != null) {
            space.leave(spaceListener, isChangingConfigurations());
            space = null;
        }

        if (!isChangingConfigurations()) {
            finish();
        }
    }

    super.onStop();
}

Handling returning to the application while sharing the screen from the notification

The recommendation is to use the "singleTop" launchMode for the Activity (specified in the manifest), and then create a PendingIntent that returns to the Activity from the Notification, as described in the screen sharing guide.

AndroidManifest.xml:

...
<activity
    android:launchMode="singleTop"
    android:name=".MainActivity"
    android:exported="true">
    <intent-filter>
        <action android:name="android.intent.action.MAIN" />
        <category android:name="android.intent.category.LAUNCHER" />
    </intent-filter>
</activity>
...

Where you build your notification:

// We can customize the notification the SDK displays while screen sharing
// This serves two important purposes:
// 1. It refers to this specific application as doing the screen sharing
// 2. We create a PendingIntent which returns to this Activity instance when selecting the notification so the user can get
// back to a screen where they can stop the screen sharing
NotificationManager nm = (NotificationManager) getSystemService(Context.NOTIFICATION_SERVICE);
Notification.Builder builder = null;

if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.O) {
    NotificationChannel channel = new NotificationChannel(
            "screen_capture_channel_id",
            "Screen Capture",
            NotificationManager.IMPORTANCE_LOW
    );

    nm.createNotificationChannel(channel);
    builder = new Notification.Builder(this, channel.getId());
} else {
    builder = new Notification.Builder(this);
}

// For this to work the launchMode of MainActivity must be set to "singleTop" in AndroidManifest.xml
// This ensures Activity instances are unique and so the notification won't do things like launch a new activity
// If you have more complex use cases which require things like preserving a large back stack you will need to use one of the
// normal approaches for doing that, but it will be highly specific to your application.
Intent intent = new Intent(this, MainActivity.class);
PendingIntent pendingIntent = PendingIntent.getActivity(this, 1, intent, PendingIntent.FLAG_MUTABLE);

// setSmallIcon is necessary or the operating system will silently disregard the notification, and with it the whole screen sharing request
// setOngoing ensures the user can't remove the notification while screen sharing
builder.setPriority(Notification.PRIORITY_DEFAULT)
        .setContentIntent(pendingIntent)
        .setSmallIcon(R.mipmap.ic_launcher)
        .setOngoing(true)
        .setContentTitle("Mux Spaces SDK Screen sharing example")
        .setContentText("Your screen is being shared");

Notification notification = builder.build();

spaces = Spaces.getInstance(this);
spaces.setScreenShareNotification(notification);

5. Using the Mux Spaces SDK data model

Use the Parcelable SpaceConfiguration

SpaceConfiguration is Parcelable, which provides a convenient way to configure a Space in one Activity and then launch it in another, especially when you want to retrieve a JWT from the server and pass it on for use. For example, a LoginActivity could include code like this on completion of the request:

SpaceConfiguration spaceConfiguration = null;

try {
    spaceConfiguration = SpaceConfiguration.newBuilder()
        // Build my configuration with my downloaded JWT
        .build();
} catch (Exception e) {
    // Handle the failure
}

Intent intent = new Intent();
intent.setClass(LoginActivity.this, MeetActivity.class);
intent.addFlags(Intent.FLAG_ACTIVITY_NEW_TASK);
intent.putExtra(MeetActivity.EXTRA_SPACE_CONFIGURATION, spaceConfiguration);

startActivity(intent);

Then the corresponding MeetActivity could handle it:

public class MeetActivity extends Activity {
    public static final String EXTRA_SPACE_CONFIGURATION = "EXTRA_SPACE_CONFIGURATION";

    SpaceConfiguration spaceConfiguration;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        ...
        spaceConfiguration = getIntent().getParcelableExtra(EXTRA_SPACE_CONFIGURATION);
        ...
    }

    ...
}

Rather than trying to join things up by ID, let the SDK do it

The SDK goes to surprising lengths to present a consistent view of the world to the application main thread, while actually being multithreaded and consequently slightly out of sync with it.

A Space is unique per SpaceConfiguration: if the SpaceConfiguration is the same, the Space instance is the same; if the SpaceConfiguration is different, so is the Space.

Within a Space, all Participants and Tracks are unique and comparable; that is, you can safely compare them as references with == and get the expected results.

You will always get the same LocalParticipant instance throughout the lifecycle of a Space. The ID field of the LocalParticipant may not be set until the application has joined the Space. For this reason it is strongly recommended that if you need to keep a mapping of participants, you use the actual Participant class as the key type and not any identifier, for example HashMap<Participant, String> ourLocalNamesForPeople;. That way you avoid the difficulties that arise when an ID is missing or changes.
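
For example, a minimal sketch of such a mapping (the callback method names here are illustrative, not necessarily the exact SDK listener methods):

// Keep per-participant UI state keyed by the Participant object itself,
// never by its ID
private final HashMap<Participant, String> ourLocalNamesForPeople = new HashMap<>();

void onParticipantJoined(Participant participant) {
    ourLocalNamesForPeople.put(participant, "Guest " + ourLocalNamesForPeople.size());
}

void onParticipantLeft(Participant participant) {
    ourLocalNamesForPeople.remove(participant);
}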

RemoteParticipants are always unique per ID and, unlike a LocalParticipant's ID, their ID does not change. You will always receive the same RemoteParticipant object for a given ID.

You can safely access a HashMap of all Participants, mapped by their ID, from the Space with space.getParticipants(). Note that this contains all the Participants that have been active in the Space since you joined, including those that have since left.

Tracks are similar: the same Track object is returned throughout the Space lifecycle for a given track ID. People familiar with WebRTC internals may wonder about that. In fact, Tracks wrap the MediaStreamTrack used internally and will create new MediaStreamTracks on demand when necessary, without the SDK consumer needing to worry about it. This is one of the major conveniences the SDK provides.

6. Rate limits

Both custom event publishing and display name updates are currently rate limited, in the SDK and at the server. If you exceed the rate limits, the operation will fail and you will receive an error. It is best to avoid this scenario altogether and ensure you do not trigger the rate limits.

There is an old-school Android idiom that suits this sort of problem well: Handlers and Messages. We can use a Handler on our main thread to update the UI so that rate-limited operations cannot be triggered by the user. This is comparable to how a JavaScript developer might use setTimeout.

We need a few Activity level items defined:

// Value to control rate limit of events
// Decrement every time we send a customEvent
// Increment a second after sending an event
// Start at 1 (on join)
private int eventRateLimiter;

private static final int MSG_EVENT_RATE_LIMITER_INCREMENT = 1;
private Handler handler;

Then in our onCreate we create a Handler on the main thread and provide a Handler.Callback implementation for our Message type:

handler = new Handler(Looper.getMainLooper(), new Handler.Callback() {
    @Override
    public boolean handleMessage(@NonNull Message message) {
        switch (message.what) {
            case MSG_EVENT_RATE_LIMITER_INCREMENT:
                eventRateLimiter++;
                updateUI();
                return true;
        }
        return false;
    }
});

In our onJoined we set the initial state:

eventRateLimiter = 1;

Our sending of the custom event (from a UI event listener) looks like:

if (joined && eventRateLimiter > 0 && !text.getText().toString().trim().equals("")) {
    ...
    space.getLocalParticipant().publishCustomEvent(event.toString());

    eventRateLimiter--;
    handler.sendEmptyMessageDelayed(MSG_EVENT_RATE_LIMITER_INCREMENT, 1000);
    updateUI();
    ...
}

We call this one method every time we want to send a custom event, so every send passes through the same gate. We use sendEmptyMessageDelayed because it creates less garbage than creating and posting Runnables.

And finally the updateUI that is mentioned everywhere looks like this:

if (joined && eventRateLimiter > 0 && !text.getText().toString().trim().equals("")) {
    chatSendButton.setAlpha(1.0f);
} else {
    chatSendButton.setAlpha(0.3f);
}

This is for chat in our video conferencing application. It fades out the button to indicate to the user that the send operation is not available, either because we are not in the Space, the rate limiter is in effect, or they have not entered any text to send (we do not send empty messages).

Since all of these operations occur on the UI thread we do not have to worry about concurrent access or synchronization. We're also not doing very much at all so it's not going to impact performance.

7. Error handling

The SDK behaves almost entirely in a non-blocking manner from the point of view of the application developer. (The big exception to this rule is Space.leave: if you call it from the UI thread, it will block until any resources the operation can release have been released.) Operations are essentially queued up and handled as quickly as possible. One side effect of this is that the typical Java Exception model is inappropriate. Instead you will receive errors via callbacks, unsurprisingly most of the time via Space.Listener.onError.

Errors are subclasses of MuxError and have enums which indicate what they are. For example, to detect and handle the case where the user refuses screen sharing permission:

if (error.getValue() == LocalParticipant.Errors.ScreensharePermissionDenied) {
    // Screen sharing was denied
}

The error codes are defined on the classes that can cause them.

Some errors are fatal. You can find out whether the error you received is fatal with error.isFatal(). If an error is fatal, the SDK will leave the Space after delivering the error, as such errors are not recoverable.
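
Putting this together, the error handling in a listener might look roughly like the following sketch (the surrounding Space.Listener boilerplate is omitted; only getValue() and isFatal() are taken from above):

// Sketch of handling errors delivered through Space.Listener.onError
void handleError(MuxError error) {
    if (error.getValue() == LocalParticipant.Errors.ScreensharePermissionDenied) {
        // The user refused screen sharing permission; update the UI accordingly
        return;
    }

    if (error.isFatal()) {
        // The SDK will leave the Space after a fatal error, so clean up our own
        // state and return the user to the previous screen
        space = null;
        finish();
    }
}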

8. Suggestions

For any questions or suggestions, email us at real-time-video@mux.com. Thanks and happy coding!
