ReplayKit Implementation Notes - RareSloth Games


I spent two days digging into ReplayKit’s simple API and discovered undocumented subtleties that forced me to write more code than expected. There is very little documentation on ReplayKit aside from the WWDC video’s example code, so good information is hard to find. This is a guide to what to look out for when doing your own ReplayKit implementation.

I’m on Xcode 8.0 and my devices are an iPad Mini 2 and an iPhone SE, both running iOS 10.0.2. We support iOS 8.0 and up in King Rabbit, so we had to do iOS version checks to make sure none of this code was called if ReplayKit (iOS 9.0+) wasn’t available. This means weak linking (marking it Optional) the ReplayKit framework in the project. We tested some of these issues on other App Store games and they seemed to have similar failures/stuck points.
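With the framework weak-linked, the availability check can be a simple class check. A minimal sketch (the helper function name is ours; `isAvailable` is the real RPScreenRecorder property):

```objc
#import <ReplayKit/ReplayKit.h>

// With ReplayKit weak-linked, [RPScreenRecorder class] resolves to nil on
// iOS 8, so a class check guards all ReplayKit calls. isAvailable also
// catches devices/states where recording is unsupported.
static BOOL ScreenRecordingAvailable(void) {
    return [RPScreenRecorder class] != nil &&
           [RPScreenRecorder sharedRecorder].isAvailable;
}
```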

Edit: Here’s a gist of how we ask for permissions, start, and stop recording/broadcasting in King Rabbit. However, iOS 11 appears to have caused some flakiness/bugs in the way we record. iOS 11 also allows recording/broadcasting from the Control Center instead of doing it through apps.

Edit 2: Here’s a Swift example written by victorchee that may be helpful:


If you’re looking to record video on the device, you’ll use the instance of [RPScreenRecorder sharedRecorder] to call either startRecordingWithMicrophoneEnabled:handler: (deprecated) or startRecordingWithHandler: (iOS 10+).


If you want to support iOS 9.0, you’ll have to use this deprecated method to start your recording.
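A sketch of how that version split might look (the method names are the real ReplayKit API; the `respondsToSelector:` structure is our choice):

```objc
RPScreenRecorder *recorder = [RPScreenRecorder sharedRecorder];
if ([recorder respondsToSelector:@selector(startRecordingWithHandler:)]) {
    // iOS 10+
    [recorder startRecordingWithHandler:^(NSError * _Nullable error) {
        // Handle error / update UI
    }];
} else {
    // iOS 9.x — deprecated in iOS 10, but required to support 9.0
    [recorder startRecordingWithMicrophoneEnabled:YES
                                          handler:^(NSError * _Nullable error) {
        // Handle error / update UI
    }];
}
```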

Microphone recording

If you deny access for microphone recording (choosing ‘Record screen only’) in the permissions alert view when starting a recording, you can’t turn on the microphone until starting a new recording. Just keep this in mind for your UI, since your “mic” button will be useless.


If you want the front facing camera preview as most streamers have in their broadcasts, you’ll want to look at the isCameraEnabled and isMicrophoneEnabled flags on RPScreenRecorder. I found that I had to set these flags to YES on the [RPScreenRecorder sharedRecorder] instance before calling startRecordingWithHandler:.

Then I found that you have to request video permissions before the flags you just set are honored. Otherwise, the flag will be unset when the handler argument is called. Also, if you try asking for permissions after you’ve already started recording and you try to setCameraEnabled to YES when granted, it’ll stop the recording without letting you know!

Do something like this:

[AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
    dispatch_async(dispatch_get_main_queue(), ^{
        if (granted) {
            [[RPScreenRecorder sharedRecorder] setCameraEnabled:YES];
            [[RPScreenRecorder sharedRecorder] setMicrophoneEnabled:YES];
            [[RPScreenRecorder sharedRecorder] startRecordingWithHandler:^(NSError * _Nullable error) {
                // Recording started; check error before updating UI
            }];
        }
    });
}];

Remove CFBundleDevelopmentRegion

From this simple use, I found that the handler wasn’t getting called on my iOS 10 devices (apparently it works fine on iOS 9). I found a tip on the Apple developer forum where someone removed the CFBundleDevelopmentRegion key from their Info.plist, so I tried the same thing. And it worked! I don’t know how that could possibly tie in with ReplayKit, but there’s a bug in there somewhere.

Broadcasting (iOS 10+)

Broadcasting allows you to stream your recording directly to a third party service that you have installed on your device. I used Mobcrush and Periscope for testing.

To start a broadcast, you call:

[RPBroadcastActivityViewController loadBroadcastActivityViewControllerWithHandler:^(RPBroadcastActivityViewController * _Nullable broadcastActivityViewController, NSError * _Nullable error) {
    // Present broadcastActivityViewController here
}];

Presenting the broadcastActivityViewController allows the player to choose which service they want to stream with. An important thing to note: when presenting this view controller on an iPad, you have to present it in a popover, otherwise it just won’t do anything. There’s no documentation and no obvious logs that let you know this! You’ll need something like:

broadcastActivityViewController.modalPresentationStyle = UIModalPresentationPopover;
broadcastActivityViewController.popoverPresentationController.sourceView = myView;
broadcastActivityViewController.popoverPresentationController.sourceRect = myView.bounds;
broadcastActivityViewController.delegate = self;
[myViewController presentViewController:broadcastActivityViewController animated:YES completion:nil];

The delegate method

- (void)broadcastActivityViewController:(RPBroadcastActivityViewController *)broadcastActivityViewController didFinishWithBroadcastController:(RPBroadcastController *)broadcastController error:(NSError *)error

gives you an RPBroadcastController instance, which you call startBroadcastWithHandler: on to start up the broadcast.
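Put together, the delegate method might look something like this (the error handling and the strong `broadcastController` property are our additions):

```objc
- (void)broadcastActivityViewController:(RPBroadcastActivityViewController *)broadcastActivityViewController
       didFinishWithBroadcastController:(RPBroadcastController *)broadcastController
                                  error:(NSError *)error {
    [broadcastActivityViewController dismissViewControllerAnimated:YES completion:nil];
    if (error || !broadcastController) {
        // The player cancelled or setup failed
        return;
    }
    self.broadcastController = broadcastController; // keep a strong reference
    [broadcastController startBroadcastWithHandler:^(NSError * _Nullable startError) {
        if (!startError) {
            // Broadcast is live
        }
    }];
}
```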

Broadcasting simply did not work on my iPad Mini 2 until I used the CFBundleDevelopmentRegion hack from above. However, even with that, the broadcast had unusable quality. We’re limiting our broadcasting to newer devices to try to mitigate these failures. When it fails to broadcast in this manner, it doesn’t call the handler with an error; it simply never calls it, so you’ll be stuck on a loading screen if you have one.
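Since the handler can silently never fire, a timeout guard keeps the player from being stuck forever. A sketch (the 15-second timeout and the dismissal logic are our choices, not anything ReplayKit specifies):

```objc
__block BOOL handlerCalled = NO;
[self.broadcastController startBroadcastWithHandler:^(NSError * _Nullable error) {
    handlerCalled = YES;
    // Dismiss loading UI, handle error
}];
// Bail out if ReplayKit never calls the handler
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(15 * NSEC_PER_SEC)),
               dispatch_get_main_queue(), ^{
    if (!handlerCalled) {
        // Assume the broadcast silently failed; dismiss the loading screen
    }
});
```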

If you want to enable the microphone and/or camera during a broadcast, you’ll have to figure it out. It doesn’t seem like RPScreenRecorder is meant to be used while you’re broadcasting, but it might still provide the cameraPreviewView for you. You’ll also have to ask for mic/camera permissions yourself, since initiating a broadcast doesn’t ask by default.
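Requesting both permissions up front before starting the broadcast might look like this (the nesting and ordering are our choice; only the AVCaptureDevice and RPScreenRecorder calls are the real APIs):

```objc
// Ask for microphone, then camera, before kicking off the broadcast flow
[AVCaptureDevice requestAccessForMediaType:AVMediaTypeAudio
                         completionHandler:^(BOOL audioGranted) {
    [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                             completionHandler:^(BOOL videoGranted) {
        dispatch_async(dispatch_get_main_queue(), ^{
            RPScreenRecorder *recorder = [RPScreenRecorder sharedRecorder];
            recorder.microphoneEnabled = audioGranted;
            recorder.cameraEnabled = videoGranted;
            // Now load and present the broadcast activity view controller
        });
    }];
}];
```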

Some other notes

We tried starting a recording after finishing a broadcast, but found that the handler wouldn’t be called on the startRecording methods. The same issue occurs in the opposite direction: finishing a recording and trying to start a broadcast. Just keep this in mind if you want to allow players to do either. We’re simply going to hide the other option for the lifetime of the running app so players don’t get stuck. Apple has apparently fixed this bug, but I haven’t had a chance to test it.

If you’re a third party streaming service that wants to implement the broadcasting extension, good luck. There are some classes in the ReplayKit documentation, but you’ll find no written documentation on what they do.

There might be some other gotchas I’ve forgotten, but hopefully this guide will help get you started more quickly than we did!

16 Comments on “ReplayKit Implementation Notes”

  1. Thanks for writing down your experience with ReplayKit. The documentation is pretty sparse, so I’ll take anything I can get 🙂

    • I’m glad the information was useful! Apple fixed one of the bugs I reported so I’ll have to update this soon.


  2. Those iPad details helped, thanks! btw, for those setting the sourceView to their main screen view like I did (we’re not generally using UIKit in our app), use view.bounds instead, otherwise you won’t see it.

  3. Your AVCaptureDevice code came in handy as well. However,
    [[RPScreenRecorder sharedRecorder] cameraPreviewView]
    is nil for me inside startRecordingWithHandler{}. The first time I call it, anyways. If I call it again (toggle recording off/on again) soon after, cameraPreviewView is valid and I can display it. Although even then the view only seems to update for a split second and then my camera image is frozen. Any ideas? Thanks!

    • Sorry for the way late response, but I was getting the cameraPreviewView after observing a notification posted in the handler block and never had issues with it being nil. Make sure to check for an error in the handler as well.

  4. Hey, just curious! I know the post is from a while ago, but is it possible to use the initial camera permissions given by the user so ReplayKit doesn’t ask for permission again?

    • I don’t believe it is possible. Plus, if the user wanted to change how they were recording (like adding the mic) then you’d have to do extra work to make that happen. Asking for permissions every time isn’t a bad idea.

  5. hey Austin,
    i am trying to use broadcast function.
    i have created a broadcastActivityViewController controller and added an broadsetupUI extension.

    - (IBAction)onLiveButtonPressed:(id)sender {
        [RPBroadcastActivityViewController loadBroadcastActivityViewControllerWithHandler:^(RPBroadcastActivityViewController * _Nullable broadcastActivityViewController, NSError * _Nullable error) {
            broadcastActivityViewController.modalPresentationStyle = UIModalPresentationPopover;
            broadcastActivityViewController.popoverPresentationController.sourceView = self.view;
            broadcastActivityViewController.popoverPresentationController.sourceRect = self.view.frame;
            broadcastActivityViewController.delegate = self;
            [self presentViewController:broadcastActivityViewController animated:YES completion:nil];
        }];
    }

    #pragma mark - RPBroadcastActivityViewControllerDelegate

    - (void)broadcastActivityViewController:(RPBroadcastActivityViewController *)broadcastActivityViewController didFinishWithBroadcastController:(nullable RPBroadcastController *)broadcastController error:(nullable NSError *)error {
        [broadcastActivityViewController dismissViewControllerAnimated:YES completion:nil];
        self.broadcastController = broadcastController;
        [broadcastController startBroadcastWithHandler:^(NSError * _Nullable error1) {
            if (!error1) {
                self.liveButton.selected = YES;
            } else {
                NSLog(@"startBroadcastWithHandler error: %@", error1);
            }
        }];
    }

    I am getting the following error:
    Error Code=-5801 "The user declined application recording" UserInfo={NSLocalizedDescription=The user declined application recording}

    Can you please help me understand why this error occurs?

  6. Thanks for posting this! Did you come up with a strategy for handling the microphone permission and the UI? For example, we have a mic on/off button, and if the user says “record screen only”, denying the mic permission, I can’t see any way to find this out. Asking the RPScreenRecorder via isMicrophoneEnabled still returns whatever value I passed to it. I wish the permission were handled like other parts of iOS with callbacks/handlers so we could make decent UI.

    • I think the assumption I used was if the camera was available, that means the microphone is available. So if the user selects “record screen only”, the camera won’t be available, therefore the microphone isn’t available. I just posted the code I used for recording/broadcasting as a gist, check it out and maybe you’ll find something in there that’s helpful:

    • Thanks for the resource, I’ll add a note in the post about it
