
    Code Snippets using the Native SDK for iOS

    In this topic, you will find a collection of code snippets which you may find useful as a reference while developing with the SDK. For more detailed solutions, see the iOS player samples.

    Table of contents

    Advertising

    Analytics

    Captions

    Content security (DRM)

    Cue points

    Playback

    Playlists

    Styling

    Custom controls for AirPlay

    If you use the Brightcove PlayerUI controls, you are all set. AirPlay functionality works out-of-the-box, allowing users to stream video to high-definition displays with Apple TV.

    If you are using custom controls, you can follow these steps:

    1. Learn about AirPlay in Apple's developer documentation.

    2. Use the playback controller’s allowsExternalPlayback property to set the AVPlayer’s allowsExternalPlayback property:

          self.playbackController.allowsExternalPlayback = YES;
          playbackController.allowsExternalPlayback = true
    3. Set up an AirPlay router control and handle its selection. For details, see Apple's AirPlay Overview document.
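    For step 3, Apple's AVKit framework provides AVRoutePickerView, which displays the system AirPlay route menu and handles route selection for you. Here is a minimal sketch of adding one to a custom controls view; the view class and layout constants are illustrative, not part of the Brightcove SDK:

```swift
import AVKit
import UIKit

final class CustomControlsView: UIView {

    // Adds an AirPlay route picker button to this custom controls view.
    // Tapping the button presents the system route-selection menu.
    func addAirPlayButton() {
        let routePicker = AVRoutePickerView()
        routePicker.tintColor = .white
        routePicker.activeTintColor = .systemBlue // highlighted while routing

        routePicker.translatesAutoresizingMaskIntoConstraints = false
        addSubview(routePicker)
        NSLayoutConstraint.activate([
            routePicker.trailingAnchor.constraint(equalTo: trailingAnchor, constant: -16),
            routePicker.centerYAnchor.constraint(equalTo: centerYAnchor)
        ])
    }
}
```

    Because AVRoutePickerView drives the system menu itself, no additional selection handling is required beyond setting allowsExternalPlayback as shown above.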

    Reference

    For details, see the BCOVPlaybackController documentation.

    Customizing the closed captions button

    The BCOVPlayerUI sample code shows you how to customize the Brightcove player when using the Native SDK for iOS. For more details, see the Customizing PlayerUI controls section of the Native SDK reference document.

    To customize the closed captions button using the PlayerUI, follow these steps:

    1. Start with the Basic Video Playback App.

    2. Replace the Policy Key, Account Id, and Video Id values with your own. Select a video from your account that has text tracks.

    3. Set up the player view with a standard VOD layout:

      // Set up our player view. Create with a standard VOD layout.
      BCOVPUIPlayerView *playerView = [[BCOVPUIPlayerView alloc] 
      initWithPlaybackController:self.playbackController options:nil controlsView:
      [BCOVPUIBasicControlView basicControlViewWithVODLayout] ];
      // Set up our player view. Create with a standard VOD layout.
      guard let playerView = BCOVPUIPlayerView(playbackController: 
      self.playbackController, options: nil, controlsView: 
      BCOVPUIBasicControlView.withVODLayout()) else {    
        return
      }
    4. The closedCaptionButton is declared as a BCOVPUIButton, which is a subclass of UIButton and adds three additional methods for customization. Whenever you customize BCOVPlayerUI controls, you should use the Native Player APIs wherever they are available. Your customized code should look similar to this:

      // Customize the CC button.
      BCOVPUIButton *ccButton = playerView.controlsView.closedCaptionButton;
      ccButton.titleLabel.font = [UIFont systemFontOfSize:14.];
      ccButton.primaryTitle = @"CC";
      [ccButton showPrimaryTitle:YES];
      if let ccButton = playerView.controlsView.closedCaptionButton {    
        ccButton.titleLabel?.font = UIFont.systemFont(ofSize: 14)    
        ccButton.primaryTitle = "CC"    
        ccButton.showPrimaryTitle(true)
      }

    Displaying FairPlay content on an external screen

    When an external display is connected to an iOS device using an AV adapter and HDMI cable, the default behavior is to mirror the iOS screen. The exception to this is when you use FairPlay-protected video, which Apple prevents from mirroring (WWDC 2015, Session 502).

    To display FairPlay-protected videos, set the AVPlayer properties exposed through the Brightcove Playback Controller to allow FairPlay video to play on an external display. The video plays in full screen mode. Here is an example of setting these properties:

    playbackController.allowsExternalPlayback = YES;
    playbackController.usesExternalPlaybackWhileExternalScreenIsActive = YES;
    playbackController.allowsExternalPlayback = true
    playbackController.usesExternalPlaybackWhileExternalScreenIsActive = true

    Reference

    For details, see the BCOVPlaybackController documentation.

    Google Analytics

    If you use the Brightcove player and the catalog class, video analytics will be automatically collected and will appear in your Video Cloud Analytics module. For additional metrics, you can add Google Analytics to your app.

    To integrate Google Analytics with your app, follow these steps:

    1. Review Google's Add Analytics to Your iOS App document.
    2. Track video playback with the Firebase SDK from Google Analytics. Here is one way to do it:

      // This snippet shows one way to track video playback
      // using the Firebase SDK from Google Analytics with
      // the lifecycle event playback controller delegate method.
      - (void)playbackController:(id<BCOVPlaybackController>)controller playbackSession:(id<BCOVPlaybackSession>)session didReceiveLifecycleEvent:(BCOVPlaybackSessionLifecycleEvent *)lifecycleEvent
      {
      	// Common parameters
      	NSString *video_name = session.video.properties[kBCOVVideoPropertyKeyName];
      	NSString *video_ID = session.video.properties[kBCOVVideoPropertyKeyId];
      
      	// Session is ready to play
      	if ([kBCOVPlaybackSessionLifecycleEventReady isEqualToString:lifecycleEvent.eventType])
      	{
      		[FIRAnalytics logEventWithName:@"bcov_video_ready"
      	    parameters:@{
      	        @"bcov_video_name": video_name,
      	        @"bcov_video_id": video_ID
      	    }];
      	}
      
      	// Session encountered an error
      	if ([kBCOVPlaybackSessionLifecycleEventError isEqualToString:lifecycleEvent.eventType])
      	{
      		NSError *error = lifecycleEvent.properties[kBCOVPlaybackSessionEventKeyError];
      	NSInteger error_code = error.code;
      		[FIRAnalytics logEventWithName:@"bcov_video_playback_error"
      	    parameters:@{
      	        @"bcov_video_name": video_name,
      	        @"bcov_video_id": video_ID,
      	        @"bcov_video_error_code": @(error_code)
      	    }];
      	}
      
      	// Session has completed
      	if ([kBCOVPlaybackSessionLifecycleEventTerminate isEqualToString:lifecycleEvent.eventType])
      	{
      		[FIRAnalytics logEventWithName:@"bcov_video_terminate"
      	    parameters:@{
      	        @"bcov_video_name": video_name,
      	        @"bcov_video_id": video_ID
      	     }];
      	}
      }
      // This snippet shows one way to track video playback
      // using the Firebase SDK from Google Analytics with
      // the lifecycle event playback controller delegate method.
      func playbackController(_ controller: BCOVPlaybackController!, playbackSession session: BCOVPlaybackSession!, didReceive lifecycleEvent: BCOVPlaybackSessionLifecycleEvent!) {
              
          // Common parameters
          let video_name = session.video.properties[kBCOVVideoPropertyKeyName]
          let video_id = session.video.properties[kBCOVVideoPropertyKeyId]
          
          // Session is ready to play
          if (lifecycleEvent.eventType == kBCOVPlaybackSessionLifecycleEventReady) {
              
              if let video_name = video_name as? String, let video_id = video_id as? String {
                  Analytics.logEvent("bcov_video_ready", parameters: [
                      "bcov_video_name" : video_name,
                      "bcov_video_id" : video_id
                  ])
              }
              
          }
          
          // Session encountered an error
          if (lifecycleEvent.eventType == kBCOVPlaybackSessionLifecycleEventError) {
           
              if let error = lifecycleEvent.properties[kBCOVPlaybackSessionEventKeyError] as? NSError {
                  let error_code = error.code
                  
                  if let video_name = video_name as? String, let video_id = video_id as? String {
                      Analytics.logEvent("bcov_video_playback_error", parameters: [
                          "bcov_video_name" : video_name,
                          "bcov_video_id" : video_id,
                          "bcov_video_error_code" : error_code
                      ])
                  }
              }
              
          }
          
          // Session has completed
          if (lifecycleEvent.eventType == kBCOVPlaybackSessionLifecycleEventTerminate) {
              
              if let video_name = video_name as? String, let video_id = video_id as? String {
                  Analytics.logEvent("bcov_video_terminate", parameters: [
                      "bcov_video_name" : video_name,
                      "bcov_video_id" : video_id
                  ])
              }
              
          }
      
      }

     

    Limiting the bitrate

    You can't control which source (rendition) in the HLS manifest the AVPlayer selects, but you can put a bitrate cap on playback. This prevents the player from using sources (renditions) with a bitrate above the specified value.

    Set the preferredPeakBitRate to the desired limit, in bits per second, of the network bandwidth consumption for the given AVPlayerItem.

    Use one of the following declarations:

    [self.playbackController setPreferredPeakBitRate:1000];
    playbackController.setPreferredPeakBitRate(1000)

    Looping a video

    In some cases, you may want a video to replay automatically. To do this, listen for the "end of video" lifecycle event, seek to the beginning, and play again.

    This code assumes that you have set the delegate of the playbackController to the object with this method:

    - (void)playbackController:(id<BCOVPlaybackController>)controller playbackSession:(id<BCOVPlaybackSession>)session didReceiveLifecycleEvent:(BCOVPlaybackSessionLifecycleEvent *)lifecycleEvent
    {
    	if ([kBCOVPlaybackSessionLifecycleEventEnd isEqualToString:lifecycleEvent.eventType])
    	{
    		[controller seekToTime:kCMTimeZero completionHandler:^(BOOL finished) {
    			if (finished)
    			{
    				[controller play];
    			}
    		}];
    	}
    }
    func playbackController(_ controller: BCOVPlaybackController!, playbackSession session: BCOVPlaybackSession!, didReceive lifecycleEvent: BCOVPlaybackSessionLifecycleEvent!) {
            
        if (lifecycleEvent.eventType == kBCOVPlaybackSessionLifecycleEventEnd) {
            controller.seek(to: CMTime.zero) { (finished: Bool) in
                if (finished) {
                    controller.play()
                }
            }
        }
        
    }

    Managing videos in a playlist

    One way to manage a playlist of videos is to store the video objects in a table. When the user selects a video from the table, the table row will contain the video object.

    Here's an overview of how it works:

    1. Retrieve a playlist from your account.

      - (void)retrievePlaylist
      {
      	[self.refreshControl beginRefreshing];
      
      	// Retrieve a playlist through the BCOVPlaybackService
      	BCOVPlaybackServiceRequestFactory *playbackServiceRequestFactory = [[BCOVPlaybackServiceRequestFactory alloc]         
        initWithAccountId:kDynamicDeliveryAccountID
      	policyKey:kDynamicDeliveryPolicyKey];
      	BCOVPlaybackService *playbackService = [[BCOVPlaybackService alloc] initWithRequestFactory:playbackServiceRequestFactory];
      
      	[playbackService findPlaylistWithReferenceID:kDynamicDeliveryPlaylistRefID parameters:nil 
        completion:^(BCOVPlaylist *playlist, NSDictionary *jsonResponse, NSError *error) 
      	{
      		[self.refreshControl endRefreshing];
      
      		NSLog(@"JSON Response:\n%@", jsonResponse);
      
      		if (playlist)
      		{
      			self.currentVideos = playlist.videos.mutableCopy;
      			self.currentPlaylistTitle = playlist.properties[@"name"];
      			self.currentPlaylistDescription = playlist.properties[@"description"];
      
      			NSLog(@"Retrieved playlist containing %d videos", (int)self.currentVideos.count);
      
      			[self usePlaylist:self.currentVideos];
      		}
      		else
      		{
      			NSLog(@"No playlist for ID %@ was found.", kDynamicDeliveryPlaylistRefID);
      		}
      
      	}];
      }
      func retrievePlaylist() {
          refreshControl.beginRefreshing()
          
          let playbackServiceRequestFactory = BCOVPlaybackServiceRequestFactory(accountId: kDynamicDeliveryAccountID, policyKey: kDynamicDeliveryPolicyKey)
          let playbackService = BCOVPlaybackService(requestFactory: playbackServiceRequestFactory)
          
          playbackService?.findPlaylist(withReferenceID: kDynamicDeliveryPlaylistRefID, parameters: nil, completion: { [weak self] (playlist: BCOVPlaylist?, jsonResponse: [AnyHashable:Any]?, error: Error?) in
              
              guard let strongSelf = self else {
                  return
              }
              
              strongSelf.refreshControl.endRefreshing()
              
              if let playlist = playlist {
                  strongSelf.currentVideos = playlist.videos
                  strongSelf.currentPlaylistTitle = playlist.properties["name"] as? String ?? ""
                  strongSelf.currentPlaylistDescription = playlist.properties["description"] as? String ?? ""
                  
                  print("Retrieved playlist containing \(playlist.videos.count) videos")
                  
                  strongSelf.usePlaylist(playlist)
              } else {
                  print("No playlist for ID \(kDynamicDeliveryPlaylistRefID) was found.")
              }
              
          })
      }
    2. Re-initialize the containers that store information related to the videos in the current playlist.
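      This step depends on how your app models the playlist. As a minimal sketch, assuming the view controller keeps the videosTableViewData array used in step 3 below (the property and method names here are illustrative):

```swift
// Rebuild the table's backing store from the newly retrieved playlist.
// Each row holds a dictionary so that extra per-video data (thumbnails,
// download state, and so on) can be stored alongside the video object.
func usePlaylist(_ videos: [BCOVVideo]) {
    videosTableViewData = videos.map { ["video": $0] }
    videosTableView.reloadData()
}
```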

    3. When a table row is selected, the row's index is used to look up the videoDictionary. Next, ask the dictionary for the video. If the video is not nil, load it into the playbackController.

      // Play the video in this row when selected
      - (void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(nonnull NSIndexPath *)indexPath
      {
      	NSDictionary *videoDictionary = self.videosTableViewData[ (int)indexPath.row ];
      	BCOVVideo *video = videoDictionary[@"video"];
      
      	if (video != nil)
      	{
      		[self.playbackController setVideos:@[ video ]];
      	}
      }
      func tableView(_ tableView: UITableView, didSelectRowAt indexPath: IndexPath) {
              
          guard let videosTableViewData = videosTableViewData, 
          let videoDictionary = videosTableViewData[indexPath.row] as? 
          [AnyHashable:Any], let video = videoDictionary["video"] as? BCOVVideo else {
              return
          }
          
          playbackController.setVideos([video] as NSFastEnumeration)
          
      }

    To work with playlists, you can store the playlist in another object such as a table. Based on user interaction, you can navigate the object's indices and select the appropriate video.

    Media progress values

    During media playback, the values reported to the Player SDK progress delegate method may include an initial value of negative infinity and a final value of positive infinity. These values are used when processing pre-roll and post-roll ads.

    If these values are not important to you or interfere with your own progress tracking, you can ignore them with a conditional statement like this:

    - (void)playbackController:(id<BCOVPlaybackController>)controller playbackSession:(id<BCOVPlaybackSession>)session didProgressTo:(NSTimeInterval)progress
    {
    	if (progress < 0.0 || progress >= INFINITY)
    	{
    		return;
    	}
    
    	// Your code here
    }
    func playbackController(_ controller: BCOVPlaybackController!, playbackSession session: BCOVPlaybackSession!, didProgressTo progress: TimeInterval) {
            
        if (progress < 0.0 || progress >= Double.infinity) {
        return
        }
        
        // Your code here
    }

    Reference

    For details, see the BCOVPlaybackController documentation.

    Modifying captions programmatically

    You can set the captions anytime during playback, after the Ready event has been received. To do this, you can use the BCOVPlaybackControllerDelegate.

    Here's an example of setting the captions language to Spanish:

    - (void)playbackController:(id<BCOVPlaybackController>)controller playbackSession:(id<BCOVPlaybackSession>)session didReceiveLifecycleEvent:(BCOVPlaybackSessionLifecycleEvent *)lifecycleEvent
      {
          if ([kBCOVPlaybackSessionLifecycleEventReady isEqualToString:lifecycleEvent.eventType])
          {
              AVMediaSelectionGroup *legibleMediaSelectionGroup = session.legibleMediaSelectionGroup;
              NSArray<AVMediaSelectionOption *> *options = [AVMediaSelectionGroup mediaSelectionOptionsFromArray:legibleMediaSelectionGroup.options withLocale:[NSLocale localeWithLocaleIdentifier:@"es"]];
              AVMediaSelectionOption *option = options.firstObject;
              session.selectedLegibleMediaOption = option;
          }
      }
    func playbackController(_ controller: BCOVPlaybackController?, playbackSession session: BCOVPlaybackSession?, didReceive lifecycleEvent: BCOVPlaybackSessionLifecycleEvent?) {
        if kBCOVPlaybackSessionLifecycleEventReady == lifecycleEvent?.eventType {
        if let legibleOptions = session?.legibleMediaSelectionGroup.options {
            let locale = NSLocale(localeIdentifier: "es") as Locale
            let mediaSelectionOptions = AVMediaSelectionGroup.mediaSelectionOptions(from: legibleOptions, with: locale)
                let mediaSelectionOption = mediaSelectionOptions.first
                session?.selectedLegibleMediaOption = mediaSelectionOption
            }
        }
    }

    Paging with the Playback API

    When retrieving your Video Cloud content from the Playback API, you can implement paging for a playlist.

    To page through a set of videos in a playlist, use the following request URL parameters:

    • limit - defines the number of videos to return from the Playback API
    • offset - sets the number of videos to skip in a playlist from the Playback API

    This example returns 6 videos starting with the 10th video in the playlist:

    NSDictionary *params = @{
    	@"limit": @6,
    	@"offset": @9
    };
    
    [playbackService findPlaylistWithReferenceID:playlistRefID parameters:params completion:^(BCOVPlaylist *playlist, NSDictionary *jsonResponse, NSError *error)
    { 
        // Your code here
    }];
    let params = [
        "limit": 6,
        "offset": 9
    ]
    
    playbackService?.findPlaylist(withReferenceID: playlistRefID, parameters: params, completion: { (playlist: BCOVPlaylist?, jsonResponse: [AnyHashable:Any]?, error: Error?) in
        // Your code here
    })
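    Since the Playback API counts videos from zero, a zero-based page n at a fixed page size maps to offset = n × limit. A small helper (not part of the SDK) makes that arithmetic explicit:

```swift
// Build Playback API paging parameters for a zero-based page index.
// page 0 → offset 0, page 1 → offset pageSize, and so on.
func pagingParameters(page: Int, pageSize: Int) -> [String: Int] {
    return ["limit": pageSize, "offset": page * pageSize]
}
```

    With a pageSize of 6, page 2 yields ["limit": 6, "offset": 12], that is, the 13th through 18th videos of the playlist.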

    Programmatically adding cue points

    Video Cloud customers can add cue points to a video using Video Cloud Studio as shown in the Adding Cue Points to Videos document.

    You can also add cue points to your video programmatically. The code below adds quarterly interval cue points to the video returned from the Playback API:

    // programmatically add cue points to a video
    - (void)requestContentFromPlaybackService
    {
        [self.service findVideoWithVideoID:kViewControllerVideoID parameters:nil completion:^(BCOVVideo *video, NSDictionary *jsonResponse, NSError *error) {
            if (video)
            {
                // Get the video duration from the properties dictionary
                NSNumber *durationNumber = video.properties[@"duration"]; // milliseconds
                float duration = durationNumber.floatValue / 1000.0; // convert to seconds
                video = [video update:^(id<BCOVMutableVideo> mutableVideo)
                {
                    // Add quarterly interval cue points of your own type
                    BCOVCuePoint *cp1 = [[BCOVCuePoint alloc] initWithType:@"your cue point type" position:CMTimeMake(duration * 250, 1000)];
                    BCOVCuePoint *cp2 = [[BCOVCuePoint alloc] initWithType:@"your cue point type" position:CMTimeMake(duration * 500, 1000)];
                    BCOVCuePoint *cp3 = [[BCOVCuePoint alloc] initWithType:@"your cue point type" position:CMTimeMake(duration * 750, 1000)];
                    BCOVCuePoint *cp4 = [[BCOVCuePoint alloc] initWithType:@"your cue point type" position:CMTimeMake(duration * 1000, 1000)];
                    // Create new cue point collection using existing cue points and new cue points
                    NSMutableArray *newCuePoints = [[NSMutableArray alloc] initWithArray:mutableVideo.cuePoints.array];
                    [newCuePoints addObject:cp1];
                    [newCuePoints addObject:cp2];
                    [newCuePoints addObject:cp3];
                    [newCuePoints addObject:cp4];
                    mutableVideo.cuePoints = [[BCOVCuePointCollection alloc] initWithArray:newCuePoints];
                }];
                    
                [self.playbackController setVideos:@[ video ]];
            }
            else
            {
                NSLog(@"ViewController Debug - Error retrieving video: `%@`", error);
            }
        }];
    }
    // programmatically add cue points to a video
    func requestContentFromPlaybackService() {
        playbackService?.findVideo(withVideoID: kViewControllerVideoID, parameters: nil) 
        { [weak self] (video: BCOVVideo?, jsonResponse: [AnyHashable: Any]?, error: Error?) -> Void in
    
            if let error = error {
                print("ViewController Debug - Error retrieving video: `\(error.localizedDescription)`")
            }
            
            if let video = video {
                
                // Get the video duration from the properties dictionary
                guard let durationNumber = video.properties["duration"] as? NSNumber else {
                    return
                }
                
            let duration = durationNumber.floatValue / 1000.0 // convert to seconds
                
                let updatedVideo = video.update({ (mutableVideo: BCOVMutableVideo?) in
                    
                    guard let mutableVideo = mutableVideo else {
                        return
                    }
                    
                    // Add quarterly interval cue points of your own type
                    let cp1Position = CMTimeMake(value: Int64(duration * 250), timescale: 1000)
                    let cp1 = BCOVCuePoint(type: "your cue point type", position: cp1Position)!
                    let cp2Position = CMTimeMake(value: Int64(duration * 500), timescale: 1000)
                    let cp2 = BCOVCuePoint(type: "your cue point type", position: cp2Position)!
                    let cp3Position = CMTimeMake(value: Int64(duration * 750), timescale: 1000)
                    let cp3 = BCOVCuePoint(type: "your cue point type", position: cp3Position)!
                    let cp4Position = CMTimeMake(value: Int64(duration * 1000), timescale: 1000)
                    let cp4 = BCOVCuePoint(type: "your cue point type", position: cp4Position)!
                    
                    // Create new cue point collection using existing cue points and new cue points
                    var newCuePoints = [BCOVCuePoint]()
                    newCuePoints.append(cp1)
                    newCuePoints.append(cp2)
                    newCuePoints.append(cp3)
                    newCuePoints.append(cp4)
                    
                    mutableVideo.cuePoints = BCOVCuePointCollection(array: newCuePoints)
                })
                
                self?.playbackController.setVideos([updatedVideo] as NSFastEnumeration)
            }
            
        }
    }

     

    Note that the value of your cue point type can be any string value that you want, as long as you are not using any of the iOS plugins. For details, see the BCOVCuePoint Protocol Reference document.

    If you are using cue points with the IMA plugin, learn more about it in the VAST and VMAP/Server Side Ad Rules section of the IMA Plugin for the Native SDK for iOS notes. The IMA sample app shows you the value required for IMA ad cue points.

    The code below listens for your cue points and displays a message:

    // listen for cue points and display them
    -(void)playbackController:(id<BCOVPlaybackController>)controller playbackSession:(id<BCOVPlaybackSession>)session didPassCuePoints:(NSDictionary *)cuePointInfo
    {
        BCOVCuePointCollection *cpc = cuePointInfo[kBCOVPlaybackSessionEventKeyCuePoints];
        for (BCOVCuePoint *cp in cpc.array)
        {
            if ([cp.type isEqualToString:@"your cue point type"])
            {
                NSLog(@"Found your cue point at %f", CMTimeGetSeconds(cp.position));
            }
        }
    }
    // listen for cue points and display them
    func playbackController(_ controller: BCOVPlaybackController!, playbackSession session: BCOVPlaybackSession!, didPassCuePoints cuePointInfo: [AnyHashable : Any]!) {
        if let cpc = cuePointInfo[kBCOVPlaybackSessionEventKeyCuePoints] as? BCOVCuePointCollection {
            for cp in cpc.array() {
                if let cp = cp as? BCOVCuePoint {
                    if (cp.type == "your cue point type") {
                        print("Found your cue point at \(CMTimeGetSeconds(cp.position))")
                    }
                }
            }
        }
    }

    Reference

    For details, see the BCOVCuePoint Protocol Reference document.

    Removing the player

    There may be cases where you want to remove the player and the view.

    Deallocating the view controller that owns a BCOVPlaybackController deallocates the playback controller as well. To do this, remove the player view from its superview and set your playback controller pointer to nil.

    Here is a code example:

    [self.playerView removeFromSuperview];
    self.playerView = nil;
    self.playbackController = nil;
    playerView?.removeFromSuperview()
    playerView = nil
    playbackController = nil

    Setting audio behavior

    The audio session handles audio behavior at the app level. You can choose from several audio session categories and settings to customize your app's audio behavior.

    Choose the best audio session category for your app. For details, review Apple's audio session documentation.

    Basic sample

    For our basic sample, we use AVAudioSessionCategoryPlayback. This plays audio even when the screen is locked and with the Ring/Silent switch set to silent. To keep it simple, we put the code for this in the App Delegate.

    NSError *categoryError = nil;
      BOOL success = [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&categoryError];
      if (!success)
      {
          // Handle error
      }
    var categoryError :NSError?
      var success: Bool
      do {
          try AVAudioSession.sharedInstance().setCategory(.playback)
          success = true
      } catch let error as NSError {
          categoryError = error
          success = false
      }
      if !success {
          // Handle error
      }

    Mix with other audio

    You may want to allow audio from other apps to be heard when the audio in your app is muted. To do this, you can configure the AVAudioSession in the view controller that has access to your current AVPlayer.

    For details, see the mixWithOthers category option.

    - (void)setUpAudioSession
      {
          NSError *categoryError = nil;
          BOOL success;
          
          // If the player is muted, then allow mixing.
          // Ensure other apps can have their background audio active when this app is in foreground
          if (self.currentPlayer.isMuted)
          {
              success = [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback withOptions:AVAudioSessionCategoryOptionMixWithOthers error:&categoryError];
          }
          else
          {
              success = [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback withOptions:0 error:&categoryError];
          }
          
          if (!success)
          {
              NSLog(@"AppDelegate Debug - Error setting AVAudioSession category.  Because of this, there may be no sound. `%@`", categoryError);
          }
      }
    func setUpAudioSession() {
      var categoryError :NSError?
      var success: Bool
      do {
          if let currentPlayer = currentPlayer {
              // If the player is muted, then allow mixing.
              // Ensure other apps can have their background audio active when this app is in foreground
              if currentPlayer.isMuted {
                  try AVAudioSession.sharedInstance().setCategory(.playback, options: .mixWithOthers)
              } else {
                  try AVAudioSession.sharedInstance().setCategory(.playback, options: AVAudioSession.CategoryOptions(rawValue: 0))
              }
          } else {
              try AVAudioSession.sharedInstance().setCategory(.playback, options: AVAudioSession.CategoryOptions(rawValue: 0))
          }
          
          success = true
      } catch let error as NSError {
          categoryError = error
          success = false
      }
      if !success {
          print("AppDelegate Debug - Error setting AVAudioSession category.  Because of this, there may be no sound. \(categoryError!)")
      }
    }

    Setting the playback rate

    To control the playback rate, you can set the rate property on the AVPlayer class exposed on the session.

    By default, the play rate can only be set on regular intervals (0.50, 0.67, 1.0, 1.25, 1.50, and 2.0). By setting the audioTimePitchAlgorithm, you can use more granular rate values (like 1.7). For more details, see this Stack Overflow discussion.

    avPlayerItem.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmVarispeed;
    avPlayerItem.audioTimePitchAlgorithm = .varispeed

    For a BCOVPlaybackSession, your code would look similar to this:

    - (void)playbackController:(id<BCOVPlaybackController>)controller
      playbackSession:(id<BCOVPlaybackSession>)session didReceiveLifecycleEvent:(BCOVPlaybackSessionLifecycleEvent *)lifecycleEvent
    {
        if ( [lifecycleEvent.eventType isEqualToString:kBCOVPlaybackSessionLifecycleEventReady] )
        {
            NSLog(@"%@ at time %f", kBCOVPlaybackSessionLifecycleEventReady, CMTimeGetSeconds([session.player currentTime]));
            AVPlayerItem *avPlayerItem = [session.player currentItem];
            avPlayerItem.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithmVarispeed;
            session.player.rate = 1.7;
        }
    }
    func playbackController(_ controller: BCOVPlaybackController!, playbackSession session: BCOVPlaybackSession!, didReceive lifecycleEvent: BCOVPlaybackSessionLifecycleEvent!) {
        
        if (lifecycleEvent.eventType == kBCOVPlaybackSessionLifecycleEventReady) {
            let seconds = CMTimeGetSeconds(session.player.currentTime())
            print("kBCOVPlaybackSessionLifecycleEventReady at time \(seconds)")
            let avPlayerItem = session.player.currentItem
            avPlayerItem?.audioTimePitchAlgorithm = AVAudioTimePitchAlgorithm.varispeed
            session.player.rate = 1.7
        }
        
    }

     

    Setting VR Goggles mode for 360° videos

    When playing a 360° video, users can select the Video 360 button on the control bar to switch to VR Goggles mode. You may also want to do this programmatically, before playback starts. You can do this by updating the BCOVPlaybackController protocol's viewProjection property as follows:

    - (void)playbackController:(id<BCOVPlaybackController>)controller didAdvanceToPlaybackSession:(id<BCOVPlaybackSession>)session
    {
        BCOVVideo360ViewProjection *viewProjection = [self.playbackController.viewProjection copy];
        [viewProjection setProjectionStyle:BCOVVideo360ProjectionStyleVRGoggles];
        [self.playbackController setViewProjection:viewProjection];
    }
    func playbackController(_ controller: BCOVPlaybackController!, didAdvanceTo session: BCOVPlaybackSession!) {
        if let viewProjection: BCOVVideo360ViewProjection = controller.viewProjection.copy() as? BCOVVideo360ViewProjection {
            viewProjection.projectionStyle = BCOVVideo360ProjectionStyle.vrGoggles
            playbackController.viewProjection = viewProjection
        }
    }

     

    Changing the background color

    When playing a video in portrait mode, you may notice a black border above and below the player. The player view is the size of the screen, but the video takes up only a small part of the center of the player view. The visible portions around the video are the background of the player layer.

    This is normal AVPlayer behavior. It shrinks your video to fit inside the player layer, and the rest is the player layer background.

    You can change the player layer background with the following code:

    - (void)playbackController:(id<BCOVPlaybackController>)controller didAdvanceToPlaybackSession:(id<BCOVPlaybackSession>)session
    {
        session.playerLayer.backgroundColor = UIColor.whiteColor.CGColor;
    }
    func playbackController(_ controller: BCOVPlaybackController!, didAdvanceTo session: BCOVPlaybackSession!) {
        session.playerLayer.backgroundColor = UIColor.white.cgColor
    }

     

    Setting the background color to white should look like this:

    Background color
    Custom background color

    Page last updated on 08 Oct 2021