In this topic, you will find a collection of code snippets that you may find useful as a reference while developing with the Native SDK for iOS. For more detailed solutions, see the iOS player samples.
Working with AirPlay
If you use the Brightcove PlayerUI controls, you are all set. AirPlay functionality works out of the box, allowing users to stream video to high-definition displays with Apple TV.
If you are using custom controls, you need to provide your own AirPlay control.
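One way to do this, sketched below, is to add Apple's AVRoutePickerView (available in AVKit on iOS 11 and later) to your custom controls; the controlsContainerView name is a hypothetical view in your own controls hierarchy:
// A minimal sketch: add an AVRoutePickerView so users can pick an AirPlay device
import AVKit

// controlsContainerView is a hypothetical view in your custom controls
let routePickerView = AVRoutePickerView(frame: CGRect(x: 0, y: 0, width: 44, height: 44))
routePickerView.tintColor = .white
controlsContainerView.addSubview(routePickerView)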
Customizing PlayerUI controls
The BCOVPlayerUI sample code shows you how to customize the Brightcove player when using the Native SDK for iOS. For more details, see the Customizing PlayerUI controls section of the Native SDK reference document.
To customize the closed captions button using the PlayerUI, follow these steps:
Replace the Policy Key, Account ID, and Video ID values with your own. Select a video from your account that has text tracks.
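For example, you might define these values as constants (the names here are hypothetical):
// Hypothetical constants - replace with values from your Video Cloud account
let kViewControllerPlaybackServicePolicyKey = "your policy key"
let kViewControllerAccountID = "your account id"
let kViewControllerVideoID = "your video id"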
Set up the player view with a standard VOD layout:
// Set up our player view. Create with a standard VOD layout.
guard let playerView = BCOVPUIPlayerView(playbackController: self.playbackController,
                                         options: options,
                                         controlsView: BCOVPUIBasicControlView.withVODLayout()) else {
    return
}
// Set up our player view. Create with a standard VOD layout.
BCOVPUIPlayerView *playerView = [[BCOVPUIPlayerView alloc] initWithPlaybackController:self.playbackController
                                                                              options:nil
                                                                         controlsView:[BCOVPUIBasicControlView basicControlViewWithVODLayout]];
The closedCaptionButton is declared as a BCOVPUIButton, a subclass of UIButton that adds three methods for customization. Whenever you customize BCOVPlayerUI controls, use the native player APIs wherever they are available. Your customization code should look similar to this:
// Customize the CC button.
if let ccButton = playerView.controlsView.closedCaptionButton {
    ccButton.titleLabel?.font = UIFont.systemFont(ofSize: 14)
    ccButton.primaryTitle = "CC"
    ccButton.showPrimaryTitle(true)
}
// Customize the CC button.
BCOVPUIButton *ccButton = playerView.controlsView.closedCaptionButton;
ccButton.titleLabel.font = [UIFont systemFontOfSize:14.];
ccButton.primaryTitle = @"CC";
[ccButton showPrimaryTitle:YES];
Displaying FairPlay content on an external screen
When an external display is connected to an iOS device using an AV adapter and HDMI cable, the default behavior is to mirror the iOS screen. The exception to this is when you use FairPlay-protected video, which Apple prevents from mirroring (WWDC 2015, Session 502).
To display FairPlay-protected videos, set the AVPlayer properties exposed through the Brightcove playback controller to allow FairPlay video to play on an external display. The video plays in full-screen mode on the external display.
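Here is a minimal sketch of setting these properties, assuming this object is the playback controller's delegate:
// Allow FairPlay-protected video to play on an external display.
// session.player is the AVPlayer exposed by the playback session.
func playbackController(_ controller: BCOVPlaybackController!, didAdvanceTo session: BCOVPlaybackSession!) {
    session.player.allowsExternalPlayback = true
    session.player.usesExternalPlaybackWhileExternalScreenIsActive = true
}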
Adding Google Analytics
If you use the Brightcove player and the catalog class, video analytics will be automatically collected and will appear in your Video Cloud Analytics module. For additional metrics, you can add Google Analytics to your app.
To integrate Google Analytics with your app, follow the setup steps in Google's Firebase documentation to add the Firebase SDK to your project.
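As part of that setup, you configure Firebase when your app launches; a minimal sketch in the App Delegate:
import UIKit
import FirebaseCore

func application(_ application: UIApplication,
                 didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    // Configure Firebase before logging any analytics events
    FirebaseApp.configure()
    return true
}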
Here is one way you could use Google Analytics to track video playback using Google's Firebase SDK:
// This snippet shows one way to track video playback
// using the Firebase SDK from Google Analytics with
// the lifecycle event playback controller delegate method.
func playbackController(_ controller: BCOVPlaybackController!, playbackSession session: BCOVPlaybackSession!, didReceive lifecycleEvent: BCOVPlaybackSessionLifecycleEvent!) {
// Common parameters
let video_name = session.video.properties[kBCOVVideoPropertyKeyName]
let video_id = session.video.properties[kBCOVVideoPropertyKeyId]
// Session is ready to play
if (lifecycleEvent.eventType == kBCOVPlaybackSessionLifecycleEventReady) {
if let video_name = video_name as? String, let video_id = video_id as? String {
Analytics.logEvent("bcov_video_ready", parameters: [
"bcov_video_name" : video_name,
"bcov_video_id" : video_id
])
}
}
// Session encountered an error
if (lifecycleEvent.eventType == kBCOVPlaybackSessionLifecycleEventError) {
if let error = lifecycleEvent.properties[kBCOVPlaybackSessionEventKeyError] as? NSError {
let error_code = error.code
if let video_name = video_name as? String, let video_id = video_id as? String {
Analytics.logEvent("bcov_video_playback_error", parameters: [
"bcov_video_name" : video_name,
"bcov_video_id" : video_id,
"bcov_video_error_code" : error_code
])
}
}
}
// Session has completed
if (lifecycleEvent.eventType == kBCOVPlaybackSessionLifecycleEventTerminate) {
if let video_name = video_name as? String, let video_id = video_id as? String {
Analytics.logEvent("bcov_video_terminate", parameters: [
"bcov_video_name" : video_name,
"bcov_video_id" : video_id
])
}
}
}
// This snippet shows one way to track video playback
// using the Firebase SDK from Google Analytics with
// the lifecycle event playback controller delegate method.
- (void)playbackController:(id<BCOVPlaybackController>)controller playbackSession:(id<BCOVPlaybackSession>)session didReceiveLifecycleEvent:(BCOVPlaybackSessionLifecycleEvent *)lifecycleEvent
{
// Common parameters
NSString *video_name = session.video.properties[kBCOVVideoPropertyKeyName];
NSString *video_ID = session.video.properties[kBCOVVideoPropertyKeyId];
// Session is ready to play
if ([kBCOVPlaybackSessionLifecycleEventReady isEqualToString:lifecycleEvent.eventType])
{
[FIRAnalytics logEventWithName:@"bcov_video_ready"
parameters:@{
@"bcov_video_name": video_name,
@"bcov_video_id": video_ID
}];
}
// Session encountered an error
if ([kBCOVPlaybackSessionLifecycleEventError isEqualToString:lifecycleEvent.eventType])
{
NSError *error = lifecycleEvent.properties[kBCOVPlaybackSessionEventKeyError];
NSInteger error_code = error.code;
[FIRAnalytics logEventWithName:@"bcov_video_playback_error"
parameters:@{
@"bcov_video_name": video_name,
@"bcov_video_id": video_ID,
@"bcov_video_error_code": @(error_code)
}];
}
// Session has completed
if ([kBCOVPlaybackSessionLifecycleEventTerminate isEqualToString:lifecycleEvent.eventType])
{
[FIRAnalytics logEventWithName:@"bcov_video_terminate"
parameters:@{
@"bcov_video_name": video_name,
@"bcov_video_id": video_ID
}];
}
}
Limiting the bitrate
You can't control which source (rendition) in the HLS manifest the AVPlayer selects, but you can put a bitrate cap on playback. This prevents the player from using renditions with a bitrate above the specified value.
Set the preferredPeakBitRate property to the desired limit, in bits per second, of the network bandwidth consumption for the given AVPlayerItem.
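A minimal sketch, assuming this object is the playback controller's delegate (the 1 Mbps cap is an arbitrary example value):
func playbackController(_ controller: BCOVPlaybackController!, didAdvanceTo session: BCOVPlaybackSession!) {
    // Cap playback at roughly 1 Mbps (example value)
    session.player.currentItem?.preferredPeakBitRate = 1_000_000
}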
Looping a video
In some cases, you may want a video to replay automatically. To do this, you can listen for the "end of video" lifecycle event, seek back to the beginning, and play again.
This code assumes that you have set the delegate of the playbackController to the object that implements this method:
func playbackController(_ controller: BCOVPlaybackController!, playbackSession session: BCOVPlaybackSession!, didReceive lifecycleEvent: BCOVPlaybackSessionLifecycleEvent!) {
if (lifecycleEvent.eventType == kBCOVPlaybackSessionLifecycleEventEnd) {
controller.seek(to: CMTime.zero) { (finished: Bool) in
if (finished) {
controller.play()
}
}
}
}
Managing a playlist of videos
One way to manage a playlist of videos is to store the video objects in a table. When the user selects a video from the table, the table row will contain the video object.
Here's an overview of how it works:
Retrieve a playlist from your account.
func retrievePlaylist() {
refreshControl.beginRefreshing()
let playbackServiceRequestFactory = BCOVPlaybackServiceRequestFactory(accountId: kDynamicDeliveryAccountID, policyKey: kDynamicDeliveryPolicyKey)
let playbackService = BCOVPlaybackService(requestFactory: playbackServiceRequestFactory)
// Look up the playlist by its reference ID
let configuration = [kBCOVPlaybackServiceConfigurationKeyAssetReferenceID: kDynamicDeliveryPlaylistRefID]
playbackService?.findPlaylist(withConfiguration: configuration, queryParameters: nil, completion: { [weak self] (playlist: BCOVPlaylist?, jsonResponse: [AnyHashable: Any]?, error: Error?) in
guard let strongSelf = self else {
return
}
strongSelf.refreshControl.endRefreshing()
if let playlist = playlist {
strongSelf.currentVideos = playlist.videos
strongSelf.currentPlaylistTitle = playlist.properties["name"] as? String ?? ""
strongSelf.currentPlaylistDescription = playlist.properties["description"] as? String ?? ""
print("Retrieved playlist containing \(playlist.videos.count) videos")
strongSelf.usePlaylist(playlist)
} else {
print("No playlist for ID \(kDynamicDeliveryPlaylistRefID) was found.");
}
})
}
Re-initialize the containers that store information related to the videos in the current playlist, as sketched below.
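A minimal sketch of this step, where videosTableViewData is a hypothetical array backing the table view:
func usePlaylist(_ playlist: BCOVPlaylist) {
    // Rebuild the table view's backing store with one dictionary per video
    videosTableViewData = playlist.videos.map { ["video": $0] }
    tableView.reloadData()
}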
When a table row is selected, the row's index is used to look up that row's videoDictionary. Next, ask the dictionary for the video. If the video is not nil, load it into the playbackController.
func tableView(_ tableView: UITableView, didSelectRowAt indexPath: IndexPath) {
guard let videosTableViewData = videosTableViewData,
let videoDictionary = videosTableViewData[indexPath.row] as?
[AnyHashable:Any], let video = videoDictionary["video"] as? BCOVVideo else {
return
}
playbackController.setVideos([video] as NSFastEnumeration)
}
// Play the video in this row when selected
- (void)tableView:(UITableView *)tableView didSelectRowAtIndexPath:(nonnull NSIndexPath *)indexPath
{
NSDictionary *videoDictionary = self.videosTableViewData[indexPath.row];
BCOVVideo *video = videoDictionary[@"video"];
if (video != nil)
{
[self.playbackController setVideos:@[ video ]];
}
}
To work with playlists, you can store the playlist in another object such as a table. Based on user interaction, you can navigate the object's indices and select the appropriate video.
Media progress values
During media playback, the values reported to the Player SDK progress delegate method may include an initial value of negative infinity and a final value of positive infinity. These values are used when processing pre-roll and post-roll ads.
If these values are not important to you or interfere with your own progress tracking, they can be easily ignored with a conditional statement like this:
func playbackController(_ controller: BCOVPlaybackController!, playbackSession session: BCOVPlaybackSession!, didProgressTo progress: TimeInterval) {
if (progress < 0.0 || progress >= Double.infinity) {
return;
}
// Your code here
}
- (void)playbackController:(id<BCOVPlaybackController>)controller playbackSession:(id<BCOVPlaybackSession>)session didProgressTo:(NSTimeInterval)progress
{
if (progress < 0.0 || progress >= INFINITY)
{
return;
}
// Your code here
}
Working with cue points
Video Cloud customers can add cue points to a video using Video Cloud Studio as shown in the Adding Cue Points to Videos document.
You can also add cue points to your video programmatically. The code below adds quarterly interval cue points to the video returned from the Playback API:
// programmatically add cue points to a video
func requestContentFromPlaybackService() {
// Look up the video by its ID (kViewControllerVideoID is a hypothetical constant)
let configuration = [kBCOVPlaybackServiceConfigurationKeyAssetID: kViewControllerVideoID]
playbackService?.findVideo(withConfiguration: configuration, queryParameters: nil, completion: { [weak self] (video: BCOVVideo?, jsonResponse: [AnyHashable: Any]?, error: Error?) -> Void in
if let error = error {
print("ViewController Debug - Error retrieving video: `\(error.localizedDescription)`")
}
if let video = video {
// Get the video duration from the properties dictionary
guard let durationNumber = video.properties["duration"] as? NSNumber else {
return
}
let duration = durationNumber.floatValue / 1000.0; // convert to seconds
let updatedVideo = video.update({ (mutableVideo: BCOVMutableVideo?) in
guard let mutableVideo = mutableVideo else {
return
}
// Add quarterly interval cue points of your own type
let cp1Position = CMTimeMake(value: Int64(duration * 250), timescale: 1000)
let cp1 = BCOVCuePoint(type: "your cue point type", position: cp1Position)!
let cp2Position = CMTimeMake(value: Int64(duration * 500), timescale: 1000)
let cp2 = BCOVCuePoint(type: "your cue point type", position: cp2Position)!
let cp3Position = CMTimeMake(value: Int64(duration * 750), timescale: 1000)
let cp3 = BCOVCuePoint(type: "your cue point type", position: cp3Position)!
let cp4Position = CMTimeMake(value: Int64(duration * 1000), timescale: 1000)
let cp4 = BCOVCuePoint(type: "your cue point type", position: cp4Position)!
// Create new cue point collection using existing cue points and new cue points
var newCuePoints = [BCOVCuePoint]()
newCuePoints.append(cp1)
newCuePoints.append(cp2)
newCuePoints.append(cp3)
newCuePoints.append(cp4)
mutableVideo.cuePoints = BCOVCuePointCollection(array: newCuePoints)
})
self?.playbackController.setVideos([updatedVideo] as NSFastEnumeration)
}
}
}
// programmatically add cue points to a video
- (void)requestContentFromPlaybackService
{
// Look up the video by its ID (kViewControllerVideoID is a hypothetical constant)
NSDictionary *configuration = @{kBCOVPlaybackServiceConfigurationKeyAssetID: kViewControllerVideoID};
[self.service findVideoWithConfiguration:configuration queryParameters:nil completion:^(BCOVVideo *video, NSDictionary *jsonResponse, NSError *error) {
if (video)
{
// Get the video duration from the properties dictionary
NSNumber *durationNumber = video.properties[@"duration"]; // milliseconds
float duration = durationNumber.floatValue / 1000.0; // convert to seconds
video = [video update:^(id<BCOVMutableVideo> mutableVideo)
{
// Add quarterly interval cue points of your own type
BCOVCuePoint *cp1 = [[BCOVCuePoint alloc] initWithType:@"your cue point type" position:CMTimeMake(duration * 250, 1000)];
BCOVCuePoint *cp2 = [[BCOVCuePoint alloc] initWithType:@"your cue point type" position:CMTimeMake(duration * 500, 1000)];
BCOVCuePoint *cp3 = [[BCOVCuePoint alloc] initWithType:@"your cue point type" position:CMTimeMake(duration * 750, 1000)];
BCOVCuePoint *cp4 = [[BCOVCuePoint alloc] initWithType:@"your cue point type" position:CMTimeMake(duration * 1000, 1000)];
// Create new cue point collection using existing cue points and new cue points
NSMutableArray *newCuePoints = [[NSMutableArray alloc] initWithArray:mutableVideo.cuePoints.array];
[newCuePoints addObject:cp1];
[newCuePoints addObject:cp2];
[newCuePoints addObject:cp3];
[newCuePoints addObject:cp4];
mutableVideo.cuePoints = [[BCOVCuePointCollection alloc] initWithArray:newCuePoints];
}];
[self.playbackController setVideos:@[ video ]];
}
else
{
NSLog(@"ViewController Debug - Error retrieving video: `%@`", error);
}
}];
}
Note that the value of your cue point type can be any string you want, as long as you are not using one of the Brightcove iOS plugins that reserves its own cue point types. For details, see the BCOVCuePoint Protocol Reference document.
If you are using cue points with the IMA plugin, learn more about it in the VAST and VMAP/Server Side Ad Rules section of the IMA Plugin for the Native SDK for iOS notes. The IMA sample app shows you the value required for IMA ad cue points.
The code below listens for your cue points and displays a message:
// listen for cue points and display them
func playbackController(_ controller: BCOVPlaybackController!, playbackSession session: BCOVPlaybackSession!, didPassCuePoints cuePointInfo: [AnyHashable : Any]!) {
if let cpc = cuePointInfo[kBCOVPlaybackSessionEventKeyCuePoints] as? BCOVCuePointCollection {
for cp in cpc.array() {
if let cp = cp as? BCOVCuePoint {
if (cp.type == "your cue point type") {
print("Found your cue point at \(CMTimeGetSeconds(cp.position))")
}
}
}
}
}
// listen for cue points and display them
-(void)playbackController:(id<BCOVPlaybackController>)controller playbackSession:(id<BCOVPlaybackSession>)session didPassCuePoints:(NSDictionary *)cuePointInfo
{
BCOVCuePointCollection *cpc = cuePointInfo[kBCOVPlaybackSessionEventKeyCuePoints];
for (BCOVCuePoint *cp in cpc.array)
{
if ([cp.type isEqualToString:@"your cue point type"])
{
NSLog(@"Found your cue point at %f", CMTimeGetSeconds(cp.position));
}
}
}
Removing the player
There may be cases where you want to remove the player and its view.
Deallocating the view controller that has ownership of a BCOVPlaybackController will deallocate the playback controller as well. Otherwise, remove the player view from its superview and set your playback controller pointer to nil.
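A minimal sketch, assuming playerView and playbackController are optional properties on your view controller:
// Tear down the player and release the playback controller
playerView?.removeFromSuperview()
playerView = nil
playbackController = nil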
Setting up the audio session
The audio session handles audio behavior at the app level. You can choose from several audio session categories and settings to customize your app's audio behavior.
Choose the best audio session category for your app. For details, review Apple's documentation:
For our basic sample, we use AVAudioSessionCategoryPlayback. This category plays audio even when the screen is locked and when the Ring/Silent switch is set to silent. To keep it simple, we put the code for this in the App Delegate.
var categoryError: NSError?
var success: Bool
do {
try AVAudioSession.sharedInstance().setCategory(.playback)
success = true
} catch let error as NSError {
categoryError = error
success = false
}
if !success {
// Handle error
}
You may want to allow audio from other apps to be heard when the audio in your app is muted. To do this, you can configure the AVAudioSession in the view controller that has access to your current AVPlayer.
func setUpAudioSession() {
var categoryError: NSError?
var success: Bool
do {
if let currentPlayer = currentPlayer {
// If the player is muted, then allow mixing.
// Ensure other apps can have their background audio active when this app is in foreground
if currentPlayer.isMuted {
try AVAudioSession.sharedInstance().setCategory(.playback, options: .mixWithOthers)
} else {
try AVAudioSession.sharedInstance().setCategory(.playback, options: AVAudioSession.CategoryOptions(rawValue: 0))
}
} else {
try AVAudioSession.sharedInstance().setCategory(.playback, options: AVAudioSession.CategoryOptions(rawValue: 0))
}
success = true
} catch let error as NSError {
categoryError = error
success = false
}
if !success {
print("AppDelegate Debug - Error setting AVAudioSession category. Because of this, there may be no sound. \(categoryError!)")
}
}
- (void)setUpAudioSession
{
NSError *categoryError = nil;
BOOL success;
// If the player is muted, then allow mixing.
// Ensure other apps can have their background audio active when this app is in foreground
if (self.currentPlayer.isMuted)
{
success = [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback withOptions:AVAudioSessionCategoryOptionMixWithOthers error:&categoryError];
}
else
{
success = [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback withOptions:0 error:&categoryError];
}
if (!success)
{
NSLog(@"AppDelegate Debug - Error setting AVAudioSession category. Because of this, there may be no sound. `%@`", categoryError);
}
}
Setting the playback rate
To control the playback rate, you can set the rate property on the AVPlayer exposed by the playback session.
By default, the playback rate can only be set to standard values (0.50, 0.67, 1.0, 1.25, 1.50, and 2.0). By setting the audioTimePitchAlgorithm, you can use more granular rate values (like 1.7). For more details, see this Stack Overflow discussion.
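A minimal sketch, assuming this object is the playback controller's delegate:
func playbackController(_ controller: BCOVPlaybackController!, didAdvanceTo session: BCOVPlaybackSession!) {
    // Use a time-domain pitch algorithm so non-standard rates like 1.7 are honored
    session.player.currentItem?.audioTimePitchAlgorithm = .timeDomain
    session.player.rate = 1.7
}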
Playing 360° video
When playing a 360° video, users can select the Video 360 button on the control bar to switch to VR Goggles mode. You may also want to do this programmatically, before playback starts, by updating the viewProjection property of the BCOVPlaybackController protocol.
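A minimal sketch, assuming a 360° video and a configured playbackController:
// Switch the view projection to VR Goggles mode before playback begins
let viewProjection = BCOVVideo360ViewProjection()
viewProjection.projectionStyle = .vrGoggles
playbackController.viewProjection = viewProjection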
Playing video based on visibility
This example will help you create code that starts video playback when the player is in view and stops playback when it is out of view.
The code below returns a float value for the percentage of the player view that is visible. You can decide when to take action; for example, continue playback when the view is at least 50% visible and pause playback when it drops below 50%. Use an NSTimer or something similar to check the view's visibility over time.
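A minimal sketch of such a helper, assuming a playerView property:
// Returns the fraction (0.0 - 1.0) of the player view currently visible on screen
func percentageOfPlayerViewVisible() -> Float {
    guard let window = playerView.window else {
        return 0.0 // not in a window, so not visible
    }
    let playerFrame = playerView.convert(playerView.bounds, to: window)
    let visibleRect = playerFrame.intersection(window.bounds)
    guard !visibleRect.isNull, playerFrame.width > 0, playerFrame.height > 0 else {
        return 0.0
    }
    return Float((visibleRect.width * visibleRect.height) / (playerFrame.width * playerFrame.height))
}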
Changing the player layer background
When playing a video in portrait mode, you may notice black borders above and below the player. The player view is the size of the screen, but the video occupies only a small part of the center of the player view. The visible portions around the video are the background of the player layer.
This is normal AVPlayer behavior. It scales your video to fit inside the player layer, and the rest is the player layer's background.
You can change the player layer background in your playback controller delegate.
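A minimal sketch, assuming this object is the playback controller's delegate (the white color is an arbitrary example):
func playbackController(_ controller: BCOVPlaybackController!, didAdvanceTo session: BCOVPlaybackSession!) {
    // Match the player layer background to your app's design
    session.playerLayer.backgroundColor = UIColor.white.cgColor
}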