ApplicationMusicPlayer with a queue created from a playlist crashes at random shortly after skipping back or forth using the controls embedded in the notification, with this error in the console log: applicationController: xpc service connection interrupted.
I've noticed that the issue occurs more frequently the shorter the time between skips. Since ApplicationMusicPlayer runs in a remote process, the main app does not crash, but the music stops playing without any exception, and the playback controls revert to an uninitialized state.
Here is how I'm initializing the queue:
// `with(.entries)` is async throws, so this runs in an async context
let entries = try await playlist
    .with(.entries).entries!
    .map { ApplicationMusicPlayer.Queue.Entry($0) }

ApplicationMusicPlayer.shared.queue = .init(
    entries,
    startingAt: entries.last
)
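For reference, the shared player's state can be observed to catch the moment playback silently stops; a minimal observation-only sketch (the SwiftUI view and the hook inside it are illustrative):

import MusicKit
import SwiftUI

// Observation-only sketch: watch the shared player's state so the UI can react
// when playback silently stops after the XPC connection is interrupted.
struct PlayerStatusView: View {
    @ObservedObject private var playerState = ApplicationMusicPlayer.shared.state

    var body: some View {
        Text(String(describing: playerState.playbackStatus))
            .onChange(of: playerState.playbackStatus) { status in
                if status == .stopped || status == .interrupted {
                    // Illustrative hook: rebuild the queue or surface an error here.
                    print("Playback status changed to \(status)")
                }
            }
    }
}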
Please give me some tips on how to solve this.
EDIT:
The issue does not occur when navigating quickly through the station.
Streaming
Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.
I am reaching out regarding an issue with my Apple FairPlay Streaming Certificate. To generate the certificate signing request (CSR), I used the following OpenSSL commands:
openssl genrsa -out private_key.pem 1024
openssl req -new -key private_key.pem -out request.csr
However, according to the guide provided by Apple and instructions from my DRM provider, I should have used:
openssl genrsa -aes256 -out privatekey.pem 1024
openssl req -new -sha1 -key privatekey.pem -out certreq.csr -subj "/CN=SubjectName /OU=OrganizationalUnit /O=Organization /C=US"
I suspect this discrepancy might be causing the issue with my FairPlay certificate. After obtaining the fairplay.cer file and importing it into Keychain Access, I noticed the following:
When I expand the certificate in Keychain Access, I can only see a public key and no private key.
As a result, I am unable to export the certificate as a .p12 file, as this option is disabled.
As per my DRM provider's instructions, I need to export the certificate along with the corresponding private key as a .p12 file with a password. Since the private key is not visible in Keychain Access, I am unable to proceed further.
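For completeness, since the private key was generated locally as private_key.pem, I understand a .p12 can in principle be assembled outside Keychain Access; a minimal sketch, assuming fairplay.cer is DER-encoded (whether the certificate is actually usable given the non-standard CSR is exactly my question):

# Convert the downloaded certificate from DER to PEM (assumes fairplay.cer is DER-encoded)
openssl x509 -inform der -in fairplay.cer -out fairplay.pem
# Bundle the certificate with the locally generated private key into a password-protected .p12
openssl pkcs12 -export -inkey private_key.pem -in fairplay.pem -out fairplay.p12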
I have read the FairPlay Streaming Overview but could not find any reasons as to why this issue is occurring or guidance on the procedure to revoke a certificate.
Additionally, I came across the terms and conditions which mentioned reaching out to product-security at Apple for assistance in revoking corrupt certificates. However, despite reaching out, I have not received a response.
Any help on how to proceed will be great!
In our Apple TV application, we are using the native AVPlayer for live playback functionality. During live restart playback, we intermittently encounter an error when the playback timeline approaches the actual live event end time.
Error:
The operation couldn’t be completed. (CoreMediaErrorDomain error -16839 - Unable to get playlist before long download timer) / Failure reason:
Scenario:
The live event is scheduled from 7:00 AM to 8:00 AM.
Restart playback begins at 7:20 AM, allowing the user to watch the event from the start while the live stream continues in real-time.
As the restart playback timeline approaches the actual event end time (8:00 AM), AVPlayer displays an error, and playback continues in the background.
Topic:
Media Technologies
SubTopic:
Streaming
Tags:
FairPlay Streaming
Media Player
HTTP Live Streaming
It seems that 5G Network Slicing is currently only available for URLRequest / URLSession / low-level networking and not for AVPlayer. Is it possible to have the app use the default slice when it starts and only enable Network Slicing when streaming video via AVPlayer?
Hello, we have an HLS streaming app on Apple TV. Our streams are DRM protected. We have a problem when the output device is turned off. For example, a user starts watching our DRM-protected HLS content; after some time, the user turns off the display (a monitor or TV connected via HDMI). Our app does not detect that the HDMI output device has been turned off. Is there any way in Swift to detect that the connected HDMI device is turned off?
Topic:
Media Technologies
SubTopic:
Streaming
Tags:
FairPlay Streaming
Swift
Apple TV
HTTP Live Streaming
Hi folks,
When doing HLS v6 live streaming with fMP4 chunks, we noticed that when the encoder timestamps drift slightly and an #EXT-X-DISCONTINUITY tag is created in either the audio or the video playlist (in an ABR setup), the tag is not handled correctly by the player, leading to broken playback with a black screen or no audio (depending on which playlist contains the tag).
We noticed that this is often the case when the number of tags differs between the playlists (e.g. if the audio playlist contains 1 tag and the video playlist contains 2, the result is a black screen with audio).
Using the same "broken" source with the Shaka player instead does not break playback at all.
Is there any possible (or upcoming) fix for AVPlayer?
I want my iOS app to be able to use USB client mode to send LiDAR data and camera frames to another device. What are my options for doing this? I've found IOUSBHost for host mode, but I want to see all my options. The device I want driving the bus is a Meta Quest 3, which, for clarity, is essentially an Android device. The iPhone is to be used as a sensor hub, sending data to the Quest 3 for further processing.
As a workaround, I could let the iPhone drive the bus and have the Quest 3 use Android's accessory mode, which lets other devices drive the USB bus. However, there are more USB devices I want to attach for my project, and this workaround makes that more difficult, so I want to avoid it.
Topic:
Media Technologies
SubTopic:
Streaming
I am developing an app to stream and download DRM protected HLS videos based on the official “FairPlay Streaming Server SDK”.
When I play the downloaded video, it asks the server for .ts or .aac, even though I have passed the path of the downloaded video to AVURLAsset.
As a result, playback fails when the device is offline, such as in airplane mode.
This behavior depends on the duration of the video and occurs when trying to download and play a video that is 19 hours or longer.
It did not occur for videos with a playback time of 18 hours.
The environment we checked is iOS 18.3.
The solution at this time is to limit the video playback time to 18 hours, but if possible, we would like to allow download playback of videos longer than 19 hours.
Does anyone have any information or know of a solution to this problem? For example, have you experienced this type of issue, or do you know whether content longer than 19 hours cannot be played offline?
// load
let path = ".../xxx.movpkg" // Path of the downloaded file
videoAsset = AVURLAsset(url: URL(fileURLWithPath: path)) // AVURLAsset takes a URL, not a String
playerItem = AVPlayerItem(asset: videoAsset!)
player.replaceCurrentItem(with: playerItem)

// isPlayableOffline
print("videoAsset.assetCache.isPlayableOffline = \(videoAsset!.assetCache?.isPlayableOffline ?? false)") // true
Our team conducted security testing and found one vulnerability with fairplay license acquisition.
Our QA engineer manually changed the device's system date and time (setting it 4 days into the future) and was able to successfully obtain a license response and initiate playback on an iOS device. However, on an Android device, the license acquisition failed.
Can you please tell us if Time Manipulation Detection is available in FairPlay SDK?
Hello, Apple video engineers.
According to the official documentation, HLS is built on HTTP and traditionally ran on top of TCP. However, with the introduction of HTTP/3, which uses QUIC (runs on top of UDP), I would like to clarify the following:
Has the official HLS specification changed in a way that allows it to be considered UDP-based when using HTTP/3? And is it fair to say that HLS supports UDP since the transport can go over HTTP/3 and QUIC?
Would it be more accurate to say that HLS remains HTTP-dependent, and the transport protocol (TCP or QUIC) only determines how HTTP requests are delivered?
My thoughts: since HTTP/3 uses QUIC running over UDP, we still can't say that HLS supports UDP in the classical sense, as it is in RTP, RTSP, or SRT.
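To make the point concrete: the choice of HTTP version (and therefore TCP vs. QUIC/UDP) is a property of the HTTP request layer, not of the HLS playlist itself. A minimal URLSession sketch (the URL is a placeholder; assumesHTTP3Capable merely hints that the origin supports HTTP/3):

import Foundation

// Runs inside an async context. The playlist URL is a placeholder.
var request = URLRequest(url: URL(string: "https://example.com/master.m3u8")!)
// Hint that the server supports HTTP/3 so URLSession may try QUIC (over UDP)
// without first seeing an Alt-Svc header; the playlist contents are unchanged.
request.assumesHTTP3Capable = true

let (data, _) = try await URLSession.shared.data(for: request)
print("playlist bytes:", data.count)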
I can't play video content with HEVC and DRM. Tested HEVC only: OK. Tested DRM + AVC: OK.
Tested 2 players (Clappr/Stevie and Bitmovin).
The master playlist, variants, and EXT-X-MAP segments download OK, the DRM keys are OK, and then, for instance with the Bitmovin player:
[BMP] [Player] [Error] Event: SourceError, Data: {"code":2001,"data":{"message":"The operation couldn’t be completed. (CoreMediaErrorDomain error -12927.)","code":-12927},"message":"Source Error. The operation couldn’t be completed. (CoreMediaErrorDomain error -12927.)","timestamp":1740320663.4505711,"type":"onSourceError"} code: 2001 [Data code: -12927, message: The operation couldn’t be completed. (CoreMediaErrorDomain error -12927.), underlying error: Error Domain=CoreMediaErrorDomain Code=-12927 "(null)"]
4k-master.m3u8.txt
4k.m3u8.txt
4k-audio.m3u8.txt
I am playing protected HLS streams whose authorization token expires every 3 minutes. I am trying to handle this with AVAssetResourceLoaderDelegate. I can refresh the token and keep playing, but the problem is that during the renewal the player stalls for a short time, about 1 second.
Here's my code:
import AVFoundation

class APLCustomAVARLDelegate: NSObject, AVAssetResourceLoaderDelegate {
    static let httpsScheme = "https"
    static let redirectErrorCode = 302
    static let badRequestErrorCode = 400

    private var token: String?
    private var retryDictionary = [String: Int]()
    private let maxRetries = 3

    private func schemeSupported(_ scheme: String) -> Bool {
        let supported = isHttpsSchemeValid(scheme)
        print("Scheme '\(scheme)' supported: \(supported)")
        return supported
    }

    private func reportError(loadingRequest: AVAssetResourceLoadingRequest, error: Int) {
        let nsError = NSError(domain: NSURLErrorDomain, code: error, userInfo: nil)
        print("Reporting error: \(nsError)")
        loadingRequest.finishLoading(with: nsError)
    }

    // Handle token renewal requests to prevent playback stalls
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForRenewalOfRequestedResource renewalRequest: AVAssetResourceRenewalRequest) -> Bool {
        print("Resource renewal requested for URL: \(renewalRequest.request.url?.absoluteString ?? "unknown URL")")
        // Handle renewal the same way we handle initial requests
        guard let scheme = renewalRequest.request.url?.scheme else {
            print("No scheme found in the renewal URL.")
            return false
        }
        if isHttpsSchemeValid(scheme) {
            return handleHttpsRequest(renewalRequest)
        }
        print("Scheme not supported for renewal.")
        return false
    }

    private func isHttpsSchemeValid(_ scheme: String) -> Bool {
        let isValid = scheme == APLCustomAVARLDelegate.httpsScheme
        print("https scheme '\(scheme)' valid: \(isValid)")
        return isValid
    }

    private func generateHttpsURL(sourceURL: URL) -> URL? {
        // If you need to modify the URL, do it here
        // Currently this just returns the same URL
        let urlString = sourceURL.absoluteString
        print("Generated HTTPS URL: \(urlString)")
        return URL(string: urlString)
    }

    private func handleHttpsRequest(_ loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        print("Handling HTTPS request.")
        guard let sourceURL = loadingRequest.request.url,
              var redirectURL = generateHttpsURL(sourceURL: sourceURL) else {
            print("Failed to generate HTTPS URL.")
            reportError(loadingRequest: loadingRequest, error: APLCustomAVARLDelegate.badRequestErrorCode)
            return true
        }

        // Track retry attempts with a dictionary keyed by request URL
        let urlString = sourceURL.absoluteString
        let currentRetries = retryDictionary[urlString] ?? 0
        if currentRetries < maxRetries {
            retryDictionary[urlString] = currentRetries + 1
        } else {
            // Too many retries, report a more specific error
            reportError(loadingRequest: loadingRequest, error: NSURLErrorTimedOut)
            retryDictionary.removeValue(forKey: urlString)
            return true
        }

        if var urlComponents = URLComponents(url: redirectURL, resolvingAgainstBaseURL: false) {
            var queryItems = urlComponents.queryItems ?? []
            // Generate a fresh token each time
            let freshToken = AESTimeBaseEncription.secureEncryptSecretText()
            // Check if the token already exists
            if let existingTokenIndex = queryItems.firstIndex(where: { $0.name == "token" }) {
                // Update the existing token
                queryItems[existingTokenIndex].value = freshToken
            } else {
                // Add the token if it doesn't exist
                queryItems.append(URLQueryItem(name: "token", value: freshToken))
            }
            urlComponents.queryItems = queryItems
            redirectURL = urlComponents.url!
        }

        let redirectRequest = URLRequest(url: redirectURL)
        let response = HTTPURLResponse(url: redirectURL,
                                       statusCode: APLCustomAVARLDelegate.redirectErrorCode,
                                       httpVersion: nil,
                                       headerFields: nil)
        print("Redirecting HTTPS to URL: \(redirectURL)")
        loadingRequest.redirect = redirectRequest
        loadingRequest.response = response
        loadingRequest.finishLoading()

        // If successful, reset the retry counter
        if retryDictionary[urlString] == maxRetries {
            retryDictionary.removeValue(forKey: urlString)
        }
        return true
    }
}
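For completeness, this is how the delegate gets attached to the asset (the stream URL and queue label are placeholders; which requests actually reach the delegate depends on the URL scheme the player is given):

import AVFoundation

let streamURL = URL(string: "https://example.com/master.m3u8")! // placeholder
let asset = AVURLAsset(url: streamURL)
let resourceLoaderDelegate = APLCustomAVARLDelegate()

// Delegate callbacks arrive on this queue; a dedicated serial queue is typical.
asset.resourceLoader.setDelegate(resourceLoaderDelegate, queue: DispatchQueue(label: "resource-loader"))

let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))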
Hello! I am trying to determine the best approach with AVPlayer for implementing auto-play, that is, playback that automatically starts without user initiation. Ideally this would work for both local and streaming audio.
My current approach is to use KVO and start playback when the AVPlayerItem's status becomes readyToPlay, but I was wondering whether there is a better property or state to use, or, alternatively, whether this use case is already handled when automaticallyWaitsToMinimizeStalling is true, so that I could simply write:
player.replaceCurrentItem(with: AVPlayerItem(url: streamingUrl))
player.rate = 1
or
let playerItem = AVPlayerItem(url: streamingUrl)
player = AVPlayer(playerItem: playerItem)
player.rate = 1
and expect the item to be auto-played when ready.
In the context of user-initiated playback, I've typically seen code that makes a button's enabled state contingent on player.currentItem.duration, e.g. in AVFoundationSimplePlayer-iOS. On the other hand, AVAutoWait, which utilizes automaticallyWaitsToMinimizeStalling, does not seem to do this.
As a side note, I am not using an AVQueuePlayer.
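For reference, a minimal sketch of the KVO-based approach described above (the stream URL is a placeholder; the block-based observe API is just one way to do KVO):

import AVFoundation

let player = AVPlayer()
let item = AVPlayerItem(url: URL(string: "https://example.com/stream.m3u8")!) // placeholder URL

// Start playback automatically once the item reports it is ready to play.
// Keep statusObservation alive for as long as the observation is needed.
let statusObservation = item.observe(\.status, options: [.initial, .new]) { item, _ in
    if item.status == .readyToPlay {
        player.play()
    }
}

player.replaceCurrentItem(with: item)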
Topic:
Media Technologies
SubTopic:
Streaming
We noticed that the expiration behaviour of FairPlay licenses changed from iOS 16.x to some iOS 17 version, the latest iOS 18.3.2, and Safari.
On iOS 16.x, video playback stops when the license expires, but on iOS 17.x and later the video continues with no audio and no error fired.
On the latest Safari, both video and audio continue.
Were there any changes in the latest FairPlay, and how should we adapt to this on the license server side?
Thanks
It has been quite some time since I requested the Apple FPS package, yet I haven't received it. I haven't received any email either. Is there a developer support inquiry center where I can check the status of the process? Alternatively, could you share approximately how long it took for you to receive a response email?
Topic:
Media Technologies
SubTopic:
Streaming
Tags:
Accounts
FairPlay Streaming
Video
HTTP Live Streaming
In our logging tools (Firebase) I see a lot of errors reported when users are playing content and the app transitions to the background. An AVPlayerItemFailedToPlayToEndTime notification is fired with an error containing codes like -1102 and 1852797029, which seem to correspond to NSURLErrorNoPermissionsToReadFile and kCMIOHardwareIllegalOperationError respectively. To me, it looks like these might have something to do with caching logic.
The items being played are HLS streams, and we make use of AVAssetDownloadTask to make any streamed content available offline. Our setup is similar to the sample provided here: https://developer.apple.com/documentation/avfoundation/using-avfoundation-to-play-and-persist-http-live-streams. Whenever an item is selected for playback, the app checks whether a cached version is available and, if so, gets the URL to the stored file like the "localAssetForStream()" method in the example, or gets the asset from a currently running AVAssetDownloadTask for the item, or else starts a new AVAssetDownloadTask and returns an AVAsset from that task to play.
This seems to work fine, and I can't reproduce the issues our users and our logging tools are reporting.
Is there some case I am missing where AVAssetDownloadTask and associated AVAssets might become unreadable when the app transitions to the background? Or do these errors indicate a different problem entirely?
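For reference, the notification and its underlying error can be observed like this (a minimal sketch, not our production logging code):

import AVFoundation

// Surface the underlying error carried by the failed-to-play notification.
NotificationCenter.default.addObserver(
    forName: AVPlayerItem.failedToPlayToEndTimeNotification,
    object: nil,
    queue: .main
) { notification in
    if let error = notification.userInfo?[AVPlayerItemFailedToPlayToEndTimeErrorKey] as? NSError {
        // Codes like -1102 (NSURLErrorNoPermissionsToReadFile) appear here.
        print("Failed to play to end: \(error.domain) \(error.code)")
    }
}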
Topic:
Media Technologies
SubTopic:
Streaming
A few months ago, I had the opportunity to receive a 2018 iMac, and I’ve been using it to create content for my social media. I was truly impressed by the power of its processors. Even with this older model, I’ve been able to grow my presence online—something I couldn’t achieve with newer computers from other brands that I previously purchased.
I would love to become a promoter of your brand in the gaming world. All I ask for is technological support with more recent equipment and a minimal payment for collaborating with you. I am genuinely interested in being part of your company and leveraging the potential and reputation of Apple to reach even greater heights.
Topic:
Media Technologies
SubTopic:
Streaming
Tags:
GameplayKit
External Graphics Processors
Developer Tools
I am playing FairPlay + Multi-Key content (fMP4) in Safari browser.
I want to distinguish between SD and HD video quality and play the content in HD if HDCP is supported, and in SD if HDCP is not supported.
I have already confirmed that HDCP support is the default, and that a black screen is output in non-HDCP environments.
What I want is to improve the user experience by appropriately switching to SD/HD depending on HDCP support when playing DRM content.
Question: Is there an API or function that can detect HDCP support in Safari through JavaScript or other methods? Or is there a way to indirectly guess it?
Topic:
Media Technologies
SubTopic:
Streaming
Tags:
FairPlay Streaming
WebKit
Safari
HTTP Live Streaming
I’m building a professional camera app where users can customize the video recording format and color grading. In the func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) method, I handle video frames and use Metal for real-time color grading. This works well when device.activeColorSpace is sRGB or P3, and the results are great. However, when the color space is HLG_BT2020 or appleLog, the MTKTextureLoader.newTexture(cgImage: cgImage, options: options) method throws an error. After researching, I found that the video frame in these color spaces has a bit-per-channel (bpc) greater than 8 after being converted to CGImage, causing the texture creation to fail. I tried converting the CGImage to a lower bpc to successfully create the texture, but the final output image is garbled and not as expected. Is there a solution to this issue?
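If it helps frame the question: one alternative I'm aware of is to skip the CGImage round trip entirely and wrap the CVPixelBuffer planes in Metal textures via CVMetalTextureCache, which preserves data wider than 8 bits per channel; a minimal sketch (the plane formats depend on the actual pixel format, so treat them as assumptions):

import CoreVideo
import Metal

// Sketch: wrap a CVPixelBuffer plane directly in an MTLTexture, avoiding the
// 8-bit CGImage round trip. Plane count and texture formats depend on the
// buffer's pixel format (e.g. 10-bit biplanar YCbCr), so they are assumptions here.
final class PixelBufferTextureCache {
    private var textureCache: CVMetalTextureCache?

    init?(device: MTLDevice) {
        guard CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache) == kCVReturnSuccess else {
            return nil
        }
    }

    func texture(from pixelBuffer: CVPixelBuffer, plane: Int, format: MTLPixelFormat) -> MTLTexture? {
        guard let textureCache else { return nil }
        let width = CVPixelBufferGetWidthOfPlane(pixelBuffer, plane)
        let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, plane)
        var cvTexture: CVMetalTexture?
        CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache, pixelBuffer,
                                                  nil, format, width, height, plane, &cvTexture)
        return cvTexture.flatMap { CVMetalTextureGetTexture($0) }
    }
}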
We are encountering an issue where AVPlayer throws the error:
Error Domain=AVFoundationErrorDomain Code=-11819 "Cannot Complete Action" > Underlying Error Domain[(null)]:Code[0]:Desc[(null)]
This error seems to occur intermittently during video playback, especially after extended usage or when switching between different streams. We observe error -11819 (AVFoundationErrorDomain) in the Conviva platform; some of our users experience it, but we couldn't reproduce it so far, and we need support to determine the root cause and/or best practices to prevent it.
Some questions we have:
What typically triggers this error?
Could it be related to memory/resource constraints, network instability, or backgrounding?
Are there any recommended ways to handle or recover from this error gracefully?
Any insights or guidance would be greatly appreciated. Thanks!
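For context, one recovery pattern we are aware of (not yet confirmed to be related to -11819) is to rebuild the player when the system's media services are reset, since existing player objects become unusable at that point; a minimal sketch with a placeholder rebuild function:

import AVFoundation

func rebuildPlayer() {
    // Placeholder: recreate the AVPlayer/AVPlayerItem and restore playback state here.
}

// Rebuild playback objects if the system media services daemon is reset.
NotificationCenter.default.addObserver(
    forName: AVAudioSession.mediaServicesWereResetNotification,
    object: AVAudioSession.sharedInstance(),
    queue: .main
) { _ in
    rebuildPlayer()
}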
Topic:
Media Technologies
SubTopic:
Streaming