Hey there,
We've been seeing a high rate of 403 Invalid Authentication errors on the v1/me/library/artists endpoint for the past few days.
Is anyone else seeing the same issue?
Streaming
Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.
We have a Low-Latency HLS stream, and on iOS 26, even though the available bandwidth is sufficient, AVPlayer (used with AVPlayerViewController) still selects a low-bandwidth rendition (e.g., RESOLUTION=640x360) for playback instead of a higher-bandwidth one (e.g., RESOLUTION=1920x1080).
This works fine on iOS 18 and earlier versions. What could be the solution to this issue?
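For anyone debugging the same thing, here is a minimal diagnostic sketch of how to check what the player actually picked (standard AVFoundation API; the URL is a placeholder). Note that preferredPeakBitRate can only cap, not raise, the selected variant, so this is diagnostic rather than a fix:

import AVFoundation

let item = AVPlayerItem(url: URL(string: "https://example.com/llhls/master.m3u8")!)
// 0 is the default "no cap"; setting it explicitly confirms no cap comes from elsewhere.
item.preferredPeakBitRate = 0
item.preferredMaximumResolution = .zero
let player = AVPlayer(playerItem: item)
player.play()

// After playback has run for a while, inspect the access log to see
// which bitrates the player switched through.
if let log = item.accessLog() {
    for event in log.events {
        print("indicated:", event.indicatedBitrate, "observed:", event.observedBitrate)
    }
}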
Hi, I submitted the FairPlay Streaming Credentials Approval request, but it's been 15 days and I haven't received a response yet. Do you happen to know how long they usually take to reply to these requests?
Hello,
I'm investigating an issue with LL-HLS playback using AVPlayer, specifically during DVR Live seeking (seeking to a past time).
I noticed that in certain seeking scenarios, AVPlayer sends a Blocking Playlist Reload request that includes the _HLS_msn parameter but is missing the _HLS_part parameter.
While I understand this is compliant with the HLS spec, I would like to know the specific criteria AVPlayer uses to decide when to drop the _HLS_part parameter. Does AVPlayer intentionally omit the part info when it determines that loading a specific partial segment is unnecessary during a seek operation?
Clarification on this behavior would help us greatly in debugging our stream delivery.
Thanks in advance.
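For illustration, these are the two request shapes we see (paths and sequence numbers invented):

# Both parameters present: block until part 2 of media sequence 1234 exists
GET /media.m3u8?_HLS_msn=1234&_HLS_part=2

# _HLS_part omitted, as observed during DVR seeks: block until segment 1234 is complete
GET /media.m3u8?_HLS_msn=1234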
I am working on a screen-record function on Apple Vision Pro. When I use a broadcast upload extension and tap the record button, the Xcode console shows the exception:
<<<< FigAudioSession(AV) >>>> audioSessionAVAudioSession_CopyMXSessionProperty signalled err=-19224 (kFigAudioSessionError_UnsupportedOperation) (getMXSessionProperty unsupported) at FigAudioSession_AVAudioSession.m:606
We create and configure the project as follows:
Create an Apple Vision Pro project.
Create a Broadcast Upload Extension target.
Add an App Group for the project target and the extension target; both use the same identifier.
Add the "Main Camera Access" and "Passthrough in Screen Capture" capabilities for all targets.
Add "NSScreenCaptureUsageDescription" and "NSMicrophoneUsageDescription" to the Info.plist.
Add a record button in the view (see the sketch after this list).
Run a debug build on an Apple Vision Pro device; after tapping the record button, the exception above is thrown.
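For completeness, the record button is just a system broadcast picker; a minimal sketch, assuming RPSystemBroadcastPickerView behaves on visionOS as it does on iOS (the extension bundle identifier is a placeholder):

import ReplayKit
import UIKit

func makeBroadcastButton() -> RPSystemBroadcastPickerView {
    let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
    // Placeholder: bundle identifier of the Broadcast Upload Extension target.
    picker.preferredExtension = "com.example.app.BroadcastExtension"
    picker.showsMicrophoneButton = true
    return picker
}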
Hi
Is it possible to have a playlist where the stream starts in the clear, then someone starts a DRM-encrypted period, and then someone turns it off again?
Can I just do the following? (I've removed the video segment lines; I'm only interested in the parts where I want to signal the new DRM region.)
#EXT-X-MAP:URI="video_2_10000000_t17586401730000000_init.mp4"
#EXT-X-KEY:METHOD=NONE
...
#EXT-X-MAP:URI="video_2_10000000_t17587374640000000_init.mp4"
#EXT-X-KEY:METHOD=SAMPLE-AES,URI="skd://5df0b36ac4bb4d0ff954a73b502ac332",KEYFORMAT="com.apple.streamingkeydelivery",KEYFORMATVERSIONS="1"
...
#EXT-X-MAP:URI="video_2_10000000_t17587376740000000_init.mp4?"
#EXT-X-KEY:METHOD=NONE
Should I insert discontinuity tags or something else?
Right now, what I can observe is that I get some audio drops when I try this.
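For reference, here is a sketch of the same playlist with each encryption change marked as a discontinuity (URIs reused from above; whether this is required, or whether it removes the audio drops, is exactly what I'm asking):

#EXT-X-KEY:METHOD=NONE
#EXT-X-MAP:URI="video_2_10000000_t17586401730000000_init.mp4"
...
#EXT-X-DISCONTINUITY
#EXT-X-KEY:METHOD=SAMPLE-AES,URI="skd://5df0b36ac4bb4d0ff954a73b502ac332",KEYFORMAT="com.apple.streamingkeydelivery",KEYFORMATVERSIONS="1"
#EXT-X-MAP:URI="video_2_10000000_t17587374640000000_init.mp4"
...
#EXT-X-DISCONTINUITY
#EXT-X-KEY:METHOD=NONE
#EXT-X-MAP:URI="video_2_10000000_t17587376740000000_init.mp4"
...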
Our license service is based on version 4.5.4, and we use the sample .c/.h files to build the license service.
We are told that version 4.5.4 is going to be deprecated in 2026 and that we should migrate to the latest SDK, version 26.
When we explored the SDK, we noticed that only Python- and Swift-based SDKs are provided.
Does Apple also provide a C/C++-based SDK? That would be much easier for us to integrate.
If yes, please share the SDK package and a sample license service solution.
The same H.265 FairPlay-encrypted content can be played on all Apple devices except the A1625.
Clear (unencrypted) H.265 content does play on the A1625.
The question is: will this model (A1625) support H.265 FairPlay-encrypted content?
A ticket was created here:
https://discussions.apple.com/thread/255658006?sortBy=best
I use AVPlayer to play HLS video successfully on macOS Sonoma, but I encountered this error on macOS Sequoia. Please help me:
Error Domain=AVFoundationErrorDomain Code=-11833 ‘Cannot Decode’ UserInfo={NSUnderlyingError=0x600001e57330 {Error Domain=CoreMediaErrorDomain Code=-12906 ‘(null)’}, NSLocalizedFailureReason=The decoder required for this media cannot be found., AVErrorMediaTypeKey=vide, NSLocalizedDescription=Cannot Decode}
Thanks!
We are having issues with ScreenCaptureKit. Our use case is the following:
We have multiple applications that each start a stream capture using ScreenCaptureKit. It works fine when just one application is streaming, but when multiple streams are started in quick succession, all streams stop or crash without ScreenCaptureKit reporting an error back.
Restarting replayd for the user lets us start streaming again, provided the streaming applications are restarted too.
We have built a small test program that we have run on different Macs, on different versions of macOS, with identical results. The test program just calls getShareableContentExcludingDesktopWindows multiple times, since this was the simplest way to reproduce the problem.
Our test setup is the following:
Mac Mini M2 macOS 13.6
MacBook Pro M3 macOS 14.4
MacBook Pro M3 macOS 15.1
Code main.m
#import <Foundation/Foundation.h>
#import <ScreenCaptureKit/ScreenCaptureKit.h>
@interface Runner : NSObject
@property(atomic, assign) BOOL keepRunning;
@property(atomic, retain) SCShareableContent* availableWindows;
-(void) updateAvailableWindows;
-(void) notif:(NSNotification*) aNotif;
@end
int main(int argc, const char * argv[]) {
    @autoreleasepool {
        NSRunLoop* loo = [NSRunLoop mainRunLoop];
        Runner* r = [[Runner alloc] init];
        [r updateAvailableWindows];
        // Spin the main run loop until a request fails.
        while (r.keepRunning)
            [loo runUntilDate:[NSDate distantFuture]];
        NSLog(@"Program exit");
    }
    return 0;
}
@implementation Runner
-(instancetype) init
{
self = [super init];
if(self){
_keepRunning = YES;
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(notif:) name:@"windowWasNotFound" object:nil];
[[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(notif:) name:@"windowWasFound" object:nil];
}
return self;
}
-(void) updateAvailableWindows
{
    @autoreleasepool {
        NSLog(@"begin");
        // Each call round-trips through replayd; once replayd hangs,
        // the completion handler below is never invoked again.
        [SCShareableContent getShareableContentExcludingDesktopWindows:NO onScreenWindowsOnly:YES completionHandler:^(SCShareableContent* content, NSError* error){
            NSLog(@"running");
            if(!error){
                self.availableWindows = content;
                // Briefly hold the content while iterating, to mimic real use.
                for (__unused SCWindow* aWin in self.availableWindows.windows) {
                    [NSThread sleepForTimeInterval:0.01];
                }
                [[NSNotificationCenter defaultCenter] postNotificationName:@"windowWasFound" object:self.availableWindows];
            }
            else{
                [[NSNotificationCenter defaultCenter] postNotificationName:@"windowWasNotFound" object:nil];
            }
        }];
    }
}
-(void) notif:(NSNotification*) aNotif
{
    if(aNotif.object)
        // Content was delivered: immediately schedule the next enumeration.
        [self performSelectorOnMainThread:@selector(updateAvailableWindows) withObject:nil waitUntilDone:NO];
    else{
        NSLog(@"Not Found");
        self.keepRunning = NO;
    }
}
@end
How to replicate:
Compile and run the program from multiple terminal windows (the terminal must be granted screen recording permission) and notice that the output stops when replayd stops responding (our assumption). Restarting the application does nothing until replayd is also restarted.
Running the application in Xcode gives the following error:
[ERROR] -[RPDaemonProxy fetchShareableContentWithOption:windowID:currentProcess:withCompletionHandler:]_block_invoke:902 error: 4097
This error is not something we have been able to detect in our applications, and since the only workaround is restarting replayd and the applications, being able to catch the error would help us.
Hello!
Just curious: does AVAssetWriter append() randomly fail for anyone but me? It seems to happen when live streaming (encoding with VTCompressionSession) and recording (with AVAssetWriter) at the same time.
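For concreteness, a minimal sketch of the pattern in question, with writer and writerInput standing in for our real AVAssetWriter/AVAssetWriterInput:

import AVFoundation

func appendIfPossible(_ sampleBuffer: CMSampleBuffer,
                      to writerInput: AVAssetWriterInput,
                      of writer: AVAssetWriter) {
    // Appending while the input is not ready is one common cause of failures
    // when another consumer (e.g. a VTCompressionSession) shares the pipeline.
    guard writerInput.isReadyForMoreMediaData else { return }
    if !writerInput.append(sampleBuffer) {
        // status/error explain why append returned false instead of it looking random.
        print("append failed, status: \(writer.status.rawValue), error: \(String(describing: writer.error))")
    }
}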
No external cameras show up in the app on visionOS. We use this sample code as a basis for our tests: https://developer.apple.com/documentation/visionos/displaying-video-from-connected-devices
We also received the required entitlement from Apple, but none of the cameras we have tried so far shows up on visionOS.
We tried the following devices and hubs:
Insta360 X4
Somikon Endoscope Camera: USB HD Endoscope Camera
EMEET Full HD Webcam - C960
BENFEI Video/Audio Capture Card, 4K HDMI auf USB C/A
Logitech C920 HD PRO Webcam
Anker PowerConf C200
Insta360 GO 3S
Anker 341 USB-C Hub
UGREEN Revodok Pro 10Gbps USB-C Hub
All Vision Pro devices we tried run visionOS 2.3. When trying the same code on an iPad, we can actually use external cameras.
Steps to reproduce:
Start the app on a Vision Pro device and connect an external camera. The connected camera does not show up in the dropdown.
Development environment:
Xcode 16.2, macOS 15.3
Run-time configuration:
iOS 18.3, visionOS 2.3
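For reference, the discovery call at the heart of the sample, reduced to a sketch (not verbatim from Apple's sample code):

import AVFoundation

// External (e.g. USB-C) cameras surface through the .external device type.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external],
    mediaType: .video,
    position: .unspecified)
print("external cameras:", discovery.devices.map(\.localizedName))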
I would like to look at AVMetricEvent data during video playback, so I have added this code to a test video player app:
let playerItem: AVPlayerItem = ...
let allMetrics = playerItem.allMetrics()

Task {
    print("metrics task started")
    do {
        for try await metricEvent in allMetrics {
            print("metric event: \(metricEvent.description)")
        }
    } catch {
        print("unexpected metric iterator error \(error)")
    }
}
Running this in Simulator on iPhone 16 Pro (18.0) does not result in any "metric event" diagnostic messages being printed when the video associated with this AVPlayerItem is playing. Only the "metric task started" diagnostic message is seen.
What am I doing wrong that prevents metric data being received?
FairPlay-Protected HLS Files Not Transferred via Quick Start
I have an iOS app that downloads HLS files, which are protected by FairPlay. These files are stored locally, and their locations are managed using Core Data. When playing these tracks, I use AVURLAsset to access the stored file paths.
Recently, a client upgraded to a new iPhone and used Quick Start to transfer data from his old device. While all other app data was successfully transferred, including Core Data records and UserDefaults, the actual HLS files were missing. As a result, the app retained metadata about the downloaded content, but the files themselves were gone, causing playback failures.
Does Quick Start exclude certain types of locally stored files, especially DRM-protected HLS downloads, or is the issue related to how FairPlay-protected content is handled during the transfer of locally stored files?
The app registers a periodic time observer on the AVPlayer when playback starts, and it works fine. When switching to AirPlay during playback, the periodic time observation continues to work as expected.
However, when switching back to local playback, the periodic time observer no longer fires until a seek is performed. The app removes the periodic time observer only when playback stops.
I can see that when switching back to local playback, timeControlStatus successively changes:
to .waitingToPlayAtSpecifiedRate (reason: .evaluatingBufferingRate),
then to .waitingToPlayAtSpecifiedRate (reason: .toMinimizeStalls),
and finally to .playing.
But the time observation does not work anymore.
Also, the issue occurs systematically with Live and VOD streams that provide a program date (via the HLS tag #EXT-X-PROGRAM-DATE-TIME), with or without DRM, and is never reproduced with other VOD streams.
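Our current workaround idea is sketched below: drop and re-register the observer whenever timeControlStatus returns to .playing (the 0.5 s interval is only an example). We'd prefer to understand the root cause instead:

import AVFoundation
import Combine

final class TimeObservation {
    private let player: AVPlayer
    private var timeObserver: Any?
    private var cancellable: AnyCancellable?

    init(player: AVPlayer) {
        self.player = player
        register()
        // Re-register when playback resumes, in case the previous observer
        // went silent across the AirPlay-to-local route change.
        cancellable = player.publisher(for: \.timeControlStatus)
            .filter { $0 == .playing }
            .sink { [weak self] _ in self?.register() }
    }

    private func register() {
        if let observer = timeObserver { player.removeTimeObserver(observer) }
        timeObserver = player.addPeriodicTimeObserver(
            forInterval: CMTime(seconds: 0.5, preferredTimescale: 600),
            queue: .main) { time in
                print("periodic time:", time.seconds)
            }
    }

    deinit {
        if let observer = timeObserver { player.removeTimeObserver(observer) }
    }
}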
Hi everyone! I’ve been working with AVFoundation and trying to use the AVMetricEventStreamPublisher to discover media performance metrics, as described in the Apple documentation.
https://developer.apple.com/cn/videos/play/wwdc2024/10113/?time=508
However, when following the example code, I’m not getting the expected results. The performance metrics for both audio and video don’t seem to be captured properly.
Has anyone successfully used this example code? If so, could you share your experience or any solutions you’ve found? Any tips or insights would be greatly appreciated. Thanks in advance!
P.S. The example code:
AVPlayerItem *item = ...
AVMetricEventStream *eventStream = [AVMetricEventStream eventStream];
// MyMetricSubscriber must adopt the event stream's subscriber protocol.
id subscriber = [[MyMetricSubscriber alloc] init];
[eventStream setSubscriber:subscriber queue:mySerialQueue];
[eventStream subscribeToMetricEvent:[AVMetricPlayerItemLikelyToKeepUpEvent class]];
[eventStream subscribeToMetricEvent:[AVMetricPlayerItemPlaybackSummaryEvent class]];
[eventStream addPublisher:item];
The problem: when using an HLS live stream with AVPlayer on iOS/tvOS, the player first chooses the highest bandwidth, then slowly steps down to the lowest (within 1-3 minutes), eventually steps up again, and then repeats the step-down.
The AVPlayer error log sends these events:
errorStatusCode: -12888, errorDomain: Optional("CoreMediaErrorDomain"), errorComment: Optional("The operation couldn't be completed. (CoreMediaErrorDomain error -12888 - Playlist File unchanged for longer than 1.5 * target duration
We use standard segments in CMAF format, with 2-second duration:
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:147065903
#EXT-X-MAP:URI="video_1_4660000_init.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2"
#EXT-X-PROGRAM-DATE-TIME:2025-04-30T12:51:07
#EXTINF:2.000,
video_1_4660000_t17460174670001555.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2
#EXTINF:2.000,
video_1_4660000_t17460174690001555.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2
#EXTINF:2.000,
video_1_4660000_t17460174710001555.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2
When using 6-second segments the player stays stable at the highest bandwidth, which matches the error text: with a 2-second target duration the playlist must visibly change at least every 3 seconds (1.5 × 2 s), while 6-second segments allow 9 seconds.
Is there a way to avoid this error, either in AVPlayer or in the HLS configuration?
TL;DR: How do I resolve a possible race between the EXT-X-SESSION-KEY request and the encrypted media segment requests?
I'm having trouble using a custom AVAssetResourceLoaderDelegate with a video manifest containing a Video Protection Key (VPK). My master manifest contains the rendition manifest URL and the VPK URL. When not using the custom resource loader delegate, everything works fine.
My custom resource loader delegate works by first appending a prefix to the scheme of the master manifest URL before creating the asset. When handling the master manifest, it puts back the original scheme, makes the request, and then appends the same prefix to the scheme of the rendition manifest URL in the response content, so that the rendition manifest request also goes through the custom resource loader delegate. The same goes for the VPK request. The AES-128 key is stored in memory inside the custom resource loader delegate object. So far so good.
The VPK is requested before the segment requests. But the problem comes when the media segment requests happen. The media segment URLs from the rendition manifest also go through the custom resource loader, and those segments are encrypted. I can see a segment request finish first, and then the related VPK request kicks in a few seconds later. The previous VPK value is cached in memory, so it is not the network causing the delay but some mechanism I'm not aware of.
So could anyone tell me the proper way to handle this situation? The native loader handles it well, so I just want to know how. Thanks in advance!
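To make the setup concrete, here is a stripped-down sketch of the scheme-swapping delegate described above (the scheme restoration and plain URLSession fetch are illustrative; the real implementation also rewrites child URLs and caches the key):

import AVFoundation

final class InterceptingLoader: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let url = loadingRequest.request.url,
              var components = URLComponents(url: url, resolvingAgainstBaseURL: false) else {
            return false
        }
        // Put back the original scheme before hitting the network.
        components.scheme = "https"
        guard let realURL = components.url else { return false }

        URLSession.shared.dataTask(with: realURL) { data, _, error in
            if let data = data {
                // For playlists, child URLs would be rewritten back to the
                // custom scheme here so they also route through this delegate.
                loadingRequest.dataRequest?.respond(with: data)
                loadingRequest.finishLoading()
            } else {
                loadingRequest.finishLoading(with: error)
            }
        }.resume()
        return true
    }
}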
Hi everyone,
We’re currently developing a music-based app using MusicKit, and we recently noticed that iOS 26 beta introduces a new “Automix” feature in the Apple Music app. This enables seamless DJ-style transitions between songs—beyond the standard crossfade functionality.
We’re trying to understand:
Will this Automix feature be accessible to third-party apps that use MusicKit?
If not available in the initial iOS 26 release, is there a plan to expose it through public APIs in a future update?
Is there any technical documentation, WWDC session, or roadmap info regarding Automix support via MusicKit?
This functionality would be a significant enhancement for our app, especially for intelligent audio transitions and curated playlists.
Thanks.
The documentation for the Apple Music API indicates that the genreNames field for a given artist (see https://developer.apple.com/documentation/applemusicapi/artists/attributes-data.dictionary) is an array of strings. However, it appears that only ONE SINGLE GENRE is returned per artist, regardless of how many genres might be attached to that artist's albums.
Am I missing something? Is there an artist where multiple genres may be returned, or is this a bug in the documentation?
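For illustration, this is the shape we see in every response so far (values invented): genreNames is typed as an array but only ever carries one entry:

{
  "id": "12345",
  "type": "artists",
  "attributes": {
    "name": "Some Artist",
    "genreNames": ["Pop"]
  }
}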