Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic


CIRAWFilter.outputImage first-time cost is huge (~3s), subsequent calls are ~3ms. Any official way to pre-initialize RAW pipeline (without taking a real photo)?
Hi Apple Developer Forums,

I’m developing an iOS camera app that processes RAW captures using Core Image. I’m seeing a large “first use” performance penalty specifically when creating the CIImage from CIRAWFilter.outputImage.

What’s slow (important detail)

I’m measuring the time for:

```swift
let rawFilter = CIRAWFilter(imageData: rawData, identifierHint: hint)
let ciImage = rawFilter.outputImage
```

This is not CIContext.render(...) / createCGImage(...). It’s just the time to access outputImage (i.e., building the Core Image graph / RAW pipeline setup).

Observed behavior

- First access of CIRAWFilter.outputImage: ~3 seconds
- Second access (same app session, similar RAW): ~3 milliseconds

So something heavy is happening only on first use (decoder initialization, pipeline setup, shader/library compilation, caching, etc.). Using Metal System Trace, I also noticed that during the slow first call there are many “Create MTLLibrary” events, while the second call doesn’t show this pattern.

Warm-up attempts using a bundled DNG

I tried to “warm up” early (e.g., on camera screen entry) by loading a bundled DNG and accessing CIRAWFilter.outputImage before taking a photo:

- Warm-up with a ~247 KB DNG → first real RAW outputImage cost drops to ~1.42 s
- Warm-up with a ~25 MB DNG → first real RAW outputImage cost drops to ~843 ms

This helps, but it’s still far from the steady-state ~3 ms.

Warm-up by capturing a real RAW (works, but concerns)

The only method that fully eliminates the delay is to trigger a real RAW capture programmatically before the user’s first photo, then use that captured rawData to warm up the CIRAWFilter.outputImage path. This brings the first user-facing capture close to the steady-state timing. However:

- In some regions the camera shutter sound cannot be suppressed, so a hidden warm-up capture is unacceptable UX.
- I’m also unsure whether triggering a real capture without an explicit user action could raise compliance/privacy concerns, even if the image is immediately discarded and never saved or uploaded.

Questions

1. Is the large first-time cost of CIRAWFilter.outputImage expected (RAW pipeline initialization / shader compilation)?
2. Is there an Apple-recommended way to pre-initialize the Core Image RAW pipeline / Metal resources so the first outputImage is fast, without taking a real photo?
3. Are there any best practices (e.g., CIContext creation timing, prepareRender(...), specific options) that reliably reduce this first-use overhead for CIRAWFilter?

Attachments

- Figure 1: First RAW capture with no warm-up (~3 s outputImage time)
- Figure 2: First RAW capture after warm-up with a bundled DNG (improved, but still hundreds of ms)

Thanks for any guidance or experience sharing!
Replies: 3 · Boosts: 2 · Views: 650 · Activity: Jan ’26

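A minimal sketch of the bundled-DNG warm-up this post describes, with two twists that may close more of the gap: the work runs off the main thread, and it forces a small real render through a shared CIContext instead of only touching outputImage, which may trigger more of the MTLLibrary compilation seen in the trace. The RAWPipelineWarmer type, the bundled Warmup.dng resource, and the tile size are illustrative assumptions, not Apple recommendations.

```swift
import Foundation
import CoreImage

enum RAWPipelineWarmer {
    // Reusing one CIContext for warm-up and for real captures gives compiled
    // Metal state (pipelines, caches) the best chance of being shared.
    static let context = CIContext()

    static func warmUp() {
        DispatchQueue.global(qos: .utility).async {
            guard let url = Bundle.main.url(forResource: "Warmup", withExtension: "dng"),
                  let data = try? Data(contentsOf: url),
                  let rawFilter = CIRAWFilter(imageData: data, identifierHint: nil),
                  let image = rawFilter.outputImage else { return }

            // Render one small tile off-screen; unlike merely touching
            // outputImage, this exercises the decode + shader path end to end.
            let tile = CGRect(x: 0, y: 0, width: 64, height: 64)
            _ = context.createCGImage(image, from: tile)
        }
    }
}
```

Whether a tiny render compiles the same shader variants as a full-size capture is exactly what would need measuring; it is a cheap experiment, not a guaranteed fix.
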
Unexpected artist names in song table
Hi team,

In the Apple Music Feed datasets, we've noticed some unexpected values in the song and album tables. The primaryartists column from either song or album may contain a "non-default" artist name, such as the katakana name shown in the example below:

```sql
select id, name, namedefault, primaryartists from amf_song where id = '1698723329'

id         | name                 | namedefault | primaryartists
-----------+----------------------+-------------+------------------------------------------
1698723329 | {default=California} | California  | [{id=1264818718, name=チャペル・ローン}]
```

```sql
select * from amf_artist where id = '1264818718'

id         | name                                         | namedefault   | namepronunciation
-----------+----------------------------------------------+---------------+--------------------
1264818718 | {default=Chappell Roan, ja=チャペル・ローン} | Chappell Roan | {ja=チャペルローン}
```

Shouldn't the primaryartists column show the namedefault instead of the Japanese-language version? When can we expect this bug to be resolved?

Thanks,
Replies: 0 · Boosts: 2 · Views: 514 · Activity: Dec ’25

PhotosPickerItem.itemIdentifier always nil
I'm using the SwiftUI Photos Picker to select videos from the user's Photos library and then opening the video using the PhotosPickerItem. I'm looking for a way to let the user open the same video on their other devices, as the app uses SwiftData and CloudKit to provide a recently watched list of videos.

The URL from the PhotosPickerItem appears to be device-specific, so I was hoping to use the itemIdentifier, together with the init that takes an itemIdentifier, to recreate the PhotosPickerItem on the other devices. The itemIdentifier, however, is always nil, so it can't be used this way.

Is there an alternative approach whereby the user can open a video using a PhotosPickerItem and that item would be viewable on their other devices, via an item identifier or a URL that is device-agnostic? This approach should not involve copying the video into other storage, as that would simply expand the use of the user's iCloud storage, providing a less-than-ideal user experience. If the user has opened a video from their Photos library, there should be a way to allow the same user (e.g., same Apple ID) to use the same app on another device to open that video again.
Replies: 1 · Boosts: 0 · Views: 687 · Activity: Nov ’25

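For what it's worth, itemIdentifier is only populated when the picker is created with an explicit photo library, and the resulting local identifier can then be converted to a PHCloudIdentifier, which is designed to be stable across a user's devices. A minimal sketch, assuming photo library read authorization is acceptable for the app (the view and its property names are illustrative):

```swift
import SwiftUI
import PhotosUI
import Photos

struct VideoPickerView: View {
    @State private var selection: PhotosPickerItem?

    var body: some View {
        // photoLibrary: .shared() is what makes itemIdentifier non-nil,
        // at the cost of requiring photo library authorization.
        PhotosPicker("Choose video", selection: $selection,
                     matching: .videos, photoLibrary: .shared())
            .onChange(of: selection) { _, newItem in
                guard let localID = newItem?.itemIdentifier else { return }
                // Map the device-local identifier to a cross-device one.
                let mappings = PHPhotoLibrary.shared()
                    .cloudIdentifierMappings(forLocalIdentifiers: [localID])
                if case .success(let cloudID)? = mappings[localID] {
                    // Persist cloudID.stringValue via SwiftData/CloudKit.
                    print("Cross-device identifier:", cloudID.stringValue)
                }
            }
    }
}
```

On another device, localIdentifierMappings(for:) converts the stored string (via PHCloudIdentifier(stringValue:)) back to a local identifier suitable for PHAsset.fetchAssets(withLocalIdentifiers:options:), without copying the video anywhere.
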
PDF Page Content Swapping on iOS 26
Dear Apple Developer Team,

On iOS 26, the contents of PDF pages appear to be swapped. Could you please advise if there is a workaround or a planned fix for this issue?

Steps to Reproduce:
1. Download the attached PDF on iOS 26.
2. Open the PDF in the Files app.
3. Tap the PDF to view it in Quick Look.
4. Navigate to page 5.

Expected Result: The page number displayed at the bottom should be 5.
Actual Result: The page number displayed at the bottom is 4.

Issue: This is not limited to page 5; multiple page contents appear to be swapped. I have also submitted feedback via Feedback Assistant (FB20743531) on October 20.

Best regards,
Yoshihito Suezawa
Replies: 3 · Boosts: 0 · Views: 420 · Activity: Nov ’25

MusicKit for Android reports an error when playing stations
While using the MusicKit SDK for Android 1.1.2, I found that MediaContainerType defines only three container types: none (0), album (1), and playlist (2). No RADIO_STATION type is defined. However, the documentation for com.apple.android.music.playback.model states that the RADIO_STATION type is supported. As a result, passing in a stations ID causes an error:

MediaSessionManager com.apple.android.music.sdk.testapp D onPlaybackError() Quincy java.io.IOException

How can this be resolved?
Replies: 1 · Boosts: 1 · Views: 86 · Activity: May ’25

Background Upload Extension
Hello,

We are trying to use the new Background Upload Extension to improve uploads of assets (photos, Live Photos, videos) in the background in our application.

1 - We implemented the code to upload still images. We would like to do the same for Live Photos, but we are not sure how to proceed, since a Live Photo is composed of two resources that we would like to upload in the same job so that we keep the association between them. Steps: A Live Photo is captured on the device. Our application is notified of new content in the Photo Library, and the asset is queued for upload by our application. The system calls the background upload extension and the Live Photo is prepared for upload, but we can schedule a job for only one resource (still photo or paired video), so we cannot upload both resources in a single job.

2 - Is there a way to synchronize the application and the extension so that the application does not process the same data or execute the same requests as the extension? For example, our application works with tokens, and we would like to prevent those tokens from being consumed by the application and the extension at the same time.

3 - Is there a command in Xcode or Terminal to start or stop the extension, similar to what exists for processing tasks (https://developer.apple.com/documentation/backgroundtasks/starting-and-terminating-tasks-during-development)?
Replies: 3 · Boosts: 1 · Views: 640 · Activity: Feb ’26

Uploading PhotoKit Resources in the Background
The introduction of PHBackgroundResourceUploadExtension is a welcome addition in iOS 26.1. I wonder, however, how to attach a debugger and actually get the system to call the process() method of the extension. I tried to run the extension both inside the Photos app (and also the main app, for testing), but when I take a photo or add photos to the library (saving), the process() method never gets called. Any hints on debugging PHBackgroundResourceUploadExtension during development would be appreciated.
Replies: 0 · Boosts: 1 · Views: 268 · Activity: Nov ’25

LockedCameraCaptureExtension and Sharing User Preferences
My main app saves preferences to UserDefaults.standard, including one preference the user is able to toggle, isRawOn:

```swift
UserDefaults.standard.set(self.isRawOn, forKey: "isRawOn")
```

Now, I have a LockedCameraCaptureExtension that needs to know whether that setting is on or off at launch. Also, if it's toggled within the extension, the main app should know about it on the next launch. The main app and the extension run in separate containers, and their preferences are not shared, for privacy reasons. Apple mentions using the appContext of CameraCaptureIntent, but I'm not sure how the above scenario is possible through that... unless I am missing something. Apple Reference

What I have for CameraCaptureIntent:

```swift
@available(iOS 18, *)
struct LaunchMyAppControlIntent: CameraCaptureIntent {
    typealias AppContext = MyAppContext

    static let title: LocalizedStringResource = "LaunchMyAppControlIntent"
    static let description = IntentDescription("Capture photos with MyApp.")

    @MainActor
    func perform() async throws -> some IntentResult {
        .result()
    }
}
```
Replies: 1 · Boosts: 0 · Views: 597 · Activity: Nov ’25

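A minimal sketch of how the appContext route could cover this scenario, assuming iOS 18's CameraCaptureIntent context behaves as documented: the system stores a small Codable payload for the intent that both the main app and the extension can read and update. Only appContext and updateAppContext(_:) are framework API here; MyAppContext and the helper are your own types.

```swift
import AppIntents

// The payload shared through the intent; extend it with whatever
// capture preferences the extension needs at launch.
struct MyAppContext: Codable, Sendable {
    var isRawOn: Bool = false
}

enum RawPreference {
    // Read at launch (app or extension) instead of UserDefaults.standard.
    static func load() async -> MyAppContext {
        if let context = try? await LaunchMyAppControlIntent.appContext, let context {
            return context
        }
        return MyAppContext() // nothing stored yet
    }

    // Write whenever the toggle changes, from either the app or the extension,
    // so the other side sees the new value on its next launch.
    static func save(isRawOn: Bool) async {
        var context = await load()
        context.isRawOn = isRawOn
        try? await LaunchMyAppControlIntent.updateAppContext(context)
    }
}
```
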
_MediaPlayer_AppIntents compilation error for iOS 26
Getting an interesting error attempting to compile my app in Xcode 26 beta:

```
error: Unable to find module dependency: '_MediaPlayer_AppIntents' (in target 'icatcher' from project 'icatcher')
note: A dependency of main module 'MainModuleCrossImportOverlays' (in target 'icatcher' from project 'icatcher')
Unable to find module dependency: '_MediaPlayer_AppIntents'
```

Not sure what to try to fix this issue.
Replies: 1 · Boosts: 1 · Views: 156 · Activity: Jun ’25

Control system video effects support for CMIO extension
We're distributing a virtual camera with our app that does not profit in the slightest from automatically applied system video effects, whether on the video going in (physical camera device) or going out (virtual camera device). I'm aware of setting NSCameraReactionEffectGesturesEnabledDefault in Info.plist and of determining active video effects via the AVCaptureDevice API. Those are obviously crutches, because having to tell users to go looking and clicking around in menu bar apps is the opposite of a great UX.

To make our product's video output more deterministic, I'm looking for a way to tell the CMIO subsystem that our virtual camera does not support any of the system video effects. I see properties like AVCaptureDevice.Format.isPortraitEffectSupported and AVCaptureDevice.Format.isStudioLightSupported, whose documentation refers to the format's ability to support these effects. Since we're setting a CMFormatDescription via CMIOExtensionStreamSource.formats, I was hoping to find something in the extension APIs, but haven't been successful so far.

Can this be done?
Replies: 2 · Boosts: 0 · Views: 303 · Activity: Nov ’25

AVPlayer HLS High Bitrate Problem on Apple TV HD (A1625) tvOS 26
Hello,

We have a video streaming app with HLS VOD content, supplying 1080p and 4K content to users. Users were able to watch 1080p content before tvOS 26, but they can no longer watch it after updating to tvOS 26. We have not changed anything on the HLS playlist side, and the application version is the same. The problem occurs only on the Apple TV HD (4th generation, A1625) on tvOS 26; there is no problem with newer Apple TV devices. Could you help us resolve this problem?

Thanks in advance
Replies: 2 · Boosts: 1 · Views: 632 · Activity: Oct ’25

Inquiry regarding CoreMediaErrorDomain Code=-12880 in LL-HLS playback
Hello,

I am developing a custom player SDK based on AVPlayer. While testing LL-HLS streams, I intermittently encounter the following error:

```
Error Domain=CoreMediaErrorDomain Code=-12880
```

Since I cannot find documentation for this specific code, could you please clarify its meaning? Specifically, I would like to know if this is a critical error that disrupts playback, or if it is just a warning that can be safely ignored. Any insights would be appreciated. Thank you.
Replies: 1 · Boosts: 0 · Views: 181 · Activity: Feb ’26

Mute behavior of Volume button on AVPlayerViewController iOS 26
On older iOS versions, when the user taps the Mute/Volume button on AVPlayerViewController to unmute, the system restores the device volume to the level it had before the user muted. On iOS 26, when the user taps the on-screen unmute button, the volume starts from 0 instead of being restored (though it is still restored if the user unmutes by pressing the physical volume buttons).

As I understand it, the volume bar/button on AVPlayerViewController is an MPVolumeView, which I cannot control, so this is system behavior. But I have received complaints that this is a bug, and I have not found any documentation describing this change in the Mute button's behavior. I need some basis on which to explain this situation. Thank you.
Replies: 0 · Boosts: 1 · Views: 197 · Activity: Oct ’25

Accessing External Timecode from Blackmagic ProDock in Custom App
Hi everyone,

I’m exploring using the iPhone 17 Pro with the Blackmagic ProDock in a custom capture app. The genlock functionality seems accessible via AVExternalSyncDevice and related APIs, which is great. I’m specifically curious about external timecode coming in from the ProDock:

• Is there a public way to access the timecode feed in a custom app via AVFoundation or another Apple API?
• If so, what is the recommended approach to read or apply that timecode during capture?
• Are there any current limitations or entitlements required to access timecode from ProDock in a third-party app?

I’m excited to start integrating synchronized capture in my app, and any guidance or sample patterns would be greatly appreciated. Thanks in advance!

— [Artem]
Replies: 2 · Boosts: 0 · Views: 318 · Activity: Oct ’25

CoreMediaErrorDomain -12888: Bandwidth down-stepping when using 2sec segment duration
The problem: when using an HLS live stream with AVPlayer on iOS/tvOS, the player first chooses the highest bandwidth, then slowly steps down to the lowest (within 1–3 min), eventually steps up again, and then repeats the step-down. The AVPlayer error log sends these events:

```
errorStatusCode: -12888, errorDomain: Optional("CoreMediaErrorDomain"),
errorComment: Optional("The operation couldn't be completed. (CoreMediaErrorDomain error -12888 - Playlist File unchanged for longer than 1.5 * target duration
```

We use standard segments in CMAF format with 2 s duration:

```
#EXTM3U
#EXT-X-VERSION:6
#EXT-X-TARGETDURATION:2
#EXT-X-MEDIA-SEQUENCE:147065903
#EXT-X-MAP:URI="video_1_4660000_init.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2"
#EXT-X-PROGRAM-DATE-TIME:2025-04-30T12:51:07
#EXTINF:2.000,
video_1_4660000_t17460174670001555.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2
#EXTINF:2.000,
video_1_4660000_t17460174690001555.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2
#EXTINF:2.000,
video_1_4660000_t17460174710001555.mp4?device_profile=cmaf_cbcs_verimatrix_cei%26seg_size=2%26cmaf=2
```

When using 6 s segments, the player stays stable at the highest bandwidth. Is there a way to avoid this error in AVPlayer or in the HLS configuration?
Replies: 2 · Boosts: 0 · Views: 395 · Activity: May ’25

New API to control front camera orientation like native Camera app on iPhone 17 with iOS 26?
The iPhone 17’s front camera with the new 18MP square sensor and iOS 26’s Center Stage feature can auto-rotate between portrait and landscape like the native iOS Camera app. Is there a Swift or AVFoundation API that allows developers to manually control front camera orientation in the same way the native Camera app does? Or is this auto-rotation strictly handled by the system without public API access?
Replies: 0 · Boosts: 1 · Views: 341 · Activity: Oct ’25

Apple Music API no longer returns standalone singles as “single” albums
I’ve been using the Apple Music API for quite a while now, and a recent change must have happened that is quite disruptive. On many occasions, artists release singles from an album as part of promoting that album. For recent examples, Harry Styles released “Aperture” (a single) to promote his upcoming album “Kiss All The Time. Disco, Occasionally”. Similarly, Bruno Mars released “I Just Might”, a single from the upcoming album “The Romantic”.

Previously, those would be returned at the endpoint “artists/{artist_id}/albums” with a “- Single” suffix. But it seems a recent change happened whereby they only appear as playable tracks inside the album. This behavior is also evident in the Apple Music app itself: those singles no longer appear under “Singles & EPs”. Instead, they are only visible if the single becomes popular enough to be shown under “Top Songs”; otherwise one would have to know to tap on the (future) album to discover whether there are released singles. Meanwhile, Spotify’s API returns those as singles properly, just like the Apple Music API used to.

This change must be recent, but the question is whether it’s intentional, and if so, how the API can be used from here on out to extract those singles and represent them.
Replies: 0 · Boosts: 1 · Views: 307 · Activity: Jan ’26

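One avenue that may be worth testing, not a confirmed workaround: the Apple Music API also exposes artist views, including a singles view, which may or may not be affected by the same change. A sketch with token handling and response decoding left as placeholders:

```swift
import Foundation

// Fetches the artist's "singles" view from the Apple Music API.
// Whether this view still surfaces album-promo singles after the
// change described above is exactly what needs verifying.
func fetchArtistSingles(artistID: String,
                        storefront: String,
                        developerToken: String) async throws -> Data {
    let url = URL(string:
        "https://api.music.apple.com/v1/catalog/\(storefront)/artists/\(artistID)/view/singles")!
    var request = URLRequest(url: url)
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
    let (data, response) = try await URLSession.shared.data(for: request)
    guard (response as? HTTPURLResponse)?.statusCode == 200 else {
        throw URLError(.badServerResponse)
    }
    return data // decode into your own model, or diff against /albums
}
```
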
Process to request the restricted entitlement behind “DJ with Apple Music” (tempo control / time-stretch on Apple Music streams)?
Hi, I’m an iOS developer building an app with a use case that needs advanced playback on Apple Music subscription streams, specifically:

• Real-time tempo change (BPM) during playback, i.e., time-stretch with key-lock, not just crossfade.
• Beat-matched transitions between tracks.

From what I can tell, this capability seems to exist only for approved partners and isn’t available through public MusicKit.

Question: What’s the official request path to be evaluated for that restricted partner entitlement (application form, questionnaire, NDA, or internal team/BD contact)? If the entitlement identifier is internal, how can I get my account routed to the right Apple Music team?

For reference, publicly announced partners include Algoriddim djay, Serato DJ Pro, rekordbox (AlphaTheta), and Engine DJ, all of which appear to implement mixing features that imply advanced playback (tempo/beat-matching) on Apple Music content. I’d prefer not to share product details publicly for the moment and can provide specifics privately if needed. Thanks in advance!
Replies: 0 · Boosts: 1 · Views: 377 · Activity: Oct ’25

Optimizing UICollectionView Scrolling Performance and High-Quality Image Loading with PHCachingImageManager
Hello,

I'm developing an app that displays a photo library using UICollectionView and PHCachingImageManager. I'd like to achieve a user experience similar to the native iOS Photos app, where low-quality images are shown quickly while scrolling, and higher-quality images are loaded for visible cells once scrolling stops. I'm currently using the following approach:

While scrolling: I'm using the UICollectionViewDataSourcePrefetching protocol. In the prefetchItemsAt method, I call startCachingImages with low-quality options to cache images in advance.

After scrolling stops: In the scrollViewDidEndDecelerating method, I intend to load high-quality images for the currently visible cells.

I have a few questions regarding this approach:

1. What is the best practice for managing both low-quality and high-quality images efficiently with PHCachingImageManager?
2. Is it correct to call startCachingImages with fastFormat options and then call it again with highQualityFormat in scrollViewDidEndDecelerating?
3. How can I minimize the delay when a low-quality image is replaced by a high-quality one?
4. Are there any additional strategies to help pre-load high-quality images more effectively?
Replies: 0 · Boosts: 1 · Views: 270 · Activity: Aug ’25

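A minimal sketch of the two-tier approach described in this post, assuming a fixed thumbnail size; the loader type and method names are illustrative, while the Photos calls are standard API:

```swift
import Photos
import UIKit

final class ThumbnailLoader {
    let manager = PHCachingImageManager()
    let thumbSize = CGSize(width: 200, height: 200)

    private let fastOptions: PHImageRequestOptions = {
        let o = PHImageRequestOptions()
        o.deliveryMode = .fastFormat   // cheap, possibly degraded
        o.resizeMode = .fast
        return o
    }()

    private let qualityOptions: PHImageRequestOptions = {
        let o = PHImageRequestOptions()
        o.deliveryMode = .highQualityFormat
        o.resizeMode = .exact
        return o
    }()

    // Called from prefetchItemsAt: warm the cache with cheap versions.
    func prefetch(_ assets: [PHAsset]) {
        manager.startCachingImages(for: assets, targetSize: thumbSize,
                                   contentMode: .aspectFill, options: fastOptions)
    }

    // Called from scrollViewDidEndDecelerating, for visible cells only.
    func loadHighQuality(for asset: PHAsset,
                         completion: @escaping (UIImage?) -> Void) {
        _ = manager.requestImage(for: asset, targetSize: thumbSize,
                                 contentMode: .aspectFill,
                                 options: qualityOptions) { image, _ in
            completion(image)
        }
    }
}
```

A common alternative is a single request per cell with deliveryMode = .opportunistic, which invokes the result handler twice (a degraded image first, then the final one), so the low-to-high swap happens automatically instead of being driven from scrollViewDidEndDecelerating.
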
Are frames returned in presentation or decode order with AVAssetReader
I read somewhere that the frames are returned in decode order instead of presentation order when using AVAssetReader. The documentation seems sparse on the subject. I have so far failed to find a video file where the frames are not returned in presentation order. Can anyone confirm the frames are actually returned in decode order?
Replies: 1 · Boosts: 1 · Views: 108 · Activity: Feb ’26

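One way to probe this empirically: read decompressed frames and check that the presentation timestamps increase monotonically; decode-order delivery would show non-monotonic PTS on B-frame content. A sketch, with the URL as a placeholder. Note that with nil outputSettings the reader passes compressed samples through in the file's stored (decode) order by definition, so the decompressed configuration is the interesting case.

```swift
import AVFoundation

func framesArriveInPresentationOrder(url: URL) async throws -> Bool {
    let asset = AVURLAsset(url: url)
    guard let track = try await asset.loadTracks(withMediaType: .video).first else {
        return true
    }
    let reader = try AVAssetReader(asset: asset)
    // Request decoded frames; pass-through (nil settings) would trivially
    // yield decode order for any B-frame content.
    let settings: [String: Any] = [
        kCVPixelBufferPixelFormatTypeKey as String:
            kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
    ]
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: settings)
    reader.add(output)
    reader.startReading()

    var previous = CMTime.negativeInfinity
    while let sample = output.copyNextSampleBuffer() {
        let pts = CMSampleBufferGetPresentationTimeStamp(sample)
        if pts < previous { return false }  // out of presentation order
        previous = pts
    }
    return true
}
```

Running this against clips with frame-reordering codecs (H.264/HEVC with B-frames enabled) is the decisive test, since all-intra or P-frame-only files will look identical in either order.
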