Device: iPhone 17 Pro
iOS Version: iOS 26.1
Camera: Ultra-wide (0.5x) using AVCaptureSession
Our camera app freezes on iPhone 17 when switching frame rates (30fps ↔ 60fps). This works fine on iPhone 16 Pro and earlier.
What We've Observed:
- Freeze happens on frame rate change, particularly when stabilization is enabled
- Thread.sleep is used to allow the camera hardware to settle before re-enabling stabilization
- Works on older iPhones; only iPhone 17 exhibits this behavior
Console shows these errors before the freeze:
<<<< FigXPCUtilities >>>> signalled err=18446744073709534335 (-17281 as unsigned 64-bit)
<<<< FigCaptureSourceRemote >>>> err=-17281
Is Thread.sleep on the main thread causing the freeze? Should all camera configuration be on a background queue?
Is there something specific about iPhone 17 ultra-wide camera that requires different handling?
Should we use session.beginConfiguration() / session.commitConfiguration() instead of direct device configuration?
Is calling setFrameRate from a property's didSet (which runs synchronously) problematic?
Are the FigCaptureSourceRemote errors (-17281) indicative of the problem, and what do they mean?
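For context, this is roughly the direction we are considering: all configuration on a dedicated session queue, wrapped in beginConfiguration/commitConfiguration, with no Thread.sleep. A minimal sketch (the queue label and type names are ours, not the shipping app's):

import AVFoundation

final class CaptureManager {
    let session = AVCaptureSession()
    // Dedicated serial queue so configuration never runs on the main thread.
    private let sessionQueue = DispatchQueue(label: "capture.session")

    func setFrameRate(_ fps: Int32, on device: AVCaptureDevice) {
        sessionQueue.async { [self] in
            // Only switch if the active format supports the target rate.
            guard device.activeFormat.videoSupportedFrameRateRanges.contains(where: {
                ($0.minFrameRate...$0.maxFrameRate).contains(Double(fps))
            }) else { return }

            session.beginConfiguration()
            defer { session.commitConfiguration() }
            do {
                try device.lockForConfiguration()
                defer { device.unlockForConfiguration() }
                let duration = CMTime(value: 1, timescale: fps)
                device.activeVideoMinFrameDuration = duration
                device.activeVideoMaxFrameDuration = duration
            } catch {
                print("lockForConfiguration failed: \(error)")
            }
        }
    }
}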
Hey,
There seems to be an inconsistency when capturing a photo using QualityPrioritization.quality on the iPhone 17 Pro main wide lens. If you zoom above 2x, the output image always has a -2.0 EV bias in its metadata and looks underexposed. This does not happen at zoom levels below 2x, or if you set the qualityPrioritization to .balanced.
See the attached captures: one with .quality and one with .balanced.
This does not happen on the other lenses. I'm using a simple setup, and the behavior is consistent across JPEG and ProRAW capture. I have a demo project if that is useful; the relevant capture path is roughly the sketch below.
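A minimal sketch of the settings involved (session, input, and delegate wiring elided; the function name is illustrative):

import AVFoundation

// Sketch: the prioritization settings that show the bias; capture wiring elided.
func capturePhoto(with output: AVCapturePhotoOutput,
                  delegate: any AVCapturePhotoCaptureDelegate) {
    output.maxPhotoQualityPrioritization = .quality  // must be raised before requesting .quality
    let settings = AVCapturePhotoSettings()
    settings.photoQualityPrioritization = .quality   // .balanced does not show the -2.0 EV bias
    output.capturePhoto(with: settings, delegate: delegate)
}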
Thanks,
Alex
PHPhotoLibrary.authorizationStatus(for: .readWrite) == .authorized
Info.plist Privacy - Photo Library Usage Description set
I check authorization before attempting to read photoPickerItem.itemIdentifier, but the return value from itemIdentifier is nil every time. It seems I'm missing some permission, but I'm unsure why the system keeps _shouldExposeItemIdentifier set to false.
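One thing worth checking (an assumption about the setup, since the picker code isn't shown): itemIdentifier is only populated when the picker is created with an explicit photo library. A minimal sketch:

import SwiftUI
import PhotosUI

struct PickerExample: View {
    @State private var item: PhotosPickerItem?

    var body: some View {
        // Without photoLibrary: .shared(), itemIdentifier stays nil regardless of authorization.
        PhotosPicker("Select a photo", selection: $item, photoLibrary: .shared())
            .onChange(of: item) { _, newItem in
                print("identifier:", newItem?.itemIdentifier ?? "nil")
            }
    }
}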
I am new to Swift and iOS development, and I have a question about video capture performance.
Is it possible to capture video at a resolution of 4032×3024 while simultaneously running a vision/ML model on the video stream (e.g., using Vision or CoreML)?
I want to know:
whether iOS devices support capturing video at that resolution,
whether the frame rate drops significantly at that scale,
and whether it is practical to run a Vision/ML model in real-time while recording at such a high resolution.
If anyone has experience with high-resolution AVCaptureSession setups or combining them with real-time ML processing, I would really appreciate guidance or sample code.
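As a starting point, a minimal sketch of the discovery step: checking whether the back wide camera exposes a 4032x3024 video format and what frame rates it supports (device choice and printout are illustrative):

import AVFoundation

// Sketch: look for a 4032x3024 video format and inspect its frame rate ranges.
func findHighResFormat() {
    guard let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                               for: .video, position: .back) else { return }
    for format in device.formats {
        let dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
        guard dims.width == 4032, dims.height == 3024 else { continue }
        let ranges = format.videoSupportedFrameRateRanges
        print("4032x3024 available, fps:", ranges.map { "\($0.minFrameRate)-\($0.maxFrameRate)" })
        // To adopt it: device.lockForConfiguration(), device.activeFormat = format, then unlock.
    }
}

In practice, frame rates at full 4:3 sensor resolutions tend to be well below 60 fps, and a common pattern is to run the Vision/CoreML request on downscaled copies of the frames rather than on the full-resolution buffers.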
Hello, we are trying to use the new Background Upload Extension to improve uploads of assets (photos, Live Photos, videos) in the background in our application.
1. The assets have finished uploading, but I'm unable to retrieve successful records using PHAssetResourceUploadJob.fetchJobs(action: .acknowledge, options: nil). When will the successful records be returned?
2. How do we retrieve the system's pending tasks? We want to cancel tasks handed over to the system when returning to the main app, to avoid duplicate resource uploads.
3. When we set UploadJobExtensionEnabled = true, will tasks handed over to the system still execute after returning to the main app? Do we need to set UploadJobExtensionEnabled = false upon returning to the main app? If we set it to false, will previously submitted upload tasks be cleared?
I am developing an iOS camera app that can record video directly to external storage connected to an iPhone.
To detect whether an external USB storage device is connected and to obtain its URL, I am considering using AVExternalStorageDeviceDiscoverySession.
However, when checking support using AVExternalStorageDeviceDiscoverySession.isSupported, I observe that it returns true only on Pro model iPhones, and false on non-Pro models in my environment.
I have reviewed Apple’s official documentation, but I could not find any clear description of the supported devices or requirements (for example, whether this API is limited to Pro models or requires specific hardware capabilities).
I would appreciate any information regarding the following points:
● The actual requirements for AVExternalStorageDeviceDiscoverySession to be supported:
  - Device limitations (Pro vs. non-Pro models)
  - Hardware requirements (USB controller, external recording capability, etc.)
  - iOS version dependencies
● Whether support for non-Pro models is planned in the future
Tested environments
iPhone 16 Pro (iOS 18.7.1) → isSupported == true
iPhone 16e (iOS 26.2) → isSupported == false
iPhone 17 (iOS 26.2) → isSupported == false
iPhone Air (iOS 26.2) → isSupported == false
If anyone has observed similar behavior or has official information from Apple regarding this API, I would greatly appreciate your insights.
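For anyone comparing devices, a minimal probe sketch (I'm assuming the discovery session's shared/externalStorageDevices shape from the iOS 17-era API; treat the property names as approximate):

import AVFoundation

// Minimal probe: is external-storage discovery supported, and what is connected?
func probeExternalStorage() {
    guard AVExternalStorageDeviceDiscoverySession.isSupported,
          let session = AVExternalStorageDeviceDiscoverySession.shared else {
        print("External storage discovery not supported on this device")
        return
    }
    for device in session.externalStorageDevices {
        print("Connected:", device.displayName ?? "unknown device")
    }
}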
Hi everyone,
I’m seeing recurring internal AVFoundation camera logs on iOS 26.2 and I’m trying to understand whether this is expected behavior or a regression in the capture pipeline.
These logs appear shortly after starting an AVCaptureSession, while video frames are being delivered, and also when the camera is stopped or the capture session is torn down.
<<<< FigXPCUtilities >>>> signalled err=-17281 at <>:302
<<<< FigCaptureSourceRemote >>>> Fig assert: "err == 0 " at bail (FigCaptureSourceRemote.m:569) - (err=-17281)
To rule out issues caused by my own code, I had GPT create a minimal SwiftUI example from scratch. Even in this clean, minimal setup, the same logs appear on iOS 26.2. The exact same logic did not produce these logs on iOS 18.x.
My primary interest is to perform real-time processing on the video frames delivered by the camera (via AVCaptureVideoDataOutput), for tasks such as analysis, computer vision, or custom frame handling, while simultaneously displaying the live preview.
Thanks in advance for any insight.
Example Code
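A minimal sketch of the setup described above (live preview plus per-frame delivery via AVCaptureVideoDataOutput; identifiers are illustrative, not the original project's):

import SwiftUI
import AVFoundation

// Camera pipeline: preview layer plus per-frame callbacks for real-time processing.
final class CameraController: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let sessionQueue = DispatchQueue(label: "camera.session")
    private let frameQueue = DispatchQueue(label: "camera.frames")

    func start() {
        sessionQueue.async { [self] in
            session.beginConfiguration()
            if let device = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                    for: .video, position: .back),
               let input = try? AVCaptureDeviceInput(device: device),
               session.canAddInput(input) {
                session.addInput(input)
            }
            let output = AVCaptureVideoDataOutput()
            output.setSampleBufferDelegate(self, queue: frameQueue)
            if session.canAddOutput(output) { session.addOutput(output) }
            session.commitConfiguration()
            session.startRunning()
        }
    }

    func stop() {
        sessionQueue.async { [self] in session.stopRunning() }
    }

    // Real-time frame handling (analysis, CV, custom processing) goes here.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // process sampleBuffer
    }
}

struct CameraPreview: UIViewRepresentable {
    let session: AVCaptureSession

    func makeUIView(context: Context) -> UIView {
        let view = UIView()
        let layer = AVCaptureVideoPreviewLayer(session: session)
        layer.videoGravity = .resizeAspectFill
        view.layer.addSublayer(layer)
        return view
    }

    func updateUIView(_ uiView: UIView, context: Context) {
        uiView.layer.sublayers?.first?.frame = uiView.bounds
    }
}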
I'm adopting Liquid Glass in iOS 26. When I test document scanning with VNDocumentCameraViewController after enabling Liquid Glass, the app crashes just after a photo is taken (screenshot attached).
The exception output in the Xcode console is this:
*** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Layout requested for visible navigation bar, <UINavigationBar: 0x1240bde00; frame = (0 117; 390 54); opaque = NO; tintColor = UIExtendedSRGBColorSpace 1 1 0 1; layer = <CALayer: 0x120c21e60>> standardAppearance=0x12407b900 scrollEdgeAppearance=0x12407bb80 compactAppearance=0x12407b880 no-scroll-edge-support, when the top item belongs to a different navigation bar. topItem = <UINavigationItem: 0x1240bd800> style=navigator leftBarButtonItems=0x123d4e5f0 rightBarButtonItems=0x123d4d5a0, navigation bar = <UINavigationBar: 0x107b9ad00; frame = (0 47; 390 54); opaque = NO; autoresize = W; tintColor = UIExtendedSRGBColorSpace 1 1 0 1; layer = <CALayer: 0x120c20150>> delegate=0x10a805200 standardAppearance=0x107b2c300 scrollEdgeAppearance=0x107b2c280 compactAppearance=0x107b2c100, possibly from a client attempt to nest wrapped navigation controllers.'
*** First throw call stack:
(0x18e1db994 0x18b0f5814 0x18c092aa0 0x193b18660 0x193a7d540 0x193a7e020 0x1953ec4a0 0x1943b7d78 0x18ed83420 0x18ed82f74 0x18eb83134 0x18eb44c10 0x18eb70bc4 0x18eb7e74c 0x193ac8cd0 0x193ac8c04 0x193ad6afc 0x193ad5f8c 0x27b456560 0x18e12c4cc 0x18e15c0b0 0x18e15bfd8 0x18e133c1c 0x18e132a6c 0x22ed54498 0x193af6ba4 0x193a9fa78 0x193bcb68c 0x102cc2718 0x102cc2688 0x102cc2794 0x18b14ae28)
libc++abi: terminating due to uncaught exception of type NSException
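For reference, a minimal sketch of the kind of presentation involved (delegate handling abbreviated; the crash occurs after a page is captured):

import UIKit
import VisionKit

final class ScanHostViewController: UIViewController, VNDocumentCameraViewControllerDelegate {
    func presentScanner() {
        let scanner = VNDocumentCameraViewController()
        scanner.delegate = self
        present(scanner, animated: true)
    }

    func documentCameraViewController(_ controller: VNDocumentCameraViewController,
                                      didFinishWith scan: VNDocumentCameraScan) {
        controller.dismiss(animated: true)
    }
}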
I’m writing to report a serious usability regression in the iOS 26 Photos app. Folders can still be created and albums can still be assigned to them, but folders can no longer be opened to view the albums they contain. A container that cannot be opened is not a container, and this breaks a fundamental information architecture model that has existed in Photos for well over a decade.
This change disproportionately harms users who maintain large, intentional photo libraries—travel archives, projects, professional work, or long-term personal documentation—where hierarchy and ordering are essential. Search and automated surfacing are not substitutes for deliberate structure. Removing the ability to browse folder → album hierarchy on iOS strips users of control while still exposing the UI for folder creation, which is internally inconsistent.
If this behavior is intentional, it should be clearly documented and the folder UI removed to avoid misleading users. If it is not intentional, it needs urgent correction. At minimum, iOS should retain parity with macOS Photos for basic navigation of folders and albums. This is not a niche request; it is a regression in core functionality.