Overview

Post | Replies | Boosts | Views | Activity
Using `NEHotspotConfigurationManager.shared.apply(hotspotConfig)` to join Wi-Fi is slow on iPhone 17+
We use the API NEHotspotConfigurationManager.shared.apply(hotspotConfig) to join a Wi-Fi network, but we find that on iPhone 17+, some users report that the time to join Wi-Fi is very slow. The full code is as follows:

```swift
let hotspotConfig = NEHotspotConfiguration(ssid: sSSID, passphrase: sPassword, isWEP: false)
hotspotConfig.joinOnce = bJoinOnce
if #available(iOS 13.0, *) {
    hotspotConfig.hidden = true
}
NEHotspotConfigurationManager.shared.apply(hotspotConfig) { [weak self] (error) in
    guard let self else { return }
    if let error = error {
        log.i("connectSSID Error while configuring WiFi: \(error.localizedDescription)")
        if error.localizedDescription.contains("already associated") {
            log.i("connectSSID Already connected to this WiFi.")
            result(["status": 0])
        } else {
            result(["status": 0])
        }
    } else {
        log.i("connectSSID Successfully connected to WiFi network \(sSSID)")
        result(["status": 1])
    }
}
```

Normally it might take only 5-10 seconds, but on iPhone 17+ it can take 20-30 seconds.
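For what it's worth, here is a minimal timing sketch (a hypothetical helper, not the poster's code; the SSID and password parameters are placeholders) that could help quantify the difference between devices:

```swift
import NetworkExtension

// Hedged sketch: time how long apply(_:completionHandler:) takes so the
// reported 5-10 s vs. 20-30 s gap can be logged per device model.
func joinAndMeasure(ssid: String, password: String) {
    let config = NEHotspotConfiguration(ssid: ssid, passphrase: password, isWEP: false)
    let start = Date()
    NEHotspotConfigurationManager.shared.apply(config) { error in
        let elapsed = Date().timeIntervalSince(start)
        print("apply finished in \(String(format: "%.1f", elapsed)) s, error: \(error?.localizedDescription ?? "none")")
    }
}
```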
Replies: 7 · Boosts: 0 · Views: 224 · Activity: 2w
For receiving audio in PushToTalk, channelManager(_:didActivate:) is not called when the app receives its first push after backgrounding
I'm implementing the PushToTalk framework and have encountered an issue where channelManager(_:didActivate:) is not called under specific circumstances.

What works:
- App is in foreground, receives PTT push → didActivate is called ✅
- App receives audio in foreground, then is backgrounded → subsequent pushes trigger didActivate ✅

What doesn't work:
- App is launched, user joins channel, then immediately backgrounds
- PTT push arrives while app is backgrounded
- incomingPushResult is called, I return .activeRemoteParticipant(participant)
- The system UI shows the speaker name correctly
- However, didActivate is never called
- Audio data arrives via WebSocket but cannot be played (no audio session)

Setup:
- Channel joined successfully before backgrounding
- UIBackgroundModes includes push-to-talk
- No manual audio session activation (setActive) anywhere in my code
- AVAudioEngine setup only happens inside the didActivate delegate method
- Issue persists even after channel restoration via channelDescriptor(restoredChannelUUID:)

Question: Is this expected behavior or a bug? If expected, what's the correct approach to handle incoming PTT audio when the app is backgrounded and hasn't received audio while in the foreground yet?
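For context, a hedged sketch of the kind of didActivate-gated audio setup described above; PTTAudioController is a hypothetical helper, not the poster's code, and it assumes decoded PCM buffers arrive from the WebSocket layer:

```swift
import AVFAudio

// Keep all AVAudioEngine work gated on the session the system activates, so
// playback only starts once channelManager(_:didActivate:) fires.
final class PTTAudioController {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()
    private var isRunning = false

    init() {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: nil)
    }

    // Call from channelManager(_:didActivate:). The system has already
    // activated the audio session; do not call setActive(true) yourself.
    func start() throws {
        guard !isRunning else { return }
        try engine.start()
        player.play()
        isRunning = true
    }

    // Call from channelManager(_:didDeactivate:).
    func stop() {
        player.stop()
        engine.stop()
        isRunning = false
    }

    // Feed decoded PCM buffers received over the WebSocket.
    func schedule(_ buffer: AVAudioPCMBuffer) {
        guard isRunning else { return }   // drop audio until the session is active
        player.scheduleBuffer(buffer)
    }
}
```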
Replies: 6 · Boosts: 0 · Views: 153 · Activity: 2w
Can't Create/Update App Clips
Hey there,

We've been using the App Store Connect API to manage (create/update) Advanced App Clip Experiences. Everything has worked fine and we've successfully managed hundreds of App Clips, but all of a sudden, starting on December 15th, the API began returning the following error:

```
"id"     => "1e15b36b-5347-4af0-9bab-7f6626ffec65"
"status" => "409"
"code"   => "ENTITY_ERROR.INCLUDED.INVALID_ID"
"title"  => "The provided entity id is invalid"
"detail" => "The provided included entity id 'EN' has invalid format"
"source" => ["pointer" => "/included/0/id"]
```

It does seem to be an API bug, considering it has always worked fine and we didn't change anything on our side; the /included/0/id value has always been EN and never changed. Moreover, EN still seems to be a valid value according to the API docs, and there are no changes reported for that field in the API release notes.

Here's the request ID: 1e15b36b-5347-4af0-9bab-7f6626ffec65

I've tried using different values through trial and error (en, EN, EN-US, ...) and none of them worked.
Replies: 6 · Boosts: 4 · Views: 435 · Activity: 3d
[Texas SB 2420] How to Retrieve Parental Consent Status
After reading the news item below, we are currently working on updating our app in preparation for the enforcement of Texas SB 2420.

https://developer.apple.com/news/?id=2ezb6jhj

Based on the information in the announcement, we understand that parents will be able to revoke their consent for apps. However, we are unsure how an app is supposed to obtain or verify the parent's consent status in the first place. We reviewed the Declared Age Range API and PermissionKit's Significant Change API, but could not find any functionality related to this. If anyone with expertise on this topic has insight, we would greatly appreciate your guidance. Thank you in advance.
Replies: 6 · Boosts: 0 · Views: 802 · Activity: 6d
Sheet background in share extension ignores Liquid Glass effect in iOS 26/Xcode 26
I'm developing a share extension for iOS 26 with Xcode 26. When the extension's sheet appears, it always shows a full white background, even though iOS 26 introduces the new "Liquid Glass" effect for partial sheets.

Expected: The sheet background should use the iOS 26 glassmorphism effect as seen in full apps.

Actual behavior: Custom sheets in my app get the glass effect, but the native system sheet in the share extension always opens as plain white.

Steps to reproduce:
1. Create a share extension using UIKit
2. Present any UIViewController as the main view
3. Set modalPresentationStyle = .pageSheet (or leave as default)
4. Observe a solid white background, not glassmorphism

Sample code:

```swift
override func viewDidLoad() {
    super.viewDidLoad()
    view.backgroundColor = .clear
    preferredContentSize = CGSize(width: UIScreen.main.bounds.width, height: 300)
}
```

Troubleshooting attempted:
- Tried adding UIVisualEffectView with system blur/materials
- Removed all custom backgrounds
- Set modalPresentationStyle explicitly

Questions:
1. Is it possible to enable or force the Liquid Glass effect in share extensions on iOS 26?
2. Is this a limitation by design or a potential bug?
3. Any workaround to make extension sheet backgrounds match the system glass appearance?
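For reference, a sketch of the UIVisualEffectView workaround mentioned under "Troubleshooting attempted" (an assumed setup, not the poster's actual extension; on iOS 26 this still yields a blurred material rather than the system sheet's Liquid Glass background):

```swift
import UIKit

final class ShareViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .clear

        // Attempted workaround: back the sheet with a system material.
        let blurView = UIVisualEffectView(effect: UIBlurEffect(style: .systemMaterial))
        blurView.frame = view.bounds
        blurView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.insertSubview(blurView, at: 0)

        preferredContentSize = CGSize(width: UIScreen.main.bounds.width, height: 300)
    }
}
```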
Replies: 6 · Boosts: 1 · Views: 412 · Activity: 2d
Incorrect system colors in popover views, and appearance does not update when switching dark mode on iOS 26 beta 3
All system colors are displayed incorrectly in the popover view. These are the same views presented as a popover in light and dark mode, and these are the same views presented modally. There is also a problem where, while a popover is presented, switching to dark/light mode does not change its appearance. This affects all system apps. The following screenshot is already in dark mode. All of these problems occur on iOS 26 beta 3.
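A minimal repro sketch under assumed conditions (a plain UIKit controller that uses only system colors, presented as a popover; not the poster's code):

```swift
import UIKit

// Hypothetical repro: present a controller that only uses system colors as a
// popover, toggle dark mode, and compare with the same controller presented modally.
final class SystemColorViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        view.backgroundColor = .systemBackground
        let label = UILabel()
        label.text = "Label color: .label"
        label.textColor = .label
        label.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(label)
        NSLayoutConstraint.activate([
            label.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            label.centerYAnchor.constraint(equalTo: view.centerYAnchor)
        ])
    }
}

// Presenting from some view controller, anchored to a bar button item.
func presentAsPopover(from presenter: UIViewController, barButtonItem: UIBarButtonItem) {
    let vc = SystemColorViewController()
    vc.modalPresentationStyle = .popover
    vc.preferredContentSize = CGSize(width: 320, height: 200)
    vc.popoverPresentationController?.barButtonItem = barButtonItem
    presenter.present(vc, animated: true)
}
```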
Replies: 6 · Boosts: 0 · Views: 710 · Activity: 3w
My Notifications Message Extension doesn't seem to run after distributing my app via Enterprise IPA
I'm developing an app that receives push notifications and writes the contents of each notification to a location shared between the main app and a Notifications Message Extension via App Groups. This all works on my phone with Developer Mode turned on. However, when I archive the app as an Enterprise IPA and distribute it, users can install it on their phones and they receive the push notifications, but the message extension doesn't appear to run: my app displays the shared App Group data on the main screen, and nothing is showing. I have tried on three phones, and it only works on the phone with Developer Mode turned on. I can't tell at this point whether it's a signing issue, a build-phase-order issue, or something else.
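For illustration, a minimal sketch of the App Group sharing described above (the group identifier and keys are placeholders, not the poster's actual values):

```swift
import Foundation

enum SharedStore {
    static let groupID = "group.com.example.myapp"   // hypothetical App Group ID

    // Called from the notification extension when a push arrives.
    static func write(_ payload: String) {
        UserDefaults(suiteName: groupID)?.set(payload, forKey: "lastPush")
    }

    // Called from the main app to display the shared content.
    static func readLastPush() -> String? {
        UserDefaults(suiteName: groupID)?.string(forKey: "lastPush")
    }
}
```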
Replies: 6 · Boosts: 0 · Views: 270 · Activity: 2w
PSVR2 controller button quirks
I have an open Feedback conversation with Apple on this topic, but I am curious whether others have run into this, or want to try out my sample code in their setup.

There are two APIs for reading controller buttons, axes, and D-pads: GCPhysicalInputProfile and GCControllerLiveInput. There are inconsistencies in behaviour between the two. Apple recommends we use GCControllerLiveInput; however, some capabilities of these controllers are only accessible through GCPhysicalInputProfile, as discussed below.

- The PSVR2 R2/L2 buttons, a.k.a. triggers, have analogue force-input values. These can only be accessed through GCPhysicalInputProfile.
- PSVR2 thumbstick direction values are read through "axes" on GCPhysicalInputProfile, but only through "dpads" on GCControllerLiveInput.
- On both GCPhysicalInputProfile and GCControllerLiveInput, pressed events for all buttons fire properly using the generic aliases (Trigger, Grip, Menu, Right Thumbstick, Left Thumbstick, Right Button A & B (Circle & Cross), Left Button A & B (Triangle & Square)). Apple reserves the system button as the equivalent of a home button for the OS.
- On GCPhysicalInputProfile, touch events are fired when the button is also pressed, but not for touches alone.
- On GCControllerLiveInput, touch events only work for the following buttons: Left Thumbstick, Right Thumbstick, Right Button A (Circle), and Right Button B (Cross). But the Right Button B touch event isn't labelled correctly; it fires as the Right Button A event.

I observed this inside ALVR, which uses a polling-based approach to event processing: https://github.com/alvr-org/alvr-visionos/blob/17b5968f9d894944b53e97134b39dfce0993302a/ALVRClient/WorldTracker.swift#L301

To see this in a very simple app, I used Apple's TrackingAccessories example application: https://developer.apple.com/documentation/ARKit/tracking-accessories-in-volumetric-windows

I've attached the code that replaces the AccessoryTrackingModel class. I added code that prints out what is touched/pressed; see the trackAllConnectedSpatialControllers method: https://github.com/svrc/TrackingAccessories
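To illustrate the polling approach on the GCPhysicalInputProfile side, here is a hedged sketch (not the attached sample) that dumps pressed/touched state each time it is called, e.g. once per frame, so the output can be compared against GCControllerLiveInput behaviour:

```swift
import GameController

// Dump pressed/touched state from every connected controller's physical input profile.
func dumpPhysicalInputProfiles() {
    for controller in GCController.controllers() {
        let profile = controller.physicalInputProfile
        for (name, button) in profile.buttons where button.isPressed || button.isTouched {
            print("\(controller.vendorName ?? "controller") \(name): pressed=\(button.isPressed) touched=\(button.isTouched) value=\(button.value)")
        }
        for (name, axis) in profile.axes where axis.value != 0 {
            print("axis \(name): \(axis.value)")
        }
    }
}
```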
Replies: 6 · Boosts: 0 · Views: 761 · Activity: 2w
[iOS 26 Beta] BGTaskScheduler.supportedResources incorrectly reports no GPU support for BGContinuedProcessingTask on capable hardware
Testing Environment:
- iOS: 26.0 Beta 7
- Xcode: Beta 6

Description:
We are implementing the new BGContinuedProcessingTask API introduced in iOS 26. We have followed the official documentation and WWDC session guidance to configure our project. The Background Modes (processing) and Background GPU Access capabilities have been added in Xcode. The com.apple.developer.background-tasks.continued-processing.gpu entitlement is present in the .entitlements file and set to true. The provisioning profile details viewed within Xcode explicitly show that the "Background GPU Access" capability and the corresponding entitlement are included.

Despite this correct configuration, when running the app on supported hardware (iPhone 16 Pro), a call to BGTaskScheduler.shared.supportedResources.contains(.gpu) consistently returns false. This prevents us from setting request.requiredResources = .gpu. As a result, when the BGContinuedProcessingTask starts without the GPU resource flag, our internal Metal-based exporter attempts to access the GPU and is terminated by the system, throwing an IOGPUMetalError: Insufficient Permission (to submit GPU work from background).

We have performed extensive debugging, including a full reset of the provisioning profile (removing/re-adding capabilities, toggling automatic signing, cleaning build folders, and reinstalling the app), but the issue persists. This strongly suggests a bug in the iOS 26 beta where the runtime fails to correctly validate a valid entitlement.

Additionally, we've observed inconsistent behavior across devices. On an A16-based iPad, the task submits successfully (BGTaskScheduler.submit does not throw an error), but the launch handler is never invoked by the system. On the iPhone 16 Pro, the handler is invoked, but we encounter the supportedResources issue described above.

This leads us to ask for clarification on the exact hardware requirements for this feature. We hypothesize that it may be limited to devices that support Apple Intelligence (A17 Pro and newer). Could you please confirm this and provide official documentation on the device support criteria?

Steps to Reproduce:
1. Create a new Xcode project.
2. In Signing & Capabilities, add "Background Modes" (with "Background processing" checked) and "Background GPU Access".
3. Add a permitted identifier (e.g., "com.company.test.*") to BGTaskSchedulerPermittedIdentifiers in Info.plist.
4. In application(_:didFinishLaunchingWithOptions:) or a view controller's viewDidLoad, log the result of BGTaskScheduler.shared.supportedResources.contains(.gpu).
5. Build and run on a physical, supported device (e.g., iPhone 16 Pro).

Expected Results:
The log indicates that BGTaskScheduler.shared.supportedResources.contains(.gpu) returns true.

Actual Results:
The log shows that BGTaskScheduler.shared.supportedResources.contains(.gpu) returns false.
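A minimal sketch of the capability check described above, using only the calls named in the post (treat the API surface as assumed, since BGContinuedProcessingTask is new in the iOS 26 beta):

```swift
import BackgroundTasks

// Log whether the scheduler reports GPU support for continued-processing tasks.
// Expected true on supported hardware; the post reports false on iPhone 16 Pro.
func logContinuedProcessingGPUSupport() {
    let supportsGPU = BGTaskScheduler.shared.supportedResources.contains(.gpu)
    print("BGTaskScheduler.shared.supportedResources.contains(.gpu) = \(supportsGPU)")
}
```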
Replies: 6 · Boosts: 0 · Views: 366 · Activity: 2w
iOS App rejected
Guideline 2.5.1 - Performance - Software Requirements

The app uses or references the following non-public or deprecated APIs:

Iobmobile
Classes:
• __SwiftValue

The use of non-public or deprecated APIs is not permitted, as they can lead to a poor user experience should these APIs change and are otherwise not supported on Apple platforms.

Can anyone shed some light on what __SwiftValue even means?
Replies: 6 · Boosts: 0 · Views: 269 · Activity: 6d
App Clip invocation fails with "ASDErrorDomain error 507" via Smart App Banner especially on iOS 26 devices.
Hello, We are encountering an issue where invoking our App Clip via a Safari Smart App Banner fails on certain devices, particularly those running iOS 26. When a user taps "Open" on the Smart App Banner, the App Clip card attempts to load but ultimately fails with ASDErrorDomain Error 507. The error occurs consistently on specific devices, while other devices function correctly. In some instances, the App Clip card metadata/UI appears momentarily (flashes on the screen) before the error message is displayed and the process terminates. Has anyone else experienced this specific ASDErrorDomain error? We have already submitted a report via Feedback Assistant, but any insights or workarounds from the community would be appreciated. Thanks!
Replies: 6 · Boosts: 0 · Views: 224 · Activity: 3d
How to optimize my app for a carrier-provided satellite network?
Hello,

I am working to integrate the new com.apple.developer.networking.carrier-constrained.app-optimized entitlement in my iOS 26 app so that my app can use a carrier-provided satellite network, and want to confirm my understanding of how to detect and optimize for satellite network conditions. (Ref: https://developer.apple.com/documentation/bundleresources/entitlements/com.apple.developer.networking.carrier-constrained.app-optimized)

My current approach: I plan to set the entitlement to true once my app is optimized for satellite networks. To detect if the device is connected to a satellite network, I intend to use the Network framework's NWPath properties:

- isUltraConstrained: I understand this should be set to true when the device is connected to a satellite network. (Ref: https://developer.apple.com/documentation/network/nwpath/isultraconstrained)
- linkQuality == .minimal: I believe this will also be set in satellite scenarios, though it may not be exclusive to satellite connections. (Ref: https://developer.apple.com/documentation/network/nwpath/linkquality-swift.enum/minimal)

Questions:
1. Is it correct that isUltraConstrained will reliably indicate a satellite connection?
2. Should I also check for linkQuality == .minimal, or is isUltraConstrained sufficient?
3. Are there any additional APIs or best practices for detecting and optimizing for satellite connectivity that I should be aware of?

Thank you for confirming whether my understanding and approach are correct, and for any additional guidance.
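A sketch of the detection approach described above, assuming the iOS 26 NWPath properties referenced in the post (isUltraConstrained and linkQuality):

```swift
import Network

// Watch path updates and treat ultra-constrained or minimal-quality paths as
// likely satellite links; adjust transfer behavior accordingly.
let monitor = NWPathMonitor()
monitor.pathUpdateHandler = { path in
    let looksLikeSatellite = path.isUltraConstrained || path.linkQuality == .minimal
    print("ultraConstrained=\(path.isUltraConstrained) linkQuality=\(path.linkQuality) -> treat as satellite: \(looksLikeSatellite)")
    // e.g. reduce payload sizes and defer non-essential transfers when true.
}
monitor.start(queue: .main)
```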
Replies: 6 · Boosts: 0 · Views: 567 · Activity: 2w
My app doesn't respond on iPhone Air iOS 26.1.
After startup, my app shows the main view with a tab bar controller containing four navigation controllers. However, when a second-level view controller is pushed onto any navigation controller, the UI freezes and becomes unresponsive. The iPhone simulator running iOS 26.1 exhibits the same problem, and the debug profile shows CPU usage at 100%. Other devices and simulators do not have this problem.
Replies: 6 · Boosts: 3 · Views: 331 · Activity: 2w
Wi-Fi Raw Socket Disconnection Issue on iPhone 17 Series
On my iPhone 16 Pro and iPhone 16 Pro Max devices, running iOS 26.0, 26.0.1, and 26.1, Wi-Fi raw socket communication works flawlessly. Even after keeping the connection active for over 40 minutes, there are no disconnections during data transmission. However, on the iPhone 17 and iPhone 17 Pro, the raw socket connection drops within 20 seconds. Once it disconnects, the socket cannot reconnect unless the Wi-Fi module itself is reset. I believe this issue is caused by a bug in the iPhone 17 series’ communication module. I have looked into many cases, and it appears to be related to a bug in the N1 chipset. Are there any possible solutions or workarounds for this issue?
Replies: 6 · Boosts: 1 · Views: 253 · Activity: 3w
Tap to Pay Entitlement only for development
Hi,

We applied for the Tap to Pay on iPhone entitlement and were approved, but under distribution support it only shows Development. We can build and debug Tap to Pay in development, but we are unable to build a release. We opened a ticket with Apple Support, but they said it was configured correctly. I attached a screenshot of our developer account entitlement for Tap to Pay; it clearly says Development only.
Replies: 6 · Boosts: 1 · Views: 2.2k · Activity: 2w
Multiple Apple Pay relationships with differing apple-developer-merchantid-domain-association files
I've encountered an issue where we need multiple domain associations with separate Apple Pay implementations. Briefly, we already have a /.well-known/apple-developer-merchantid-domain-association set up with Stripe, and now we need another, different version of the file to get set up with FreedomPay. FreedomPay insists this file represents a three-way relationship between all parties, and I have no reason to disbelieve them. I'm wondering if anyone has encountered this or if there is a standard procedure. I'm currently trying to find documentation on exactly how Apple Pay verification interacts with this file to see if we can produce it dynamically.
Replies: 6 · Boosts: 0 · Views: 3.9k · Activity: 2w
UITabBar Appears During Swipe-Back Gesture on iOS 26 Liquid Glass UI
Hello,

While integrating the Liquid Glass UI introduced in iOS 26 into my existing app, I encountered an unexpected issue. My app uses a UITabBarController, where each tab contains a UINavigationController, and the actual content resides in each UIViewController. Typically, I perform navigation using navigationController?.pushViewController(...) and hide the tab bar by setting vc.hidesBottomBarWhenPushed = true when needed. This structure worked perfectly fine prior to iOS 26, and I believe many apps use a similar approach. However, after enabling the Liquid Glass UI, a problem occurs.

Problem description:
1. From AViewController, I push BViewController with hidesBottomBarWhenPushed = true.
2. BViewController appears, and the tab bar is hidden as expected.
3. When performing a swipe-back gesture, as soon as AViewController becomes visible, the tab bar immediately reappears (likely due to A's viewWillAppear).
4. The tab bar remains visible for a short moment even if the gesture is canceled; during that time, it is also interactable.

Before iOS 26, the tab bar appeared synchronized with AViewController and did not prematurely show during the swipe transition.

I tried using the new iOS 18 API tabBarController?.setTabBarHidden(false, animated: true). It slightly improves the animation behavior, but the issue persists. If hidesBottomBarWhenPushed is now deprecated or discouraged, migrating entirely to setTabBarHidden would require significant refactoring, which is not practical for many existing apps.

Is this caused by a misuse of hidesBottomBarWhenPushed, or could this be a regression or design change in iOS 26's Liquid Glass UI?
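For completeness, a minimal sketch of the push pattern described above (AViewController and BViewController are placeholder names, not the poster's classes):

```swift
import UIKit

final class BViewController: UIViewController {}

final class AViewController: UIViewController {
    func showDetail() {
        let detail = BViewController()
        detail.hidesBottomBarWhenPushed = true   // tab bar should stay hidden while detail is on top
        navigationController?.pushViewController(detail, animated: true)
    }
}
```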
Replies: 6 · Boosts: 3 · Views: 810 · Activity: 3w
Package created with pkgbuild installs zero-byte file
Just recently, any pkg file that I create with pkgbuild installs the Payload's application as a zero-byte file in the /Applications directory. This has been working for years without issue for me. Here are the commands I am using, with company-specific items replaced:

```
pkgbuild --analyze --root MyApplicationRootDirectory standalone.plist
plutil -replace BundleIsRelocatable -bool NO standalone.plist
pkgbuild --identifier MyIdentifier --version 1.0 --install-location /Applications --root MyApplicationRootDirectory --component-plist standalone.plist --sign 'Developer ID Installer: MyCompany (MySignId)' --timestamp installer.pkg
```

Any ideas on what could be causing the issue? I have verified the following:
- The application being added to the pkg is both signed and notarized using the correct Developer ID Application certificate.
- The resultant pkg file is both signed and notarized using the Developer ID Installer certificate.
- Verified the pkg contents using "pkgutil --expand" to dump the contents.
- Verified the pkg's Payload contents by extracting the data using "cat Payload | gunzip | cpio -i". This results in an application file that is a binary match for the file added in the "pkgbuild --root" argument.
- My application is the only file within the directory passed to the "pkgbuild --root" argument.
- There are no warnings in the System Settings / Privacy & Security panel when running the package installer.
- I have a valid Mac Developer account.
- I am building the application and the pkg file on the same computer.

Thank you for any insight.
Replies: 6 · Boosts: 0 · Views: 223 · Activity: 3w