Error resuming background audio while connected to CarPlay
My app utilizes background audio to play music files. I have the audio background mode enabled and I initialize the AVAudioSession in playback mode with the mixWithOthers option, and it usually works great while the app is backgrounded. I listen for audio interruptions as well as route changes, I am able to handle them appropriately, and I can usually resume my background audio no problem.

I discovered an issue while connected to CarPlay, though. Roughly 50% of the time when I disconnect from a phone call while connected to CarPlay, I get the following error after calling the play() method of my AVAudioPlayer instance:

```
ATAudioSessionClientImpl.mm:281 activation failed. status = 561015905
```

If I instead try to start a new audio session I get a similar error:

```
Error Domain=NSOSStatusErrorDomain Code=561015905 "Session activation failed" UserInfo={NSLocalizedDescription=Session activation failed}
```

Like I said, this isn't reproducible 100% of the time, and so far I've only seen it while connected to CarPlay. I don't think I'm forgetting some additional capability or plist setting, but if anyone has any clues it would be greatly appreciated. Otherwise this is likely just a bug that I need to report to Apple.

One very important note, and the reason I believe it's just a bug: while I was testing I found that other music apps like Spotify will also fail to resume their audio at the same time my app fails. Another important detail is that when resuming works I receive the audio session interruption ended notification, and when it doesn't work I only receive a route configuration change or route override notification. From there I am still successfully granted background time to execute code, but my call to resume audio fails with the above mentioned error codes.
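For context, a minimal sketch of the interruption handling described above, with an illustrative retry when session activation fails; the retry count and delay are assumptions, not values from the post:

```swift
import AVFoundation

// Sketch: resume an AVAudioPlayer after an interruption ends, retrying
// when session activation fails (error 561015905). The retry policy
// below is an illustrative assumption.
final class PlaybackResumer {
    private let player: AVAudioPlayer

    init(player: AVAudioPlayer) {
        self.player = player
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(handleInterruption(_:)),
            name: AVAudioSession.interruptionNotification,
            object: AVAudioSession.sharedInstance()
        )
    }

    @objc private func handleInterruption(_ note: Notification) {
        guard
            let info = note.userInfo,
            let typeRaw = info[AVAudioSessionInterruptionTypeKey] as? UInt,
            let type = AVAudioSession.InterruptionType(rawValue: typeRaw),
            type == .ended
        else { return }
        let optionsRaw = info[AVAudioSessionInterruptionOptionKey] as? UInt ?? 0
        if AVAudioSession.InterruptionOptions(rawValue: optionsRaw).contains(.shouldResume) {
            resume()
        }
    }

    func resume(retriesLeft: Int = 3) {
        do {
            // Reactivating the session is where 561015905 shows up when
            // another client (e.g. the call) still holds the hardware.
            try AVAudioSession.sharedInstance().setActive(true)
            _ = player.play()
        } catch {
            guard retriesLeft > 0 else { return }
            DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) { [weak self] in
                self?.resume(retriesLeft: retriesLeft - 1)
            }
        }
    }
}
```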
Replies: 0 · Boosts: 0 · Views: 208 · Activity: 2w
Incremental build not working on Mac mini CI (Xcode 26) — always full rebuild
Hi everyone,

We run our CI builds on a Mac mini. On my local MacBook, incremental builds work properly and build times are fast, but on CI it looks like Xcode rebuilds everything from scratch every time, and the build is about 6 minutes slower than local.

In Xcode 26, "compilation caching" became available. My understanding was that if DerivedData is preserved between CI runs, compilation caching / incremental builds should reduce build time. So I tried specifying the DerivedData path as an absolute path in our xcodebuild command to reuse it across builds. However, it still seems to do a full rebuild every time and the build time didn't improve.

Has anyone seen a similar issue with incremental builds on CI (Mac mini) with Xcode 26? Any advice on what to check or how to configure CI so incremental builds / caching actually work would be greatly appreciated. Thanks!
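For reference, a minimal sketch of the kind of CI invocation being described, with placeholder project and scheme names. The -derivedDataPath flag is standard xcodebuild; the compilation-caching build setting name follows the Xcode 26 release notes and should be verified against your toolchain:

```sh
#!/bin/sh
# Minimal CI sketch: keep DerivedData at a stable absolute path so
# incremental state survives between runs. Project and scheme names are
# placeholders.

# The checkout path must also be identical on every run, or Xcode
# treats the inputs as changed and rebuilds from scratch.
DERIVED_DATA="$HOME/ci-derived-data"

xcodebuild build \
  -project MyApp.xcodeproj \
  -scheme MyApp \
  -derivedDataPath "$DERIVED_DATA" \
  COMPILATION_CACHE_ENABLE_CACHING=YES
```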
Replies: 0 · Boosts: 0 · Views: 48 · Activity: 2w
Pickers in the toolbar expand their width
With iOS 26 there has been a change in behavior for Pickers in the toolbar: a Picker looks expanded, unlike other views such as a Button or a Menu. See the screenshots below. Is this the intended behavior or a bug? (I already submitted feedback for this as FB19276474.)

(Screenshot: what the Picker looks like in the toolbar)
(Screenshot: what a Button looks like in the toolbar)
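For anyone trying to reproduce this, a minimal sketch of the layout described, with illustrative names and options:

```swift
import SwiftUI

// Sketch: a Picker next to a Button in a navigation toolbar, to compare
// how each is rendered on iOS 26.
struct ContentView: View {
    @State private var choice = 0

    var body: some View {
        NavigationStack {
            Text("Content")
                .toolbar {
                    ToolbarItem(placement: .topBarTrailing) {
                        Picker("Filter", selection: $choice) {
                            Text("All").tag(0)
                            Text("Recent").tag(1)
                        }
                    }
                    ToolbarItem(placement: .topBarTrailing) {
                        Button("Done") { }
                    }
                }
        }
    }
}
```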
Replies: 0 · Boosts: 0 · Views: 58 · Activity: 2w
Network Interface APIs
For important background information, read Extra-ordinary Networking before reading this.

Share and Enjoy
—
Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"

Network Interface APIs

Most developers don’t need to interact directly with network interfaces. If you do, read this post for a summary of the APIs available to you. Before you read this, read Network Interface Concepts.

Interface List

The standard way to get a list of interfaces and their addresses is getifaddrs. To learn more about this API, see its man page.

A network interface has four fundamental attributes:

- A set of flags — These are packed into a CUnsignedInt. The flag bits are declared in <net/if.h>, starting with IFF_UP.
- An interface type — See Network Interface Type, below.
- An interface index — Valid indexes are greater than 0.
- A BSD interface name — For example, an Ethernet interface might be called en0. The interface name is shared between multiple network interfaces running over a given hardware interface. For example, IPv4 and IPv6 running over that Ethernet interface will both have the name en0.

WARNING: BSD interface names are not considered API. There’s no guarantee, for example, that an iPhone’s Wi-Fi interface is en0.

You can map between the last two using if_indextoname and if_nametoindex. See the if_indextoname man page for details.

An interface may also have address information. If present, this always includes the interface address (ifa_addr) and the network mask (ifa_netmask). In addition:

- Broadcast-capable interfaces (IFF_BROADCAST) have a broadcast address (ifa_broadaddr, which is an alias for ifa_dstaddr).
- Point-to-point interfaces (IFF_POINTOPOINT) have a destination address (ifa_dstaddr).

Calling getifaddrs from Swift is a bit tricky. For an example of this, see QSocket: Interfaces.

IP Address List

Once you have getifaddrs working, it’s relatively easy to manipulate the results to build a list of just IP addresses, a list of IP addresses for each interface, and so on. QSocket: Interfaces has some Swift snippets that show this.

Interface List Updates

The interface list can change over time. Hardware interfaces can be added and removed, network interfaces come up and go down, and their addresses can change. It’s best to avoid caching information from getifaddrs. If that’s unavoidable, use the kNotifySCNetworkChange Darwin notification to update your cache. For information about registering for Darwin notifications, see the notify man page (in section 3).

This notification just tells you that something has changed. It’s up to you to fetch the new interface list and adjust your cache accordingly. You’ll find that this notification is sometimes posted numerous times in rapid succession. To avoid unnecessary thrashing, debounce it.

While the Darwin notification API is easy to call from Swift, Swift does not import kNotifySCNetworkChange. To fix that, define that value yourself, calling a C function to get the value:

```swift
var kNotifySCNetworkChange: UnsafePointer<CChar> {
    networkChangeNotifyKey()
}
```

Here’s what that C function looks like:

```c
extern const char * networkChangeNotifyKey(void) {
    return kNotifySCNetworkChange;
}
```

Network Interface Type

There are two ways to think about a network interface’s type.

Historically there were a wide variety of weird and wonderful types of network interfaces. The following code gets this legacy value for a specific BSD interface name:

```swift
func legacyTypeForInterfaceNamed(_ name: String) -> UInt8? {
    var addrList: UnsafeMutablePointer<ifaddrs>? = nil
    let err = getifaddrs(&addrList)
    // In theory we could check `errno` here but, honestly, what are we
    // gonna do with that info?
    guard err >= 0, let first = addrList else { return nil }
    defer { freeifaddrs(addrList) }
    return sequence(first: first, next: { $0.pointee.ifa_next })
        .compactMap { addr in
            guard
                let nameC = addr.pointee.ifa_name,
                name == String(cString: nameC),
                let sa = addr.pointee.ifa_addr,
                sa.pointee.sa_family == AF_LINK,
                let data = addr.pointee.ifa_data
            else { return nil }
            return data.assumingMemoryBound(to: if_data.self).pointee.ifi_type
        }
        .first
}
```

The values are defined in <net/if_types.h>, starting with IFT_OTHER. However, this value is rarely useful because many interfaces ‘look like’ Ethernet and thus have a type of IFT_ETHER.

Network framework has the concept of an interface’s functional type. This is an indication of how the interface fits into the system. There are two ways to get an interface’s functional type:

- If you’re using Network framework and have an NWInterface value, get the type property.
- If not, call ioctl with a SIOCGIFFUNCTIONALTYPE request. The return values are defined in <net/if.h>, starting with IFRTYPE_FUNCTIONAL_UNKNOWN.

Swift does not import SIOCGIFFUNCTIONALTYPE, so it’s best to write this code in C:

```c
extern uint32_t functionalTypeForInterfaceNamed(const char * name) {
    int fd = socket(AF_INET, SOCK_DGRAM, 0);
    if (fd < 0) { return IFRTYPE_FUNCTIONAL_UNKNOWN; }
    struct ifreq ifr = {};
    strlcpy(ifr.ifr_name, name, sizeof(ifr.ifr_name));
    bool success = ioctl(fd, SIOCGIFFUNCTIONALTYPE, &ifr) >= 0;
    int junk = close(fd);
    assert(junk == 0);
    if ( ! success ) { return IFRTYPE_FUNCTIONAL_UNKNOWN; }
    return ifr.ifr_ifru.ifru_functional_type;
}
```

Finally, TN3158 Resolving Xcode 15 device connection issues documents the SIOCGIFDIRECTLINK flag as a specific way to identify the network interfaces used by Xcode for device connection traffic.

Revision History

- 2025-12-10 Added info about SIOCGIFDIRECTLINK.
- 2023-07-19 First posted.
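As a follow-up to the IP Address List section above, here is a hedged sketch (not the QSocket code the post references) of one way to collect numeric address strings per interface name:

```swift
import Darwin

// Sketch: walk getifaddrs and build a map from BSD interface name to
// its IPv4/IPv6 address strings, formatted with getnameinfo.
func addressesByInterface() -> [String: [String]] {
    var addrList: UnsafeMutablePointer<ifaddrs>? = nil
    guard getifaddrs(&addrList) >= 0, let first = addrList else { return [:] }
    defer { freeifaddrs(addrList) }

    var result: [String: [String]] = [:]
    for addr in sequence(first: first, next: { $0.pointee.ifa_next }) {
        guard
            let nameC = addr.pointee.ifa_name,
            let sa = addr.pointee.ifa_addr,
            Int32(sa.pointee.sa_family) == AF_INET
                || Int32(sa.pointee.sa_family) == AF_INET6
        else { continue }
        // Convert the sockaddr to a numeric host string.
        var host = [CChar](repeating: 0, count: Int(NI_MAXHOST))
        guard getnameinfo(
            sa, socklen_t(sa.pointee.sa_len),
            &host, socklen_t(host.count),
            nil, 0,
            NI_NUMERICHOST
        ) == 0 else { continue }
        result[String(cString: nameC), default: []].append(String(cString: host))
    }
    return result
}
```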
Replies: 0 · Boosts: 0 · Views: 2k · Activity: 3w
Vision face landmarks shifted on iOS 26 but correct on iOS 18 with same code and image
I'm using the Vision framework (DetectFaceLandmarksRequest) with the same code and the same test image to detect face landmarks. On iOS 18 everything works as expected: detected face landmarks align with the face correctly. But when I run the same code on devices with iOS 26, the landmark coordinates are outside the [0, 1] range, which puts them outside the face bounds. Fun fact: the old VNDetectFaceLandmarksRequest API works fine and doesn't hit this issue.

How I get face landmarks:

```swift
private let faceRectangleRequest = DetectFaceRectanglesRequest(.revision3)
private var faceLandmarksRequest = DetectFaceLandmarksRequest(.revision3)

func detectFaces(in ciImage: CIImage) async throws -> FaceTrackingResult {
    let faces = try await faceRectangleRequest.perform(on: ciImage)
    faceLandmarksRequest.inputFaceObservations = faces
    let landmarksResults = try await faceLandmarksRequest.perform(on: ciImage)
    ...
}
```

How I show face landmarks in a SwiftUI view:

```swift
private func convert(
    point: NormalizedPoint,
    faceBoundingBox: NormalizedRect,
    imageSize: CGSize
) -> CGPoint {
    let point = point.toImageCoordinates(
        from: faceBoundingBox,
        imageSize: imageSize,
        origin: .upperLeft
    )
    return point
}
```

At the same time, this works as expected and gives me the correct results (region is a FaceObservation.Landmarks2D.Region):

```swift
let points: [CGPoint] = region.pointsInImageCoordinates(
    imageSize,
    origin: .upperLeft
)
```

After that, I found that the landmarks are normalized relative to the unalignedBoundingBox. However, I can't access it in code. Still, using these values for the bounding box works correctly.

Things I've already tried:
- Same image input
- Tested multiple devices on iOS 26.2 -> always wrong
- Tested multiple devices on iOS 18.7.1 -> always correct

Environment:
- macOS 26.2
- Xcode 26.2 (17C52)
- Real devices, not simulator

(Screenshots: face landmarks on iOS 18 vs. iOS 26)
Replies: 0 · Boosts: 0 · Views: 139 · Activity: 2w
How to handle Sign in with Apple server-to-server notifications?
Hello. When a user revokes Apple Login authorization, I expect a webhook to be delivered to our configured endpoint, but I am currently not receiving any at all. So I have some questions:

1. Should the revoke event webhook be delivered in real time?
2. If it is not real time, when is the webhook supposed to be sent?
3. If my server fails to respond to the webhook request, does Apple retry the delivery? (Actually, I couldn't find documentation on how to respond in this scenario.)

Thanks in advance.
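For illustration only, a sketch of a receiving endpoint, assuming a Vapor server. The route path and type names are hypothetical, and the request body is assumed to be JSON carrying a signed `payload` claim set (a JWS), which you must verify against Apple's keys before trusting:

```swift
import Vapor

// Hypothetical shape of Apple's server-to-server notification body:
// a single `payload` field containing a JWS. Verify its signature
// before acting on any event inside it.
struct AppleNotification: Content {
    let payload: String
}

func routes(_ app: Application) throws {
    app.post("apple", "notifications") { req -> HTTPStatus in
        let note = try req.content.decode(AppleNotification.self)
        // Decode and verify the JWS here (e.g. with a JWT library and
        // Apple's published JWKS), then inspect the events claim for
        // types like consent revocation. Verification omitted in this
        // sketch.
        req.logger.info("Received Apple notification, payload prefix: \(note.payload.prefix(16))…")
        // Returning 200 promptly signals successful delivery.
        return .ok
    }
}
```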
Replies: 0 · Boosts: 0 · Views: 71 · Activity: 3w
AVAudioEngine Voice Processing Fails with Mismatched Input/Output Devices: AggregateDevice Channel Count Mismatch
I'm encountering errors while using AVAudioEngine with voice processing enabled (setVoiceProcessingEnabled(true)) in scenarios where the input and output audio devices are not the same. This issue arises specifically with mismatched devices, preventing the application from functioning as expected.

- Works: paired devices (e.g., MacBook Pro mic → MacBook Pro speakers). The setup works as expected.
- Fails: mismatched devices (e.g., AirPods mic → MacBook Pro speakers). AVAudioEngine setup fails during aggregate device construction, and the error logs indicate a channel count mismatch.

Here are the partial logs. Due to the content limit, I cannot post the entire logs.

```
AUVPAggregate.cpp:1000 client-side input and output formats do not match (err=-10875)
AUVPAggregate.cpp:1036 err=-10875
AVAEInternal.h:109 [AVAudioEngineGraph.mm:1344:Initialize: (err = PerformCommand(*outputNode, kAUInitialize, NULL, 0)): error -10875
AggregateDevice.mm:329 Failed expectation of constructed aggregate (312): mInput.streamChannelCounts == inputStreamChannelCounts
AggregateDevice.mm:331 Failed expectation of constructed aggregate (312): mInput.totalChannelCount == std::accumulate(inputStreamChannelCounts.begin(), inputStreamChannelCounts.end(), 0U)
AggregateDevice.mm:182 error fetching default pair
AggregateDevice.mm:329 Failed expectation of constructed aggregate (336): mInput.streamChannelCounts == inputStreamChannelCounts
AggregateDevice.mm:331 Failed expectation of constructed aggregate (336): mInput.totalChannelCount == std::accumulate(inputStreamChannelCounts.begin(), inputStreamChannelCounts.end(), 0U)
AUHAL.cpp:1782 ca_verify_noerr: [AudioDeviceSetProperty(mDeviceID, NULL, 0, isInput, kAudioDevicePropertyIOProcStreamUsage, theSize, theStreamUsage), 560227702]
AudioHardware-mac-imp.cpp:3484 AudioDeviceSetProperty: no device with given ID
AUHAL.cpp:1782 ca_verify_noerr: [AudioDeviceSetProperty(mDeviceID, NULL, 0, isInput, kAudioDevicePropertyIOProcStreamUsage, theSize, theStreamUsage), 560227702]
AggregateDevice.mm:182 error fetching default pair
AggregateDevice.mm:329 Failed expectation of constructed aggregate (348): mInput.streamChannelCounts == inputStreamChannelCounts
AggregateDevice.mm:331 Failed expectation of constructed aggregate (348): mInput.totalChannelCount == std::accumulate(inputStreamChannelCounts.begin(), inputStreamChannelCounts.end(), 0U)
```

Questions:

1. Is it possible to use voice processing with different input/output devices? If yes, are there any specific configurations required to handle mismatched devices?
2. How can we resolve channel count mismatch errors during aggregate device construction? Are there settings or API adjustments to enforce compatibility between input/output devices?
3. Are there any workarounds or alternative approaches to achieve voice processing functionality with mismatched devices? For instance, can we force an intermediate channel configuration or downmix input/output formats?
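For context, a minimal sketch of the kind of setup being described on macOS, using system-default devices; this reproduces the configuration, not a fix:

```swift
import AVFoundation

// Sketch: enable voice processing on the input node and wire input to
// output. With mismatched default devices (e.g. AirPods mic and
// built-in speakers), initialization can fail as in the logs above.
let engine = AVAudioEngine()

do {
    // Voice processing must be enabled before the engine starts;
    // enabling it on the input node also affects the output node.
    try engine.inputNode.setVoiceProcessingEnabled(true)

    let format = engine.inputNode.outputFormat(forBus: 0)
    engine.connect(engine.inputNode, to: engine.mainMixerNode, format: format)

    try engine.start()
} catch {
    // err=-10875 (kAudioUnitErr_FormatNotSupported) surfaces here when
    // the aggregate device construction fails.
    print("Engine setup failed: \(error)")
}
```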
Replies: 0 · Boosts: 0 · Views: 194 · Activity: 2w
WKWebView returns an 'unsupported type' error on every function after the first success, while another webview works flawlessly
I'm working on a SwiftUI application that uses a couple of different webviews in a tabview to render some MDX and a CodeMirror editor. The editor webview, the one that's much more complicated, works as expected. Some errors appear in the console due to what I imagine is a race condition that I'll get around to fixing, but it works as expected.

The other webview, which just renders a single local HTML file to display a dead-simple summary, absolutely refuses to work. It first appears to work as expected on the initial request (though it shows the same 'return type unsupported' error in the console), but then refuses to process any JS functions for that particular webview. The functions themselves are even shared between the two webviews, and they work as expected in the other one. Even worse, when I copy and paste the generated JS code into the Safari dev tools it works as expected, even in the broken webview.

I've spent almost 12 hours on this today so far, and have made zero progress. I've tried commenting out just about the entire website to narrow it down on the JS side without success, and I've done everything I can think of on the Swift side. To be transparent, I'm very new to Swift and SwiftUI, having only picked it up a few weeks ago, but I'm an experienced developer and every obvious solution fails to work.

From what I've gathered, this might have something to do with the first function call failing, despite the fact that it appears to work on the first function call, and the JavaScript engine then refusing to process additional requests. I'm not sure if that's the cause, but it certainly seems to make sense. Is there a way to debug this more completely? Like I said, I'm very new to Swift and still missing Neovim, so I'm still getting comfortable with the Apple ecosystem of devtools, but I can't even figure out how to print out the return type, since it fails before I'm able to inspect anything on the Safari side.

I did notice this error in the console as well, and I'm not quite sure what to make of it:

```
Error acquiring assertion: <Error Domain=RBSServiceErrorDomain Code=1 "((target is not running or doesn't have entitlement com.apple.developer.web-browser-engine.rendering AND target is not running or doesn't have entitlement com.apple.developer.web-browser-engine.networking AND target is not running or doesn't have entitlement com.apple.developer.web-browser-engine.webcontent))" UserInfo={NSLocalizedFailureReason=((target is not running or doesn't have entitlement com.apple.developer.web-browser-engine.rendering AND target is not running or doesn't have entitlement com.apple.developer.web-browser-engine.networking AND target is not running or doesn't have entitlement com.apple.developer.web-browser-engine.webcontent))}>
```

And as I was digging for that one, I just noticed this:

```
WebContent[32934] 0x102011208 - [webPageID=306] WebPage::runJavaScriptInFrameInScriptWorld: Request to run JavaScript failed with error SecurityError: The operation is insecure.
```

What would trigger a security warning for running JavaScript against a local file? Any help is greatly appreciated... this is driving me crazy.
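One avenue worth checking, sketched below under the assumption that the failing calls go through evaluateJavaScript: that API reports an 'unsupported type' error when the script's completion value isn't a plain serializable type (a Promise, function, or DOM node, for instance), and callAsyncJavaScript is the promise-aware alternative. The page function name here is hypothetical:

```swift
import WebKit

// Sketch: two ways to run page JS and inspect what comes back.
func runScript(in webView: WKWebView) {
    // evaluateJavaScript surfaces 'unsupported type' when the last
    // expression isn't JSON-like; log the error to see the details.
    webView.evaluateJavaScript("document.title") { result, error in
        if let error { print("JS failed: \(error)") }
        else { print("JS returned: \(String(describing: result))") }
    }

    // For async/promise-based JS, callAsyncJavaScript awaits the
    // promise and bridges the resolved value back.
    webView.callAsyncJavaScript(
        "return await someAsyncWork();",  // hypothetical page function
        arguments: [:],
        in: nil,
        in: .page
    ) { result in
        print("Async JS result: \(result)")
    }
}
```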
Replies: 0 · Boosts: 1 · Views: 267 · Activity: 3w
A Peek Behind the NECP Curtain
From time to time the subject of NECP comes up, both here on DevForums and in DTS cases. I’ve posted about this before but I wanted to collect those tidbits into a single coherent post. If you have questions or comments, start a new thread in the App & System Services > Networking subtopic and tag it with Network Extension. That way I’ll be sure to see it go by.

Share and Enjoy
—
Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"

A Peek Behind the NECP Curtain

NECP stands for Network Extension Control Protocol. It’s a subsystem within the Apple networking stack that controls which programs have access to which network interfaces. It’s vitally important to the Network Extension subsystem, hence the name, but it’s used in many different places. Indeed, a very familiar example of its use is the Settings > Mobile Data [1] user interface on iOS.

NECP has no explicit API, although there are APIs that offer some insight into its state. Continuing the Settings > Mobile Data example above, there is a little-known API, CTCellularData in the Core Telephony framework, that returns whether your app has access to WWAN.

Despite having no API, NECP is still relevant to developers. The Settings > Mobile Data example is one place where it affects app developers, but it’s most important for Network Extension (NE) developers. A key use case for NECP is to prevent VPN loops. When starting an NE provider, the system configures the NECP policy for the NE provider’s process to prevent it from using a VPN interface. This means that you can safely open a network connection inside your VPN provider without having to worry about its traffic being accidentally routed back to you. This is why, for example, an NE packet tunnel provider can use any networking API it wants, including BSD Sockets, to run its connection without fear of creating a VPN loop [2].

One place that NECP shows up regularly is the system log. Next time you see a system log entry like this:

```
type: debug
time: 15:02:54.817903+0000
process: Mail
subsystem: com.apple.network
category: connection
message: nw_protocol_socket_set_necp_attributes [C723.1.1:1] setsockopt 39 SO_NECP_ATTRIBUTES
```

… you’ll at least know what the necp means (-:

Finally, a lot of NECP infrastructure is in the Darwin open source. As with all things in Darwin, it’s fine to poke around and see how your favourite feature works, but do not incorporate any information you find into your product. Stuff you uncover by looking in Darwin is not considered API.

[1] Settings > Cellular Data if you speak American (-:

[2] Network Extension providers can call the createTCPConnection(to:enableTLS:tlsParameters:delegate:) method to create an NWTCPConnection [3] that doesn’t run through the tunnel. You can use that if it’s convenient but you don’t need to use it.

[3] NWTCPConnection is now deprecated, but there are non-deprecated equivalents. For the full story, see NWEndpoint History and Advice.

Revision History

- 2025-12-12 Replaced “macOS networking stack” with “Apple networking stack” to avoid giving the impression that this is all about macOS. Added a link to NWEndpoint History and Advice. Made other minor editorial changes.
- 2023-02-27 First posted.
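As an aside, a minimal usage sketch of the CTCellularData API the post mentions (iOS only; the print messages are illustrative):

```swift
import CoreTelephony

// Sketch: observe whether this app's cellular data access is
// restricted, which reflects the Settings > Mobile Data toggle.
let cellularData = CTCellularData()
cellularData.cellularDataRestrictionDidUpdateNotifier = { state in
    switch state {
    case .restricted:             print("Cellular data is restricted for this app")
    case .notRestricted:          print("Cellular data is allowed")
    case .restrictedStateUnknown: print("Restriction state unknown")
    @unknown default:             print("Unrecognized state")
    }
}
```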
Replies: 0 · Boosts: 0 · Views: 2.4k · Activity: 2w
Even though I added the EULA to my application, it keeps getting rejected for the same reason
Hello, I have a problem regarding my App Review. The issue:

"Guideline 3.1.2 - Business - Payments - Subscriptions

Issue Description: The submission did not include all the required information for apps offering auto-renewable subscriptions. The app's metadata is missing the following required information: a functional link to the Terms of Use (EULA). If you are using the standard Apple Terms of Use (EULA), include a link to the Terms of Use in the App Description. If you are using a custom EULA, add it in App Store Connect."

My explanation: I'm using Apple's standard EULA in my app. The EULA link was already in the "About" section; it redirects to the EULA page on Apple's official website. However, since my app also has a payment screen, I added the same information below the payment screen and submitted it for review again. Unfortunately, it's still being rejected for the same reason. I've requested a full, detailed explanation via email, but they keep sending me the same copy-and-paste message.

I want to resolve this issue as soon as possible, so I decided to ask here. Please check the screenshots and let me know what the problem is.

Best regards, DryreL
Replies: 0 · Boosts: 0 · Views: 90 · Activity: 3w