I am experiencing an issue where Xcode reverts the .xccurrentversion file in my iOS app to the first version whenever xcodebuild is run or whenever Xcode is started. I can still build the app and run tests in Xcode if I discard the reverted .xccurrentversion after Xcode starts. However, testing on CI is impossible, because the version the tests rely on is reverted whenever xcodebuild is run.
The commands I run to reproduce the issue:
❯ git status
Changes not staged for commit:
(use "git add <file>..." to update what will be committed)
(use "git restore <file>..." to discard changes in working directory)
modified: Path/.xccurrentversion
no changes added to commit (use "git add" and/or "git commit -a")
❯ git checkout "Path/.xccurrentversion"
Updated 1 path from the index
❯ git status
nothing to commit, working tree clean
❯ xcodebuild \
-scheme Scheme \
-configuration Configuration \
-sdk iphonesimulator \
-destination 'platform=iOS Simulator,name=iPhone 16 Pro,OS=latest' \
-skipPackagePluginValidation \
-skipMacroValidation \
test > /dev/null # test fails because model version is reverted
❯ git status
HEAD detached at pull/249/merge
Changes not staged for commit:
(use "git add <file>..." to update what will be committed)
(use "git restore <file>..." to discard changes in working directory)
modified: Path/.xccurrentversion
no changes added to commit (use "git add" and/or "git commit -a")
I have experienced this issue in Xcode 16.3 (16E140) and 16.2 (16C5032a).
The following are similar issues/solutions I have found online, but they are either not relevant or do not work in my case:
https://stackoverflow.com/questions/17631587/xcode-modifies-current-coredata-model-version-at-every-launch
https://github.com/CocoaPods/Xcodeproj/issues/81
Is anyone aware of a solution? Is there a recommended way I can run diagnostics on Xcode and file feedback?
View Layout
Add the following views in a view controller:
Label
View A, with a subview of the same size: MTKView A
View B, with a subview of the same size: MTKView B
Refresh Rates of Each View
The label view refreshes at 60fps (driven by CADisplayLink).
MTKView A and B refresh at 15fps.
MTKView Implementation Details
The corresponding CAMetalLayer's maximumDrawableCount is set to 2, i.e. switched to double buffering.
The scheduling mechanism is modified; drawing is not driven by the internal loop but is done manually. The draw call is triggered immediately upon receiving a frame.
self.metalView.enableSetNeedsDisplay = NO;
self.metalView.paused = YES;
A new high-priority queue is created for drawing, instead of handling it on the main queue (see the sketch after this list).
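In Swift, the setup described above looks roughly like this (a sketch only; ManualRenderer and the queue label are placeholder names, and the original code is Objective-C):

import MetalKit

final class ManualRenderer {
    let metalView: MTKView
    // Dedicated high-priority queue so drawing is not handled on the main queue.
    private let drawQueue = DispatchQueue(label: "render", qos: .userInteractive)

    init(metalView: MTKView) {
        self.metalView = metalView
        // Disable MTKView's internal CADisplayLink-driven scheduling.
        metalView.enableSetNeedsDisplay = false
        metalView.isPaused = true
        // Double buffering instead of the default triple buffering.
        (metalView.layer as? CAMetalLayer)?.maximumDrawableCount = 2
    }

    // Called as soon as a new frame arrives; triggers the draw immediately.
    func frameArrived() {
        drawQueue.async { self.metalView.draw() }
    }
}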
MTKView Latency Tracking
The GPU completion time T1 is observed through the addCompletedHandler callback of the CommandBuffer.
The presentation time T2 of the frame is observed through the addPresentedHandler callback of the currentDrawable in MTKView.
Testing shows that T2 - T1 > 16.6ms (the Vsync period at 60Hz). This means that after the GPU rendering in the MTKView finishes, the frame is not displayed at the next Vsync, but only at the Vsync after that.
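For reference, here is a simplified sketch of how T1 and T2 are captured (assuming a command queue has been created; encoding and error handling omitted):

import MetalKit

final class Renderer: NSObject, MTKViewDelegate {
    let commandQueue: MTLCommandQueue

    init(device: MTLDevice) {
        self.commandQueue = device.makeCommandQueue()!
    }

    func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

    func draw(in view: MTKView) {
        guard let drawable = view.currentDrawable,
              let commandBuffer = commandQueue.makeCommandBuffer() else { return }

        // ... encode the rendering work here ...

        // T1: when the GPU finishes executing this command buffer.
        commandBuffer.addCompletedHandler { buffer in
            print("T1 (GPU end): \(buffer.gpuEndTime)")
        }
        // T2: when the frame is actually shown on screen.
        drawable.addPresentedHandler { presented in
            print("T2 (presented): \(presented.presentedTime)")
        }

        commandBuffer.present(drawable)
        commandBuffer.commit()
    }
}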
I believe there is an extra 16.6ms of latency here, which I want to eliminate by adjusting the rendering mechanism.
Observation from Instruments
In Instruments, the surface presentation aligns with the test results above: after the Metal encoder finishes, the surface in the Display track switches only at the second Vsync after that. See the image in the link for details.
Questions
My (beginner's) understanding is that once MTKView's GPU rendering is finished, the frame should become visible at the next Vsync. However, this is not what is observed. Does a subview MTKView need to wait an extra Vsync cycle before it is drawn to the actual display buffer?
The label updates its text at 60fps, so the entire interface should be displayed at 60fps. Is the MTKView's content not synchronized when that display happens?
Explanation of the Reasoning Behind Some MTKView Code Details
Changing from the default triple buffering to double buffering helps reduce the latency introduced by rendering.
We trigger the draw method manually instead of using MTKView's own scheduling because MTKView's scheduling is driven by CADisplayLink: if a frame arrives within a Vsync window, the draw is not triggered until the next Vsync window, which introduces waiting latency.
Hello,
I am a new application developer and have been developing several applications in the productivity and finance categories concurrently for about a year. One of my applications is nearly ready to be submitted to the App Store.
People who have submitted apps have strongly discouraged me from releasing mine as a paid app; however, given all of the upfront and ongoing investment I've made, I do not wish to release my application for free initially.
(I am learning how best to integrate StoreKit, in-app purchases, and subscriptions, but I'm not ready to implement that yet.)
QUESTION:
When releasing an app as a paid app and then converting it to a FREE app with a subscription later on, is there anything I need to be aware of, technically or with regard to the guidelines, so I don't shoot myself in the foot when changing the pricing?
Any advice is greatly appreciated. Thank you.
Topic: App Store Distribution & Marketing
SubTopic: General
Tags: App Store, App Review, In-App Purchase
My team has developed an app with a Matter commissioner feature (for our own ecosystem) using the Matter framework on top of the MatterSupport extension.
Recently, we've noticed that commissioning Matter devices with the MatterSupport extension has become very unstable. Occasionally, HomeUIService stops the flow after successfully commissioning to the first fabric, displaying the error: "Failed to perform Matter device setup: Error Domain=HMErrorDomain Code=2." (Normally, it should send an open-commissioning-window command to the device and then add the device to the second fabric.) The issue was never seen until the last few weeks, and there have been no code changes in the app. We suspect that some data fails to download from iCloud or the Apple account, causing this problem.
For evaluation, we tried removing the MatterSupport extension and running the Matter framework directly in developer mode; the issue disappears, and commissioning works without any problems.
Topic: App & System Services
SubTopic: Core OS
Tags: HomeKit, Provisioning Profiles, Matter, ThreadNetwork
Can I use them in SK and do the animations work?
Thanks, Patrick
Hi everyone,
I'm running into an issue with AVAudioRecorder when handling interruptions such as phone calls or alarms.
Problem:
When the app is recording audio and an interruption occurs:
I handle the interruption with audioRecorder?.pause() inside AVAudioSession.interruptionNotification (on .began).
On .ended, I check for .shouldResume and call audioRecorder?.record() again.
The recorder resumes successfully, but only the audio recorded after the interruption is saved. The audio recorded before the interruption is lost, even though I'm using the same file URL and not recreating the recorder.
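For reference, the handling code is essentially this (a simplified sketch; audioRecorder is the AVAudioRecorder property in my app, and the observer for AVAudioSession.interruptionNotification is registered elsewhere):

import AVFoundation

@objc func handleInterruption(_ notification: Notification) {
    guard let info = notification.userInfo,
          let typeValue = info[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSession.InterruptionType(rawValue: typeValue) else { return }

    switch type {
    case .began:
        audioRecorder?.pause()      // same recorder instance, same file URL
    case .ended:
        let optionsValue = info[AVAudioSessionInterruptionOptionKey] as? UInt ?? 0
        let options = AVAudioSession.InterruptionOptions(rawValue: optionsValue)
        if options.contains(.shouldResume) {
            audioRecorder?.record() // should resume into the same file
        }
    @unknown default:
        break
    }
}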
Repro:
Start a recording with AVAudioRecorder
Simulate a system interruption (e.g., incoming call)
Resume recording after the interruption
Stop and inspect the output audio file
Expected: Full audio (before and after interruption) should be saved.
Actual: Only the audio after interruption is saved; the earlier part is missing
Notes:
According to the documentation, calling .record() after .pause() should resume recording into the same file.
I confirmed that the file URL does not change, and I do not recreate the recorder instance.
No error is thrown by the system during this process.
This behavior happens consistently when the app is interrupted and resumed.
Question:
Is this a known issue? Is there a recommended workaround for preserving the full recording when interruptions happen?
Thanks in advance!
I am able to symbolicate kernel backtraces for addresses that belong to my kext.
Is it possible to symbolicate kernel backtraces for addresses that lie beyond my kext and reference kernel code?
Sample kernel panic log
Is it possible to use the Matter.xcframework without the MatterSupport extension for onboarding a Matter device to our own ecosystem (own OTBR and Matter controller) for an official App Store release?
Currently, we can achieve this in developer mode by adding the Bluetooth Central Matter Client Developer mode profile (as outlined here: https://github.com/project-chip/connectedhomeip/blob/master/docs/guides/darwin.md). For an official release, what entitlements or capabilities do we need to request approval for from Apple to replace the Bluetooth Central Matter Client Developer mode profile?
Thank you for your assistance.
I am developing the electronic part of a product that includes the Find My feature. I saw in some forums that testing the Find My feature requires a CSR and a testing token. Can anyone walk me through how to apply for the CSR and testing token step by step?
Thank you very much.
Best regards
Sam Ng
Hi
When attempting to upload a React Native app (version 0.77), we encountered the following error:
ITMS-90426: Invalid Swift Support - The SwiftSupport folder is missing. Rebuild your app using the current public (GM) version of Xcode and resubmit it.
If we check the .ipa contents, we can see that it includes only two folders:
Payload
Symbols
Could you please tell us why it does not include the SwiftSupport folder?
We tried Xcode 16.2 and 16.3.
Thank you
Hi,
We've noticed that this issue occurs more frequently after upgrading to iOS 18.4.1 and can result in one-way audio.
Our app uses CallKit with WebRTC to establish VoIP connections.
However, on iOS 18.4.1, CallKit no longer triggers:
func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession)
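For context, the relevant part of our CXProviderDelegate looks roughly like this (simplified sketch; the WebRTC hand-off is shown as comments, using the RTCAudioSession API from the WebRTC iOS SDK):

import CallKit
import AVFAudio

final class CallManager: NSObject, CXProviderDelegate {
    func providerDidReset(_ provider: CXProvider) {
        // Tear down any active calls here.
    }

    // On iOS 18.4.1 this callback is no longer invoked, so WebRTC never
    // learns that the audio session is active, leading to one-way audio.
    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        // Hand the activated session to WebRTC, e.g.:
        // RTCAudioSession.sharedInstance().audioSessionDidActivate(audioSession)
    }

    func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
        // Mirror the teardown, e.g.:
        // RTCAudioSession.sharedInstance().audioSessionDidDeactivate(audioSession)
    }
}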
We're currently comparing the occurrence rate across different iOS versions to better understand the impact.
Could you please help analyze the root cause of this issue?
When I use my iPhone to scan the Apple Pay QR code in Chrome (the URL is https://applepaydemo.apple.com/apple-pay-js-api), I keep getting a "Service Unavailable" error.
I wonder if you guys have run into this error as well? By the way, the QR code feature requires iOS 18.
Hi, I have a couple of questions about background app refresh. First, is the function RefreshAppContentsOperation() where I should implement the code that needs to run in the background? Second, despite importing BackgroundTasks, I am getting the error "cannot find operationQueue in scope". What can I do to resolve that? Thank you.
func scheduleAppRefresh() {
    let request = BGAppRefreshTaskRequest(identifier: "peaceofmindmentalhealth.RoutineRefresh")
    // Fetch no earlier than 15 minutes from now.
    request.earliestBeginDate = Date(timeIntervalSinceNow: 15 * 60)
    do {
        try BGTaskScheduler.shared.submit(request)
    } catch {
        print("Could not schedule app refresh: \(error)")
    }
}

func handleAppRefresh(task: BGAppRefreshTask) {
    // Schedule a new refresh task.
    scheduleAppRefresh()
    // Create an operation that performs the main part of the background task.
    let operation = RefreshAppContentsOperation()
    // Provide the background task with an expiration handler that cancels the operation.
    task.expirationHandler = {
        operation.cancel()
    }
    // Inform the system that the background task is complete
    // when the operation completes.
    operation.completionBlock = {
        task.setTaskCompleted(success: !operation.isCancelled)
    }
    // Start the operation.
    operationQueue.addOperation(operation)
}

func RefreshAppContentsOperation() -> Operation {
}
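From reading the docs, I'm guessing the missing pieces would look something like this, though I'm not sure (AppRefreshOperation is just a name I made up):

import BackgroundTasks

// A queue to run the refresh operation on; declaring one should fix
// the "cannot find operationQueue in scope" error.
let operationQueue = OperationQueue()

// An Operation subclass whose main() contains the background work,
// instead of a function that returns an empty Operation.
final class AppRefreshOperation: Operation {
    override func main() {
        guard !isCancelled else { return }
        // Do the actual refresh work here.
    }
}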
There are multiple reports of crashes in URLConnectionLoader::loadWithWhatToDo. The crashed thread in the stack traces points to calls inside CFNetwork, which seems to be an internal library in iOS.
The crash has been happening for quite a while (we cannot tell exactly when it started) and has impacted multiple iOS versions, from iOS 15.4 to 18.4.1, recorded in the Xcode crash report organizer so far.
Unfortunately, we have no idea how to reproduce it yet, but the crash keeps increasing and affects more iOS 18 users (which makes sense, because many people have updated their iOS to the newer version). We haven't found any clue in the crash reports as to what actually happened or how to fix it. What we understand is that it seems to come from a network request, but we need more information on what condition actually triggers it and how to solve it.
I have attached sample crash reports for both iOS 15 and 18.
I have also submitted a feedback report (which includes more crash reports): FB17775979.
We would appreciate any insight into this issue and any resolution we can apply to avoid it.
iOS 15.crash
iOS 18.crash
My app is built in Flutter and has a native widget in it. The widget's Info.plist file has the NSExtensionPrincipalClass key, pointing to VerseOfTheDayWidgetBundle as a string. However, the Transporter app throws an error:
Validation failed (409)
Unexpected Info.plist key. Unexpected key NSExtensionPrincipalClass found in extension Info.plist for Payload/Runner.app/PlugIns/VerseOfTheDayWidgetExtension.appex. (ID: 266e2dd8-44b9-4d67-97d9-d7d47013cff9)
I tried removing NSExtensionPrincipalClass, but then the simulator throws an error asking for either NSExtensionPrincipalClass or a storyboard class. My widget is written entirely in SwiftUI and has no storyboard, which is why I am using NSExtensionPrincipalClass in the first place. I tried to bypass this by creating two Info.plist files for my widget (debug and release); this got past the check in Transporter, but then the app wouldn't install from TestFlight, confirming the same issue as in the simulator.
I can confirm that my app's own Info.plist makes no mention of NSExtensionPrincipalClass, so that isn't what is causing this error.
I have tried a lot of things, but nothing works; at this point I suspect something may be wrong with the Transporter app's checks. I'm not sure, please guide me.
The widget and the app work completely fine in the simulator with NSExtensionPrincipalClass; it's just Transporter that does not like it being there.
Topic: Developer Tools & Services
SubTopic: Xcode
In previous versions of the simulator, it was possible to import files into the Files app by dragging them from the Finder into the Simulator. It appears that in the iOS 26 Simulator, this opens the file in Safari.
I've only tried it with .json files so far.
The documentation at https://developer.apple.com/documentation/xcode/sharing-data-with-simulator says that the original behaviour should happen:
To add files to Simulator, select one or more files in Finder on your Mac, then click the Share button. Select Simulator from the share destination list. Choose the simulated device from the drop-down list. Simulator opens the Files app, and lets you select where to save the files.
I'd love to learn if this is intentional behaviour, and if so, what workarounds there might be. I use this pattern quite a lot, as I have a HealthKit app, and I've built a system that allows me to export workouts as JSON files from a real device, that I can then import into a simulator for testing.
Edit: I found a workaround. Make a folder in Files.app, then search for it within ~/Library/Developer/CoreSimulator/Devices. Open the folder in Finder, then add any files you want to be available in the Simulator.
I'm pleased to share some significant updates that have recently been released for our Hypervisor and Virtualization frameworks. We've focused on enhancing efficiency, expanding capabilities, and addressing common developer needs. I believe these will be valuable for many of you.
Here’s a look at what’s new:
Hypervisor Updates
We've introduced support for configuring the intermediate physical address (IPA) memory granularity of a VM. This allows for finer-grained memory mappings, with granularity sizes down to 4KB, which is particularly useful for certain specialized device drivers requiring finer memory control.
Virtualization Framework Updates
More Efficient VM Image Storage with ASIF: We've integrated support for the Apple Sparse Image Format (ASIF). This results in a smaller disk footprint and optimized transfer for VM disk images when using VZDiskImageStorageDeviceAttachment, improving storage efficiency (see the sketch after this list).
Custom Network Topologies with vmnet: We've added support for vmnet custom network topologies. This enables more flexible VM-to-VM communication based on logical networks with customized configurations, useful for complex testing or development environments. See VZVmnetNetworkDeviceAttachment to get started.
Simplified VM Queue Discovery: It's now easier to discover the dispatch queue a VM operates on, thanks to a new property on VZVirtualMachine. This should aid in development and debugging when interacting directly with the VM's queue.
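As a quick illustration, here is a minimal sketch of attaching a disk image (which may now be an ASIF image) to a VM configuration; the path is a placeholder:

import Virtualization

// Attach a disk image via the existing attachment API; ASIF images
// go through VZDiskImageStorageDeviceAttachment like any other image.
let diskURL = URL(fileURLWithPath: "/path/to/disk.asif")
let attachment = try VZDiskImageStorageDeviceAttachment(url: diskURL, readOnly: false)
let blockDevice = VZVirtioBlockDeviceConfiguration(attachment: attachment)

let configuration = VZVirtualMachineConfiguration()
configuration.storageDevices = [blockDevice]
// ... add CPU, memory, boot loader, etc., then validate and create the VM.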
These are some of the key highlights of the first beta, and I'm looking forward to seeing how these improvements will be utilized. I encourage you to explore the documentation for full details on these features.
It looks like ExtensionKit (and ExtensionFoundation) is fully available on iOS 26, but there is no mention of this at WWDC.
From my testing, as of beta 1, ExtensionKit allows an app from one dev team to launch an extension provided by another dev team. Before we start building on this, can someone from Apple help confirm this is intentional behavior and not just a beta 1 thing?
Has anyone been able to use Google Gemini with Intelligence in Xcode? I got the Claude models working there, but try as I might, I can't see any Gemini models.
Topic: Developer Tools & Services
SubTopic: Xcode
Just wanted to check here to see if anyone else is running into the issue of CarPlay not working at all on iOS 26 Beta 1, even with the update on Friday.
I plug my phone in (wired) and CarPlay never shows up. I've seen a Reddit thread where other folks are seeing the same thing.