(Xcode 26.2, iPhone 17 Pro)
I can't seem to get hardware tag checks to work in an app launched without the special "Hardware Memory Tagging" diagnostics. In other words, I have been unable to reproduce the crash example at 6:40 in Apple's video "Secure your app with Memory Integrity Enforcement".
When I write a heap overflow or a UAF, it is picked up perfectly provided I enable the "Hardware Memory Tagging" feature under Scheme Diagnostics.
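For concreteness, here is a minimal sketch of the kind of bug I mean (my own test, not Apple's sample; the sizes are illustrative):

import Foundation

// One-byte heap overflow past a 16-byte allocation. With tag checks
// active, the adjacent granule should carry a different tag, so this
// store should fault under the diagnostic.
let raw = malloc(16)!
let buf = raw.assumingMemoryBound(to: UInt8.self)
buf[16] = 0x41   // one byte out of bounds
free(raw)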
If I instead add the Enhanced Security capability with the memory-tagging related entitlements:
- I'm seeing distinct memory tags assigned in pointers returned by malloc (without the capability, this is not the case)
- Tag mismatches are not being caught or enforced, regardless of soft mode
The behaviour is the same whether I launch from Xcode without "Hardware Memory Tagging" or launch the app by tapping its icon on the Home Screen. In case it was related to debug builds, I also tried creating an ad hoc IPA, and it made no difference.
I realise there's a wrinkle here: the debugger sets MallocTagAll=1, so it may pick up a wider range of issues. However, I would have expected a straight use-after-free to be caught. For example, this test code demonstrates that tagging is active, but it doesn't crash:
#include <stdint.h>
#include <stdlib.h>
#include <os/log.h>

// Extract the logical memory tag from bits 59:56 of the pointer.
#define PTR_TAG(p) ((unsigned)(((uintptr_t)(p) >> 56) & 0xF))

void *p1 = malloc(32);
void *p2 = malloc(32);
void *p3 = malloc(32);
os_log(OS_LOG_DEFAULT, "p1 = %p (tag: %u)", p1, PTR_TAG(p1));
os_log(OS_LOG_DEFAULT, "p2 = %p (tag: %u)", p2, PTR_TAG(p2));
os_log(OS_LOG_DEFAULT, "p3 = %p (tag: %u)", p3, PTR_TAG(p3));

free(p2);
void *p2_realloc = malloc(32);
os_log(OS_LOG_DEFAULT, "p2 after free+malloc = %p (tag: %u)", p2_realloc, PTR_TAG(p2_realloc));

// Is p2_realloc the same address as p2 but with a different tag?
os_log(OS_LOG_DEFAULT, "Same address? %s",
       ((uintptr_t)p2 & 0x00FFFFFFFFFFFFFF) == ((uintptr_t)p2_realloc & 0x00FFFFFFFFFFFFFF)
           ? "YES" : "NO");

// Now try to use the OLD pointer p2
os_log(OS_LOG_DEFAULT, "Attempting use-after-free via old pointer p2...");
volatile char c = *(volatile char *)p2; // Should this crash?
os_log(OS_LOG_DEFAULT, "Read succeeded! Value: %d", c);
Example output:
p1 = 0xf00000b71019660 (tag: 15)
p2 = 0x200000b711958c0 (tag: 2)
p3 = 0x300000b711958e0 (tag: 3)
p2 after free+malloc = 0x700000b71019680 (tag: 7)
Same address? NO
Attempting use-after-free via old pointer p2...
Read succeeded! Value: -55
For reference, these are my entitlements.
<dict>
    <key>application-identifier</key>
    <string>…</string>
    <key>com.apple.developer.team-identifier</key>
    <string>…</string>
    <key>com.apple.security.hardened-process</key>
    <true/>
    <key>com.apple.security.hardened-process.checked-allocations</key>
    <true/>
    <key>com.apple.security.hardened-process.checked-allocations.enable-pure-data</key>
    <true/>
    <key>com.apple.security.hardened-process.dyld-ro</key>
    <true/>
    <key>com.apple.security.hardened-process.enhanced-security-version</key>
    <integer>1</integer>
    <key>com.apple.security.hardened-process.hardened-heap</key>
    <true/>
    <key>com.apple.security.hardened-process.platform-restrictions</key>
    <integer>2</integer>
    <key>get-task-allow</key>
    <true/>
</dict>
What do I need to do to make Memory Integrity Enforcement do something outside the debugger?
I've created node-accelerate.
High-performance Apple Accelerate framework bindings for Node.js. Get up to 305x faster matrix operations and 5-10x faster vector operations on Apple Silicon (M1/M2/M3/M4).
https://www.npmjs.com/package/@digitaldefiance/node-accelerate
Topic:
Community
SubTopic:
Apple Developers
It is vital for Apple to refine its OCR models to correctly distinguish between Khmer and Thai scripts. Incorrectly labeling Khmer text as Thai is more than a technical bug; it is a culturally insensitive error that impacts national identity, especially given the current geopolitical climate between Cambodia and Thailand. Implementing a more robust language-detection threshold would prevent these harmful misidentifications.
There is a significant logic flaw in the VNRecognizeTextRequest language detection when processing Khmer script. When the property automaticallyDetectsLanguage is set to true, the Vision framework frequently misidentifies Khmer characters as Thai.
While both scripts share historical roots, they are distinct languages with different alphabets. Currently, the model’s confidence threshold for distinguishing between these two scripts is too low, leading to incorrect OCR output in both developer-facing APIs and Apple’s native ecosystem (Preview, Live Text, and Photos).
import SwiftUI
import Vision

class TextExtractor {
    func extractText(from data: Data, completion: @escaping (String) -> Void) {
        let request = VNRecognizeTextRequest { request, error in
            guard let observations = request.results as? [VNRecognizedTextObservation] else {
                completion("No text found.")
                return
            }
            let recognizedStrings = observations.compactMap { observation -> String? in
                // Avoid force-unwrapping: topCandidates(1) can be empty.
                guard let str = observation.topCandidates(1).first?.string else { return nil }
                return "{text: \(str), confidence: \(observation.confidence)}"
            }
            completion(recognizedStrings.joined(separator: "\n"))
        }
        request.automaticallyDetectsLanguage = true // <-- This is the issue.
        request.recognitionLevel = .accurate

        let handler = VNImageRequestHandler(data: data, options: [:])
        DispatchQueue.global(qos: .background).async {
            do {
                try handler.perform([request])
            } catch {
                completion("Failed to perform OCR: \(error.localizedDescription)")
            }
        }
    }
}
Recognizing Khmer
Confidence score is low for Khmer text. (The output is in Thai, with a low confidence score.)
Recognizing English
Confidence score is high, as expected.
Recognizing Thai
Confidence score is high, as expected.
Issues on Preview, Photos
Khmer text
Copied text
Kouk Pring Chroum Temple [19121 รอาสายสุกตีนานยารรีสใหิสรราภูชิตีนนสุฐตีย์ [รุก
เผือชิษาธอยกัตธ์ตายตราพาษชาณา ถวเชยาใบสราเบรถทีมูสินตราพาษชาณา ทีมูโษา เช็ก
อาษเชิษฐอารายสุกบดตพรธุรฯ ตากร"สุก"ผาตากรธกรธุกเยากสเผาพศฐตาสาย รัอรณาษ"ตีพย"
สเผาพกรกฐาภูชิสาเครๆผู:สุกรตีพาสเผาพสรอสายใผิตรรารตีพสๆ เดียอลายสุกตีน
ธาราชรติ ธิพรหณาะพูชุบละเาหLunet De Lajonquiere ผารูกรสาราพารผรผาสิตภพ ตารสิทูก ธิพิ
คุณที่นสายเระพบพเคเผาหนารเกะทรนภาษเราภุพเสารเราษทีเลิกสญาเราหรุฬารชสเกาก เรากุม
สงสอบานตรเราะากกต่ายภากายระตารุกเตียน
Recommended Solutions
1. Set a Threshold
Filter out detected results whose confidence is less than or equal to 0.5, so that low-quality text that can lead to this issue is not emitted.
For example,
let recognizedStrings = observations.compactMap { observation -> String? in
    if observation.confidence <= 0.5 {
        return nil
    }
    guard let str = observation.topCandidates(1).first?.string else {
        return nil
    }
    return "{text: \(str), confidence: \(observation.confidence)}"
}
2. Add Khmer Language Support
This issue would not occur at all if the model could detect and recognize Khmer-language text in images.
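In the meantime, a possible mitigation sketch (it disables automatic detection entirely, so it only prevents wrong-language output; it does not add Khmer recognition):

import Vision

// Pin the recognition languages instead of auto-detecting, so the model
// never falls back to Thai for Khmer input.
let request = VNRecognizeTextRequest()
request.automaticallyDetectsLanguage = false
request.recognitionLanguages = ["en-US"]   // the languages your app actually expects
request.recognitionLevel = .accurate

// Check at runtime whether the current model revision lists Khmer.
if let supported = try? request.supportedRecognitionLanguages() {
    print("Khmer supported:", supported.contains("km-KH"))
}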
Doc2Text GitHub: https://github.com/seanghay/Doc2Text-Swift
In an AppleScript applet, compiling and exporting in Script Editor replaces a custom icon with the default one. To retain a custom icon, it is necessary, after exporting, to use Finder's Get Info window to copy the icon from another file and paste it onto the applet's icon. The custom icon is stored in the "Icon?" file at the root of the applet bundle. The applet can then be signed and notarized.
With macOS Tahoe, that procedure no longer works, because the notarization process now wipes the "Icon?" file: the file remains in place but has zero size, so Finder shows the default applet icon.
Does anyone know of a way to provide a custom icon for a signed and notarized AppleScript applet?
Hi,
I was testing the new iOS 18 behavior where NSPersistentCloudKitContainer wipes the local Core Data store if the user logs out of iCloud, for privacy purposes.
I ran the tests both with a Core Data + CloudKit app, and a simple one using SwiftData with CloudKit enabled. Results were identical in either case.
In my testing, most of the time the feature worked as expected. When I disabled iCloud for my app, the data was wiped (consistent with, say, the Notes app, except that Notes warns you it will remove your notes if you disable iCloud). When I re-enabled iCloud, the data reappeared. (All done through the Settings app.)
However, in scenarios where NSPersistentCloudKitContainer cannot immediately sync -- say, due to rate throttling -- disabling iCloud in Settings wipes the local data store and ultimately results in data loss.
This occurs even if the changes to the managed objects are saved (to the local store) -- they simply aren't synced in time.
It can be a little hard to reproduce, especially since exiting to the Home Screen generally triggers a sync. To avoid this, I swiped up to the app switcher and immediately closed my app; then you can disable iCloud and run the app again (attaching a debugger helps). I once saw a message along the lines of "export failed" for my unsynced record, and unfortunately it was deleted and never synced.
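For anyone trying to reproduce this, here is a sketch of how one can watch whether the export actually completed before toggling iCloud off (using the standard sync event notification; the function name is mine):

import CoreData

// Observe sync events so you can tell whether a save was actually
// exported to CloudKit before disabling iCloud in Settings.
func observeSyncEvents(for container: NSPersistentCloudKitContainer) {
    NotificationCenter.default.addObserver(
        forName: NSPersistentCloudKitContainer.eventChangedNotification,
        object: container,
        queue: .main
    ) { note in
        guard let event = note.userInfo?[NSPersistentCloudKitContainer.eventNotificationUserInfoKey]
                as? NSPersistentCloudKitContainer.Event else { return }
        // endDate == nil means the event is still in flight.
        if event.type == .export, event.endDate != nil {
            print("Export finished. succeeded: \(event.succeeded), error: \(String(describing: event.error))")
        }
    }
}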
Perhaps before NSPersistentCloudKitContainer wipes the local store it ought to force sync with the cloud first?
I have a single multiplatform application that I use NSPersistentCloudKitContainer on.
This works great, except I noticed when I open two instances of the same process (not windows) on the same computer, which share the same store, data duplication and "Metadata Inconsistency" errors start appearing.
This answer (https://stackoverflow.com/a/67243833) says this is not supported with NSPersistentCloudKitContainer.
Is this indeed true?
If it isn't allowed, is the only solution to disable multiple instances of the process via a lock file? I was thinking one could somehow coordinate a single "leader" process that syncs to the cloud, with the others using NSPersistentContainer, but this would be complicated when the "leader" process terminates.
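For reference, a minimal sketch of the lock-file idea (an advisory flock; the path and names are just examples):

import Darwin
import Foundation

// Try to become the single instance that owns the store. The lock is
// advisory and is released automatically when the process exits.
func acquireSingleInstanceLock(at url: URL) -> Bool {
    let fd = open(url.path, O_CREAT | O_RDWR, 0o644)
    guard fd >= 0 else { return false }
    // Exclusive + non-blocking: fails immediately if another process holds it.
    guard flock(fd, LOCK_EX | LOCK_NB) == 0 else {
        close(fd)
        return false
    }
    return true   // intentionally keep fd open for the lifetime of the process
}

// Usage: fall back to a plain NSPersistentContainer (or bail out)
// if another instance is already syncing the store.
let lockURL = FileManager.default.temporaryDirectory
    .appendingPathComponent("mystore.lock")   // example path
if !acquireSingleInstanceLock(at: lockURL) {
    print("Another instance already owns the store.")
}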
Currently, it seems iPad split views are new windows, not processes -- but overall I'm still curious :0
Thank you!
What OS will a Swift Student Challenge submission run on? I want to use iOS 26 features, but the version history for Swift Playgrounds doesn't show it being updated past the iOS 17.5 SDK. So, can I still use features from the iOS 26 SDK?
I'm reading the "Testing Age Assurance in Sandbox" doc, but I couldn't figure out the step:
2. Tap Sandbox Testing from the main menu
Where is the "main menu"?
We are integrating Apple’s DeclaredAgeRange SDK. To comply with relevant regulatory requirements, our understanding is as follows:
The app is only required to obtain the declared age range for users located in Texas.
For users outside of Texas, we should not proactively request age range information.
Accordingly, we would like to confirm the following:
Are we required to present the age range request prompt to all users in the United States?
If yes, we are concerned that this may significantly impact the overall user experience.
If it is permissible to request age range only for Texas users, how can we reliably determine whether a user is located in Texas on the client side?
For example, does Apple provide an API or recommended method for accurately identifying a user’s region (specifically Texas)?
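For what it's worth, the closest client-side approximation we have considered is reverse geocoding, though it requires location permission and we don't know whether Apple considers it acceptable for this purpose. A sketch:

import CoreLocation

// Approximate the user's US state via reverse geocoding. NOT an
// authoritative or Apple-documented basis for compliance decisions.
func isInTexas(_ location: CLLocation, completion: @escaping (Bool) -> Void) {
    CLGeocoder().reverseGeocodeLocation(location) { placemarks, _ in
        // administrativeArea is typically the state, e.g. "TX" or "Texas".
        let state = placemarks?.first?.administrativeArea
        completion(state == "TX" || state == "Texas")
    }
}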
Happy new year to all!
I have created an iOS app that also runs on Apple Vision Pro.
On iOS, when you activate the fileImporter modal, you can swipe it down to dismiss.
However, in visionOS, this same modal CANNOT be swiped down to cancel/dismiss. If you are drilled deep into a file hierarchy, you have to navigate back to the top level to tap the X to dismiss.
Is there a way to add swipe down to the visionOS implementation of fileImporter, or any other workaround so the user doesn't have to navigate back to the top to dismiss?
Again, this is not a visionOS app but an iOS app compatible for use in Vision Pro.
Thanks!
Hi everyone,
I subscribed to the Apple Developer Program on Tuesday evening, November 4th, 2025. The payment has already been charged to my bank account, but my account still shows the status “Pending” with the message “Subscribe your membership”.
It’s now been several days, and I haven’t received any confirmation email or any request for additional information.
I already contacted Apple Support by email, but I’d like to know if other developers have experienced the same situation and how long it took before their account was activated.
Thanks in advance for your help and feedback!
— Martin
Hello,
I subscribed to the Apple Developer Program and the annual fee was successfully charged from my card, however my membership status is still showing as Pending and has not been activated yet.
I have already contacted Apple Developer Support via email, but I haven’t received any response so far.
Topic:
Developer Tools & Services
SubTopic:
General
Hi, I am developing an iOS (and Android) app with React Native.
I am very confused about CocoaPods and pod, and how to correctly install them on my new MacBook Pro M4. I am not using bash; I am using zsh. Note: which pod actually returns nothing.
While preparing my environment, the documentation says:
CocoaPods is one of the dependency management system available for iOS. CocoaPods is a Ruby gem. You can install CocoaPods using the version of Ruby that ships with the latest version of macOS.
The web site shows two commands:
gem install cocoapods
sudo gem install cocoapods
I also saw another command:
brew install cocoapods
During various attempts, I ran into the following error several times:
Command pod install failed.
└─ Cause: pod install --repo-update --ansi exited with non-zero code: 1
So I am confused about CocoaPods and pod. Are they the same thing?
With my previous MacBook Pro, I spent time installing CocoaPods under my user profile because the system Ruby was not the latest version. But apparently, on my new MacBook Pro M4, the command ruby -v still returns:
ruby 2.6.10p210 (2022-04-12 revision 67958) [universal.arm64e-darwin25]
The current stable version is 4.0.0.
I bought a new MacBook Pro M4 and reinstalled Node and all the packages for React Native 0.81 and Expo 54, except CocoaPods. Now I need to configure push notifications, and it's time to install CocoaPods, as it's required here.
But on my new MacBook Pro I would like to make sure I do it correctly, so I kindly ask for your help and recommendations on installing Ruby and CocoaPods/pod.
Q1: Should I install CocoaPods with brew install cocoapods or gem install cocoapods?
Q2: What is the difference, or the common point, between cocoapods and pod?
The CocoaPods web site says:
If using the default Ruby included with macOS, installation will require you to use sudo when installing gems.
As ruby -v prints 'ruby 2.6.10p210', I suppose I should not install CocoaPods with sudo. It also says:
You can use a Ruby version manager such as RVM or rbenv to manage multiple Ruby versions, or you can use Homebrew to install a newer Ruby with brew install ruby.
As far as I understand, I should not install CocoaPods with the system Ruby.
Q3: Will brew install cocoapods install the latest version in my profile? Will it upgrade the system version?
Q4: What will these commands do:
brew install rbenv ruby-build
rbenv install 3.2.2 (or better: rbenv install 4.0.0)
in comparison with:
brew install ruby
My guess
I suppose the following will help, but it would be nice if you could correct me and clarify:
# All of this should be done in my user profile, not system-wide
brew install rbenv ruby-build
echo 'eval "$(rbenv init - zsh)"' >> ~/.zprofile   # zsh, since that is my shell
source ~/.zprofile
rbenv install 4.0.0
# rbenv global 4.0.0   # What does this do?
ruby -v
gem install cocoapods
pod --version   # check that the pod executable is now found
Q5: But then, what about pod and the error message below?
Command pod install failed.
As you can see, I am a bit confused, and I would appreciate your clarification.
Thank you for your help, and I wish you a happy new year!
Topic:
App & System Services
SubTopic:
Core OS
I have an app with a small but devoted following. It has not been updated since 2022 and has been working very well. On iOS 26 it crashes almost immediately at startup.
After hooking up to Xcode and running I get this message in the console:
objc[64686]: Class PSSegment is implemented in both /System/Library/PrivateFrameworks/PolarisGraph.framework/PolarisGraph (0x291ed9f78) and /private/var/containers/Bundle/Application/08486FCF-548A-467C-8BA3-D722734463FC/HikeTracker.app/HikeTracker.debug.dylib (0x101d309e8). This may cause spurious casting failures and mysterious crashes. One of the duplicates must be removed or renamed.
PSSegment is the name of an entity in my Core Data managed object model. If I refactor it to P_Segment the app starts.
PolarisGraph means nothing to me.
The "PS" stands for Persistent Store, but in this case it seems that PolarisGraph is PSing in my sandbox. How can this happen?
I'll attach the longer message that comes with the crash.
error messages.txt
Topic:
Community
SubTopic:
Apple Developers
Apologies if this is not the correct topic to post under.
EpochField 5.2 is our application, a .NET MAUI app built against Xcode 16. A customer of ours uses another app, TN3270, to connect to a mainframe host. After installing our app on an iPad and restarting the device, the TN3270 app disconnects when suspended. Uninstalling our app (EpochField) allows TN3270 to suspend without disconnecting. We have tried removing background services, setting UIRequiresFullScreen to false or removing it entirely, and several other ideas. The only remedy seems to be uninstalling EpochField.
On an iPad device:
1. Install MochaSoft’s TN3270 app (the free version is fine). Create a connection to ssl3270.nccourts.org, port 2023, with SSL/TLS turned on and keep-alive turned on.
2. Verify that you can connect. Suspend the app by swiping up or choosing another app. Go back to TN3270 and verify that the app has not disconnected.
3. Install EpochField 5.2. Do not run or configure the app; just install it.
4. Repeat step 2.
5. Restart the device.
6. Open EpochField 5.2. You do not need to configure the app or log in. Sometimes it isn't necessary to ever open EpochField to get the disconnects, but this is the most reliable way to reproduce the situation.
7. Repeat step 2. The TN3270 app will now disconnect when suspended, even if EpochField is closed. You may need to wait a few seconds after suspending.
8. Uninstall EpochField 5.2.
9. Repeat step 2: the TN3270 app will now remain connected when suspended.
Topic:
App & System Services
SubTopic:
Networking
I am creating an Augmented Reality iOS (Not VisionOS) app using scenes created in Reality Composer Pro.
I'd like my code to send a notification to a RCP scene that plays a timeline. The RCP interface has the option to set up a behaviour for this purpose:
This Forum thread https://developer.apple.com/forums/thread/756978 suggests the code I need for sending a notification is:
NotificationCenter.default.post(
    name: NSNotification.Name("RealityKit.NotificationTrigger"),
    object: nil,
    userInfo: [
        "RealityKit.NotificationTrigger.Scene": scene,
        "RealityKit.NotificationTrigger.Identifier": "HideCharacter"
    ]
)
but the 'scene' var needs to point to the relevant RCP scene, which is loaded within a UIViewRepresentable ARView (because even in iOS 26 it seems RealityKit's RealityView isn't quite ready for AR use), and I can't work out how to correctly access it. The examples in the link above are for RealityKit on visionOS only.
Code for loading the scene is as follows. How can I get the notification code above to be situated in a separate SwiftUI View and send the notification to the RCP scene?
import ARKit
import RealityKit
import RealityKitContent
import SwiftUI

struct ARViewContainer: UIViewRepresentable {   // (the container declaration was omitted from my paste)
    typealias UIViewType = ARView

    func makeUIView(context: Context) -> ARView {
        // Create an ARView
        let arView = ARView(frame: .zero)

        // Configure it for horizontal plane detection
        let arConfiguration = ARWorldTrackingConfiguration()
        arConfiguration.planeDetection = [.horizontal]
        arView.session.run(arConfiguration)

        // Load in the Reality Composer Pro scene
        let scene = try! Entity.load(named: "myScene", in: realityKitContentBundle)

        // Create a horizontal plane anchor and append the scene to it
        let anchor = AnchorEntity(.plane(.horizontal, classification: .any, minimumBounds: SIMD2<Float>(0.2, 0.2)))
        anchor.children.append(scene)

        // Append the anchor to the ARView
        arView.scene.anchors.append(anchor)

        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {
    }
}
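For completeness, here is the direction I've been exploring: hand the loaded entity out through a shared holder so a separate SwiftUI view can post the trigger. All names here are mine, and I'm assuming (per the linked thread) that the "Scene" userInfo value wants the entity's RealityKit scene:

import RealityKit
import SwiftUI

// Minimal shared holder so views outside the UIViewRepresentable can
// reach the entity loaded in makeUIView.
final class RCPSceneHolder: ObservableObject {
    static let shared = RCPSceneHolder()
    var sceneEntity: Entity?
}

// In makeUIView, after Entity.load(...):
//     RCPSceneHolder.shared.sceneEntity = scene

// From any SwiftUI view, fire the RCP behavior's notification trigger.
func postHideCharacter() {
    guard let rkScene = RCPSceneHolder.shared.sceneEntity?.scene else { return }
    NotificationCenter.default.post(
        name: NSNotification.Name("RealityKit.NotificationTrigger"),
        object: nil,
        userInfo: [
            "RealityKit.NotificationTrigger.Scene": rkScene,
            "RealityKit.NotificationTrigger.Identifier": "HideCharacter"
        ]
    )
}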
What am I missing here? 😐
Prepare all required actions
Run ./.github/actions/verify-copyright
Run echo "📋 Checking copyright file..."
📋 Checking copyright file...
✅ Copyright file exists at: fastlane/metadata/en-US/copyright.txt
📝 Copyright value: © 2025 Enterprise Support
✅ Copyright format valid: © 2025 Enterprise Support
✅ Copyright metadata validation passed
Topic:
App Store Distribution & Marketing
SubTopic:
App Store Connect API
I have a 2019 Ford Edge with Sync 3.4 and wired CarPlay. It worked fine with an iPhone 16 Pro on iOS 18. I upgraded to an iPhone 17 Pro, which came with iOS 26, and CarPlay hasn't worked since.
I've kept trying throughout the new iOS 26 releases, most recently with iOS 26.3 Public Beta 1; still not working.
I have a long-running Feedback item with updates and system diagnostics from my attempts over the last few months: FB20739050
There is also an Apple Support Communities thread with issues like this (and a ton of others); my first post there was https://discussions.apple.com/thread/256138283?answerId=261613103022&sortBy=oldest_first#261613103022
I'm hoping someone here in the developer forums can take a look at the feedback item and the various system diagnostics to pinpoint the issue. I'm a little concerned it still isn't fixed this far into the iOS 26 follow-up point releases.
Appreciate any help, thanks!
--Chuck
I have filed 27 bug reports against macOS using Feedback Assistant and none have had any sort of follow-up.
I'm wondering if anyone at Apple looks at these reports?
A few of these involve accessibility features that are not working (for example: speak announcements doesn't work -- and yes, my feedback is much more detailed than "doesn't work"). I would have thought issues with accessibility would be a high priority for Apple to fix quickly.
Another report is that Grapher uses black text on dark gray making it very difficult to see the formula you enter. This should be a one-line code fix.
These two, and the 25 other bug reports, have seemingly been ignored. Several of them indicate more than 10 similar reports, yet the bugs go unfixed -- which leads me back to my opening question: does anyone at Apple read these reports?
I'd like to think I'm helping Apple to deliver a more perfect product but I feel like I'm wasting my time writing detailed bug reports. Is there another way to bring these to Apple's attention?
Topic:
Community
SubTopic:
Apple Developers
Hello everyone,
My app has been rejected under guideline 4.3(a) – Design – Spam.
It is a padel sports-tracking app with:
– Bluetooth sensor connection
– live session tracking
– AI-generated coaching feedback
– community feed and notifications
– video analysis features
The review states that my app “shares a similar binary, metadata and/or concept as other apps with only minor differences”.
However, this app is fully custom developed, not generated from any template, and not a reskinned clone. It is part of a larger project including custom hardware sensors and backend.
I would like to better understand what exactly triggered the rejection:
– Is it the design?
– A specific feature?
– The metadata / screenshots?
– Or similarity detection with another app name or concept?
My goal is to comply fully with Apple guidelines and improve the app if needed, but I need more concrete direction to do so.
Any guidance from Apple engineers or other developers who have faced 4.3(a) would be appreciated.
Thank you in advance 🙏
Topic:
App Store Distribution & Marketing
SubTopic:
App Review