After replacing macOS 11.0 (Big Sur) with the latest 11.5, my app's AXObserverAddNotification method fails. Here is sample code I tested from Stack Overflow: https://stackoverflow.com/questions/853833/how-can-my-app-detect-a-change-to-another-apps-window
AXUIElementRef app = AXUIElementCreateApplication(82695); // the pid for the front-running Xcode 12.5.1
CFTypeRef frontWindow = NULL;
AXError err = AXUIElementCopyAttributeValue(app, kAXFocusedWindowAttribute, &frontWindow);
if (err != kAXErrorSuccess) {
    NSLog(@"failed with error: %i", err);
}
NSLog(@"app: %@ frontWindow: %@", app, frontWindow);
The frontWindow reference is never created and I get error -25204. It seems like the latest Big Sur 11.5 has revised the Accessibility API, or perhaps there is some permission switch I am unaware of that would make things work. What am I doing wrong?
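For what it's worth, -25204 corresponds to kAXErrorCannotComplete. A minimal first check, sketched in Swift under the assumption that this is a permissions problem, is whether the process is trusted for Accessibility at all:
import ApplicationServices
// Prompt for (or confirm) the Accessibility permission before calling
// AXUIElementCopyAttributeValue; an untrusted process can fail this way.
let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
let trusted = AXIsProcessTrustedWithOptions(options)
print("Accessibility trusted: \(trusted)")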
I've noticed that VoiceOver reads currency amounts correctly when they are below a thousand.
Then, for higher amounts, for example 12.225,34 €, VoiceOver reads 'twelve point two two five thirty four euros'.
If the amount is formatted without the thousand separator (12225,34 €), this problem doesn't exist (VoiceOver reads 'twelve thousand two hundred and twenty five euros and thirty four cents').
Why is the thousand separator a problem for VoiceOver if this formatting is coming from the currency and locale?
This issue exists in English. I changed my device language to Italian and German and in both cases the number was read correctly even with the separator.
Is there a way to make it work in English?
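A workaround I have been considering (just a sketch, not a confirmed fix; the explicit accessibility label is my own idea) is to hand VoiceOver a pre-formatted spoken string instead of the visual one:
import UIKit
// Give the label an explicit spoken accessibility string so VoiceOver does
// not have to parse the grouping separator in the visual text itself.
let amount = NSNumber(value: 12225.34)
let visual = NumberFormatter.localizedString(from: amount, number: .currency)
let spoken = NumberFormatter.localizedString(from: amount, number: .currencyPlural)
let label = UILabel()
label.text = visual                 // e.g. "12.225,34 €", depending on locale
label.accessibilityLabel = spoken   // spelled-out form, read as words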
I'm currently testing the announce notifications feature and I can't seem to find out how to make Siri read aloud the current currency instead of dollars.
My locale is es-CL (Chile). It uses the currency symbol $ and reads as Pesos locally or Chilean Pesos where the number 5000.1 is represented as 5.000,1
This is the notification content
let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por $5.000!"
Siri reads it aloud as "¡Has recibido un pago por 5.000 Dolares!" which translates to "You have received a payment for 5,000 Dollars", instead of the expected "¡Has recibido un pago por 5.000 Pesos!" -> "You have received a payment for 5,000 Pesos"
I've tried changing the development region of the app, interpolating the string with NumberFormatter.localizedString(from: 5000, number: .currency), and with other styles (.currencyAccounting, .currencyISOCode and .currencyPlural), without good results. The last one almost works, but it's not ideal: it outputs "5.000 pesos chilenos", which gets read as "5 pesos chilenos", which is not the correct amount (bug); it's as if you're not in Chile, and I personally prefer a symbol instead of words.
I'm testing on my device, which is set up with the region "Chile".
Could someone help me find a solution?
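For completeness, this is roughly how I am building the body (the formatting call shown is one of the attempts described above, not a working solution):
import Foundation
import UserNotifications
// Format the amount explicitly and interpolate it into the notification body.
// Siri still announces the amount as "Dolares" instead of "Pesos".
let amount = NumberFormatter.localizedString(from: 5000, number: .currency)
let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por \(amount)!"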
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
Localization
User Notifications
Siri and Voice
We currently have an odd issue with VoiceOver spelling a word letter by letter while the same word is spoken as a whole for other items.
The app is in German.
I have a View in SwiftUI whose button traits are removed, and then a label "Start Tab 1 von 5" is added. "Tab" is spoken as a whole word here; all fine.
If I change the label to "Tab-Schaltfläche" or, for example, "SimplyGo Tab 3 von 5", then "Tab" is spoken as "T A B", letter by letter. Is there a way to force VoiceOver to speak it as a whole word?
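The only workaround I can think of so far is a sketch like the following (an assumption, not a confirmed fix): override the spoken text with an explicit accessibility label, e.g. with "tab" lowercased, since how VoiceOver tokenizes the word seems to depend on casing, context, and voice, and this would need to be verified.
import SwiftUI
// Explicit accessibility label so VoiceOver speaks a controlled string
// instead of the visual label it currently spells out letter by letter.
struct TabLabelView: View {
    var body: some View {
        Text("SimplyGo Tab 3 von 5")
            .accessibilityLabel("SimplyGo tab 3 von 5")
    }
}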
Topic:
Accessibility & Inclusion
SubTopic:
General
A lot of apps use undocumented App-prefs URLs to help users get to the iOS Settings screen needed to set up the app. In iOS 18, it seems like these all stopped working.
Here are the ones I currently use:
App-prefs:MESSAGES - broken in iOS 18
Used for SMS Protection.
App-prefs:Phone - broken in iOS 18
Used for Live Voicemail, Silence Unknown Callers, and SMS Reporting.
A few paths have specific documented replacements. For example, for Call Blocking & Identification you can use CXCallDirectoryManager.sharedInstance.openSettings(), and this still works in iOS 18 (see the snippet below). But I don't see direct replacements for the others.
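For reference, that documented replacement looks like this (the completion handling here is just a sketch):
import CallKit
// Opens the Call Blocking & Identification screen in Settings; this is the
// documented replacement mentioned above and still works on iOS 18.
CXCallDirectoryManager.sharedInstance.openSettings { error in
    if let error {
        print("Could not open Call Blocking & Identification settings: \(error)")
    }
}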
Apple probably doesn't consider this a bug but I filed FB14378568 anyway.
I consider this an accessibility issue because many older, inexperienced, or users with disabilities have trouble finding the right Settings screen based on a textual description alone.
All of the Sports widgets in iOS 18 have disappeared on older models; people relied on these for sports updates! Not even the new Apple Sports app has a widget...
Topic:
Accessibility & Inclusion
SubTopic:
General
accessibilityUserInputLabels is working fine with any view I have tried it on, meaning the control can be toggled with the provided alternative names when using Voice Control.
When setting this property on any UIBarButtonItem, though, Voice Control seems to ignore the alternative names provided via accessibilityUserInputLabels. For comparison, accessibilityLabel works perfectly when set on a UIBarButtonItem.
Is anyone facing the same issue?
Using Xcode 16.0 (16A242) on iOS 18
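A minimal repro of what I am seeing (the bar button title and labels here are made up):
import UIKit
// The accessibilityLabel is honored by Voice Control, but the alternative
// input labels appear to be ignored when set on a UIBarButtonItem.
let item = UIBarButtonItem(title: "Share", style: .plain, target: nil, action: nil)
item.accessibilityLabel = "Share"
item.accessibilityUserInputLabels = ["Send", "Export"]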
I am working on capturing 48MP images using the iPhone 16 Pro Max with the Ultra-wide camera. I’ve updated the code to capture the maximum supported dimensions with the following snippet:
if #available(iOS 16.0, *) {
    photoOutput.maxPhotoDimensions = device.activeFormat.supportedMaxPhotoDimensions.last!
    photoSettings.maxPhotoDimensions = .init(width: 5712, height: 4284)
}
However, I’m still not getting the expected results. My goal is to capture 48MP images, and I want to confirm if the Ultra-wide camera supports this resolution or if I’m missing any other configuration.
Any guidance would be appreciated!
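One diagnostic I would try first (just a sketch; the device lookup and loop are my assumptions) is to print every photo dimension the Ultra Wide camera actually advertises, to confirm whether a 48MP size is offered at all:
import AVFoundation
// List the max photo dimensions of every Ultra Wide format; if no ~8064x6048
// entry appears, that camera simply does not offer a 48MP output.
if let device = AVCaptureDevice.default(.builtInUltraWideCamera, for: .video, position: .back) {
    for format in device.formats {
        for dims in format.supportedMaxPhotoDimensions {
            print("Supported photo dimensions: \(dims.width) x \(dims.height)")
        }
    }
}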
I downloaded iOS 18.2 and Siri returned to the old iOS 17 Siri. It won't respond; I have to click the button many times before it responds, and when it does, it goes away. Also, Genmoji isn't downloading.
I have an iPhone 15 Pro Max.
Topic:
Accessibility & Inclusion
SubTopic:
General
Hello community,
We're designing an app that can optionally be controlled by a stylus with a mesh tip. In this case, the mesh tip we're using is 5 mm in diameter. It seems that mesh-tip contact detection is unstable at this size, although it works better with a larger diameter.
Is it possible to access a setting in iOS that lets you define the minimum contact area needed to detect a contact on the screen? This would enable us to use this 5 mm stylus.
Best regards,
Edwin
Topic:
Accessibility & Inclusion
SubTopic:
General
If you are on the TikTok website in Safari, you can view the video that originally brought you to the website from the search results. But if you click on another video listed on the website, it claims you need to use the app to go further, and upon proceeding it just brings you to the App Store, regardless of whether you already have the app installed. You are unable to view the video displayed on the website without searching for it separately in the app.
Topic:
Accessibility & Inclusion
SubTopic:
General
When my app is in the background, I create a Live Activity through a push notification with a token obtained from pushToStartTokenUpdates, and this process works fine. However, without opening the app, how can I retrieve the new push token for this Live Activity and use it for subsequent updates to the Live Activity content?
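For context, this is roughly how I expected to pick up the per-activity update token (the attributes type here is hypothetical; whether this code gets a chance to run while the app stays in the background is exactly my question):
import ActivityKit
// Hypothetical attributes type, only for illustration.
struct DeliveryAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable { var progress: Double }
}
func observeUpdateTokens() {
    Task {
        // Each Live Activity started via push shows up in activityUpdates; its
        // pushTokenUpdates sequence then yields the token to send to the server.
        for await activity in Activity<DeliveryAttributes>.activityUpdates {
            Task {
                for await tokenData in activity.pushTokenUpdates {
                    let token = tokenData.map { String(format: "%02x", $0) }.joined()
                    print("Update token for \(activity.id): \(token)")
                }
            }
        }
    }
}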
When my USB interface is recording, the sync does not work well. I found that in_io_buffer_frame_size is the same for every IO cycle; it is not synced to UpdateCurrentZeroTimestamp. So is the AudioDriverKit read operation handled differently from the write? What is the correct way to sync with the USB input data?
If I only play audio with UpdateCurrentZeroTimestamp, it works fine.
Thanks!
macOS 15 includes a neat section in System Settings to change the dynamic text size, as outlined here: https://support.apple.com/guide/mac-help/make-text-and-icons-bigger-mchld786f2cd/mac
However, it's not immediately clear a) how to get one's app in this list, and b) if the usual methods from iOS to react to text size even work on macOS. Does anyone have any experience here? Or should I implement my own controls in my app's settings and call it a day?
For context, my app is a macOS-native SwiftUI app.
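For reference, the iOS-style approach I would otherwise reach for looks like the sketch below; whether macOS 15's per-app text size setting actually drives this environment value, and what gets an app listed in that Settings pane, is exactly what I'm asking.
import SwiftUI
// Standard SwiftUI Dynamic Type handling: read the environment value and use
// built-in text styles so the layout follows the user's text size choice.
struct ContentView: View {
    @Environment(\.dynamicTypeSize) private var dynamicTypeSize
    var body: some View {
        Text("Current dynamic type size: \(String(describing: dynamicTypeSize))")
            .font(.body)
    }
}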
I am currently developing a macOS application that listens for system-wide mouse clicks to simulate typing with user-provided text. The app requires Accessibility permissions to function properly, and I want to ensure compliance with Apple’s latest privacy and security guidelines.
The app listens to global mouse clicks.
It simulates keyboard input with user-provided text. (A rough sketch of both pieces follows the list of questions below.)
I would like detailed guidance on the following aspects:
What specific entitlements are required to allow system-wide mouse-click monitoring and simulated user input?
Should the App Sandbox be enabled or disabled?
Which Info.plist keys are required to explain global mouse-click monitoring and keyboard-input simulation?
What should the Privacy Manifest configuration be?
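For context, a minimal sketch of the two pieces mentioned above, using public CGEvent APIs; the entitlement, sandbox, and permission requirements around them are exactly what I am asking about:
import CoreGraphics
// 1. Global mouse-click monitoring via a listen-only CGEvent tap.
//    This requires the user to grant Accessibility and/or Input Monitoring access.
let mask = CGEventMask(1 << CGEventType.leftMouseDown.rawValue)
let tap = CGEvent.tapCreate(tap: .cgSessionEventTap,
                            place: .headInsertEventTap,
                            options: .listenOnly,
                            eventsOfInterest: mask,
                            callback: { _, _, event, _ in
                                print("Mouse click at \(event.location)")
                                return Unmanaged.passUnretained(event)
                            },
                            userInfo: nil)
print(tap == nil ? "Event tap creation failed (permission missing?)" : "Event tap created")
// 2. Simulated keyboard input (virtual key 0 is 'a' on a US layout).
let keyDown = CGEvent(keyboardEventSource: nil, virtualKey: 0, keyDown: true)
keyDown?.post(tap: .cghidEventTap)
let keyUp = CGEvent(keyboardEventSource: nil, virtualKey: 0, keyDown: false)
keyUp?.post(tap: .cghidEventTap)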
Topic:
Accessibility & Inclusion
SubTopic:
General
Tags:
Frameworks
Entitlements
Privacy
Accessibility
Hi,
I have an iOS app where a Bluetooth scanner and a Bluetooth keyboard both need to work simultaneously.
I have used the 'pressesBegan' method to fetch the characters coming from the Bluetooth keyboard. This method fetches all the Bluetooth keyboard input correctly, but it is also called when characters come from the Bluetooth scanner.
In 'pressesBegan' I have to separate the input coming from the Bluetooth keyboard from the input coming from the Bluetooth scanner; they have different uses in the app. I have already tried distinguishing them by how fast the characters arrive, but that fails with high-speed typing, so character timing will not work in our case.
Is there any way to separate the inputs based on some other factor?
Or is there any other information available in 'pressesBegan' that can tell whether the input is coming from the scanner or the keyboard?
Any suggestion regarding this will be helpful.
Thanks in advance.
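For reference, this is the shape of the handler I am working with (a minimal sketch; as far as I can tell the UIPress/UIKey objects do not expose which physical device produced the key):
import UIKit
class ScannerAwareViewController: UIViewController {
    override func pressesBegan(_ presses: Set<UIPress>, with event: UIPressesEvent?) {
        for press in presses {
            if let key = press.key {
                // Both scanner and keyboard input arrive here, with no obvious
                // per-device identifier on UIPress or UIKey.
                print("Key input: \(key.characters)")
            }
        }
        super.pressesBegan(presses, with: event)
    }
}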
Topic:
Accessibility & Inclusion
SubTopic:
General
Hey there! Hope you are starting the year with great joy.
My situation
I'm building a new product that is based on detecting certain text on screen in real time. The product targets only the Mac and is built with Swift.
My problem
I need to get the exact position of a text element with the Apple Accessibility API, but I can't figure it out. I managed to get the AXUIElement where the text is placed, but its position is too broad and off target.
My discoveries so far
I've tried OCR, but it is too slow for what I'm building, so the only possible way I can think of is the Accessibility API (a sketch of what I'm attempting is below).
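To make the question concrete, this is the kind of call I'm trying to get working (a sketch; the element and character range come from elsewhere in my code):
import ApplicationServices
// For text elements, the parameterized attribute kAXBoundsForRangeParameterizedAttribute
// can return the on-screen bounds of a specific character range, which should be more
// precise than the element's own kAXPositionAttribute/kAXSizeAttribute.
func boundsOfRange(_ range: CFRange, in element: AXUIElement) -> CGRect? {
    var rangeCopy = range
    guard let rangeValue = AXValueCreate(.cfRange, &rangeCopy) else { return nil }
    var result: CFTypeRef?
    let err = AXUIElementCopyParameterizedAttributeValue(
        element,
        kAXBoundsForRangeParameterizedAttribute as CFString,
        rangeValue,
        &result)
    guard err == .success, let value = result else { return nil }
    var rect = CGRect.zero
    guard AXValueGetValue(value as! AXValue, .cgRect, &rect) else { return nil }
    return rect
}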
Thank you in advance.
I recently bought a new iPhone 16 and connected it to my car (Hyundai Venue), but I couldn't see WhatsApp. I researched and found some forum threads, but the suggested steps don't work or don't apply to the latest iOS version.
I have updated both iOS and WhatsApp; nothing has helped.
Note: earlier I used a Pixel phone, and I was able to see WhatsApp and make calls.
The permission usage description (NSLocalNetworkUsageDescription) is configured in Info.plist, and the app's Local Network permission is turned on, but when the app calls an API on a LAN device it still returns this error:
Error Domain=NSURLErrorDomain Code=-1009 "The Internet connection appears to be offline." UserInfo={_kCFStreamErrorCodeKey=50, NSUnderlyingError=0x302d79d40 {Error Domain=kCFErrorDomainCFNetwork Code=-1009 "(null)" UserInfo={_NSURLErrorNWPathKey=unsatisfied (Local network prohibited), interface: en0[802.11], ipv4, uses wifi, _kCFStreamErrorCodeKey=50, _kCFStreamErrorDomainKey=1}}, _NSURLErrorFailingURLSessionTaskErrorKey=LocalDataTask .<1>, _NSURLErrorRelatedURLSessionTaskErrorKey=(
"LocalDataTask .<1>"
), NSLocalizedDescription=The Internet connection appears to be offline., NSErrorFailingURLStringKey=http://192.168.1.1:80/xxx, NSErrorFailingURLKey=http://192.168.1.1:80/xxx, _kCFStreamErrorDomainKey=1}
Topic:
Accessibility & Inclusion
SubTopic:
General
Hi All,
I am developing a braille keyboard for iOS. When I test the typing function in Notes, the screen updates very slowly after typing; we expect the response to be an instant change. I am not sure how to set up VoiceOver for this.
Is there any documentation on this issue?
Or is there any guideline for this?