Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under the Accessibility & Inclusion topic. Each entry below shows its replies, boosts, views, and latest activity.

Getting precise text position with Swift for macOS
Hey there! Hope you are starting the year with great joy. My situation: I'm building a new product based on detecting certain text on screen in real time. The product targets only the Mac and is built with Swift. My problem: I need to get the exact position of a text element with the Apple Accessibility API, but I can't figure it out. I managed to get the AXUIElement where the text is placed, but its position is too broad and off target. My discoveries so far: I've tried OCR, but it is too slow for what I'm building, so the only way I can think of is the Accessibility API. Thank you in advance.
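A hedged sketch of one way to get tighter bounds than the element's overall frame: many text elements answer the parameterized attribute kAXBoundsForRangeParameterizedAttribute, which returns the on-screen rectangle of a specific character range. The helper name is mine, and the sketch assumes the target element actually implements this attribute (not all apps do):

    import ApplicationServices

    // Sketch: ask a text element for the screen bounds of a character range.
    // Requires the process to be trusted for Accessibility.
    func boundsForRange(in element: AXUIElement, range: CFRange) -> CGRect? {
        var range = range
        guard let rangeValue = AXValueCreate(.cfRange, &range) else { return nil }
        var result: CFTypeRef?
        let err = AXUIElementCopyParameterizedAttributeValue(
            element,
            kAXBoundsForRangeParameterizedAttribute as CFString,
            rangeValue,
            &result)
        guard err == .success,
              let value = result, CFGetTypeID(value) == AXValueGetTypeID() else { return nil }
        var rect = CGRect.zero
        guard AXValueGetValue(value as! AXValue, .cgRect, &rect) else { return nil }
        return rect // screen coordinates
    }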
Replies: 2 · Boosts: 0 · Views: 570 · Last activity: Jan ’25
AssistiveTouch pointer cannot move past center of screen in landscape orientation on iPhone
Hello everyone, I’d like to report an issue I’ve encountered when using a Bluetooth mouse together with AssistiveTouch on iPhone running iOS 16.5. This has also been reported via Feedback Assistant with Feedback ID: FB17806167.

Description: When using a Bluetooth mouse together with AssistiveTouch on iPhone (iOS), the pointer behaves incorrectly in landscape orientation. Specifically: the pointer cannot move past the center of the screen; horizontal and vertical (X/Y) movements appear to be swapped or misaligned; natural movement of the pointer is not possible. It seems as if the internal coordinate mapping remains locked in portrait orientation, even when the device is physically rotated to landscape. This issue occurs system-wide, regardless of the current app. It is observable in Settings, on the Home Screen, and in third-party apps.

Steps to Reproduce: 1. Enable AssistiveTouch. 2. Connect a Bluetooth mouse to the iPhone. 3. Rotate the device to landscape orientation. 4. Try moving the mouse pointer across the screen. Notice that the pointer cannot move past the center, and horizontal/vertical input is interpreted incorrectly (as if still in portrait).

Expected Behavior: The mouse pointer should move across the entire screen correctly, regardless of device orientation.

Actual Behavior: In landscape orientation, the pointer is either restricted to part of the screen or misaligned. It behaves as if the device were still in portrait: horizontal mouse movement causes vertical pointer movement, and vice versa. The user experience feels broken and unintuitive.

Feature Suggestion: Please improve the synchronization between physical device orientation and AssistiveTouch pointer mapping on iOS. I also suggest exposing AssistiveTouch orientation control via a public API, so developers can help maintain consistent pointer behavior.

Thanks in advance for any insights or suggestions. Best regards, Jannis
Replies: 2 · Boosts: 0 · Views: 288 · Last activity: Jun ’25
PHPickerViewController No Auto Focus
The issue: I cannot automatically acquire Bluetooth keyboard focus in PHPickerViewController after enabling Full Keyboard Access on my iPhone 14 with iOS 18.3.1. The keyboard focus in PHPickerViewController does appear, but only after I tap on blank space in the PHPickerViewController. How can I make it receive focus in the first place? I'm using a UINavigationController and calling setNavigationBarHidden(true, animated: false). Then I use this controller to present a PHPickerViewController with the configuration below.

self.configuration = PHPickerConfiguration()
configuration.filter = .any(of: filters)
configuration.selectionLimit = selectionLimit
if #available(iOS 15.0, *), allowOrdering {
    configuration.selection = .ordered
}
configuration.preferredAssetRepresentationMode = .current

Finally I set the delegate on the PHPickerViewController and call UINavigationController.present(PHPickerViewController, animated: true) to render it. I also notice the animation showing in the first video and then disappearing.
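One workaround that might be worth trying, sketched under the assumption that the picker participates in the UIKit focus system (a guess, not a documented fix): explicitly request a focus update once the presentation completes.

    present(pickerViewController, animated: true) {
        // Point the focus system at the picker so Full Keyboard Access has a
        // starting element without the initial tap. pickerViewController is
        // the PHPickerViewController instance configured above.
        if let focusSystem = UIFocusSystem.focusSystem(for: pickerViewController) {
            focusSystem.requestFocusUpdate(to: pickerViewController)
            focusSystem.updateFocusIfNeeded()
        }
    }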
Replies: 2 · Boosts: 0 · Views: 297 · Last activity: Mar ’25
WhatsApp not showing in Apple CarPlay on iOS 18.x
I bought a new iPhone 16 recently and connected it to my car (Hyundai Venue), but I couldn't see WhatsApp in CarPlay. I researched and found some forum threads, but the suggested steps don't work or don't apply to the latest iOS version. I have updated both iOS and WhatsApp; nothing has resolved it. Note: earlier I used a Pixel phone, where I could see WhatsApp and make calls.
Replies: 2 · Boosts: 0 · Views: 561 · Last activity: Jan ’25
RTT call option and confirmation dialog missing when dialing emergency numbers
Hello, In our app we provide a button that initiates a phone call using tel://. For normal numbers, tapping the button presents the standard iOS confirmation sheet with Call and Cancel. If RTT is enabled on the device, the sheet instead shows three options: Call, Cancel, and RTT Call. However, when dialing a national emergency number, this confirmation dialog does not appear at all — the call is placed immediately, without giving the user the choice between voice or RTT. Is this the expected system behavior for emergency numbers on iOS? 
And if so, how does RTT get applied in the emergency-call flow — is it managed entirely by the OS rather than exposed as a user-facing option? Thanks in advance for clarifying.
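For context, a minimal sketch of the flow described above (the number here is a placeholder, not a real emergency number):

    import UIKit

    // Opening a tel:// URL normally presents the system sheet with Call and
    // Cancel, plus "RTT Call" when RTT is enabled. For emergency numbers the
    // system may place the call immediately, as observed above.
    func dial(_ number: String) {
        guard let url = URL(string: "tel://\(number)") else { return }
        UIApplication.shared.open(url)
    }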
Replies: 2 · Boosts: 0 · Views: 630 · Last activity: Sep ’25
Microphone Not Working When Running Unity Vision Pro App Normally
// Start listening to the microphone
public void StartListening()
{
    if (isListening) return;
    try
    {
#if UNITY_IOS || UNITY_TVOS
        microphoneInput = Microphone.Start(null, true, 10, 44100);
#else
        microphoneInput = Microphone.Start(null, true, 10, 16000); // use 16,000 Hz instead of 44,100
        if (microphoneInput == null)
        {
            microphoneInput = Microphone.Start(null, true, 10, AudioSettings.outputSampleRate);
        }
#endif
        isListening = true;
        Debug.Log(Microphone.devices.Length + " - Started listening...");
        debugText.text = Microphone.devices.Length + " - Started listening...";
    }
    catch (System.Exception e)
    {
        Debug.LogError($"Starting microphone failed: {e.Message}");
        debugText.text = $"Starting microphone failed: {e.Message}";
    }
}

void Update()
{
    if (isListening && microphoneInput != null)
    {
        // Analyze the audio for voice activity
        float volume = GetAverageVolume();
        if (volume > detectionThreshold)
        {
            Debug.Log("User is speaking!");
            lastVoiceTime = Time.time;
            SoundDetected = true;
        }
        else if (Time.time - lastVoiceTime > silenceDuration)
        {
            // Silence only fires once the volume has stayed below the
            // threshold for silenceDuration.
            Debug.Log("User is silent.");
            debugText.text = volume.ToString() + " - User is silent.";
        }
        slider.value = volume;
    }
}

private float GetAverageVolume()
{
    float[] samples = new float[128];
    // Read the most recent samples, just behind the write position.
    int pos = Mathf.Max(0, Microphone.GetPosition(null) - samples.Length);
    microphoneInput.GetData(samples, pos);
    float sum = 0f;
    foreach (float sample in samples)
    {
        sum += Mathf.Abs(sample);
    }
    return sum / samples.Length;
}

Problem: When I build and run the app from Xcode, the microphone works fine, and I receive input. However, when running the app normally (outside of Xcode), I can't seem to access the microphone; the debug logs indicate no microphone is detected.

Question: Is there any additional configuration needed for the microphone to work in a normal (non-Xcode) run on Vision Pro? Or any common issues that could be causing microphone access to fail in this scenario?

Thanks in advance for any insights! Best, Siddharth
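One common cause of exactly this symptom (microphone available when launched from Xcode, missing in a standalone launch) is a missing microphone usage description, which makes the system deny access silently. A sketch of the Info.plist entry to verify, assuming that is the issue here; in Unity it can be set via Player Settings > Other Settings > Microphone Usage Description:

    <key>NSMicrophoneUsageDescription</key>
    <string>This app uses the microphone to detect voice activity.</string>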
Replies: 2 · Boosts: 0 · Views: 456 · Last activity: Feb ’25
Best Way to Navigate to the Top Element Using VoiceOver
I’m currently focused on an element at the bottom of the screen. What is the proper way to quickly navigate to the top element? By default, there’s a four-finger single tap to move to the first element, but should I use the Rotor action instead to focus on the element I need? For example, in the Contacts app while adding a new contact, if I enter a value in a field at the bottom, there’s no quick way to directly save the contact. I have to manually navigate all the way to the top to tap the Done button, which feels a bit inconvenient. Is there a better way to handle this using VoiceOver?
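On the developer side, one way to remove that round trip is VoiceOver's Magic Tap (two-finger double-tap), which is intended to trigger a screen's primary action from anywhere. A minimal sketch; the class and saveContact() are hypothetical names, not the Contacts app's actual implementation:

    import UIKit

    class ContactEditorViewController: UIViewController {
        // VoiceOver invokes this on a two-finger double-tap anywhere on the
        // screen, regardless of which element currently has focus.
        override func accessibilityPerformMagicTap() -> Bool {
            saveContact() // hypothetical primary "Done" action
            return true
        }

        func saveContact() { /* persist the contact */ }
    }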
Replies: 2 · Boosts: 0 · Views: 350 · Last activity: Mar ’25
iOS 18 Crash
Our in-house (enterprise-distributed) app crashes on iOS 18.3 and later on some devices, while it works normally on other phones, and the certificate is not the problem. We have found three affected machines in total, with no pattern across models or OS versions. On a machine that crashes, the same app downloaded from the store doesn't crash; but if the same app is packaged and installed directly from the development tool, it crashes. Is this related to compatibility with the new version of iOS? Is there a solution? Have others run into this as well?
Replies: 2 · Boosts: 0 · Views: 126 · Last activity: Jun ’25
Wi-Fi connectivity issue: captive.apple.com returns “application/octet-stream” instead of “text/html”
In our system, when a user enables a mobile hotspot and the system connects to it, the system verifies Wi-Fi availability by sending an HTTP GET request to http://captive.apple.com. Normally, the server returns HTTP status 200 (OK) with Content-Type: text/html. This has always been used as a sign of normal connectivity. Issue: since last Friday, the server sometimes responds with Content-Type: application/octet-stream. When this occurs, our system determines that the network is unavailable and displays a connection warning (a “!” icon). Question: has Apple recently made any backend or CDN configuration changes to captive.apple.com that could affect the response type? Any advice on how we can solve this problem? Thanks!
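For reference, a minimal sketch of such a probe in Swift. Keying on the well-known "Success" body rather than only the Content-Type header is an assumption on my part that may make the check more tolerant of server-side changes; note that a plain-HTTP load also needs an App Transport Security exception:

    import Foundation

    func probeCaptivePortal(completion: @escaping (Bool) -> Void) {
        let url = URL(string: "http://captive.apple.com")!
        URLSession.shared.dataTask(with: url) { data, response, _ in
            // captive.apple.com normally returns a tiny HTML page whose body
            // contains "Success"; treat that as "network available".
            guard let http = response as? HTTPURLResponse, http.statusCode == 200,
                  let body = data.flatMap({ String(data: $0, encoding: .utf8) })
            else { return completion(false) }
            completion(body.contains("Success"))
        }.resume()
    }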
Replies: 2 · Boosts: 1 · Views: 952 · Last activity: Oct ’25
Solo Developer User Feedback Avenues
I have a couple of follow-up questions after the "Accessibility technologies group lab". It was briefly mentioned that user feedback is an excellent way to grow inclusivity in an app's design, and that these forums are one avenue for it. Is inviting folks here on the forum to test via TestFlight a reasonable approach for a solo developer? Are there other strategies, avenues, or examples for promoting user feedback?
Replies: 2 · Boosts: 0 · Views: 98 · Last activity: Jun ’25
Nearby Interaction with camera assistance
I have an app that uses Nearby Interaction with a custom accessory. It works great on iPhone 11–13; starting with iPhone 14, one must use ARKit to get angles. We have two problems:

ARKit is light-sensitive, and we do not control the lighting where this app runs. The iPhone 11–13 path works great even in the dark (our users are blind; this is an accessibility app).

ARKit wants to be in the foreground, but our users cannot see it, and we have a voice-oriented UI that provides navigation instructions. If ARKit is in the foreground, our app doesn't work.

With an iPhone 15 Pro Max on iOS 18, I got an error: access denied (not permission denied). Now that I am on iOS 26, the Bluetooth scan doesn't happen. It also fails the same way on an iPhone 17 on iOS 26; I can't callback now as release signing is no longer done. This same code works OK on iOS 17.1 on an iPhone 12. Info.plist here: info.txt

if SearchedServices == [] {
    services = [TransferService.serviceUUID, QorvoNIService.serviceUUID]
}
logger.info("scannerready, starting scan for peripherals \(services) and devices \(IDs)")
filteredIDs = IDs
scanning = true
centralManager.scanForPeripherals(withServices: services,
                                  options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])

The calling code:

dataChannel.autoConnect = autoConnect
dataChannel.start(x, ids) // dataChannel.start is shown above
self.scanning = true
return "scanning started"

Log output:

services from js = and devices= 5FE04CBB
services in implementation =
bluetooth ready, starting scan for peripherals [] and devices ["5FE04CBB"]
scannerready, starting scan for peripherals [6E400001-B5A3-F393-E0A9-E50E24DCCA9E, 2E938FD0-6A61-11ED-A1EB-0242AC120002] and devices ["5FE04CBB"]
⚡️ TO JS {"value":"scanning started"}
Replies: 2 · Boosts: 2 · Views: 1.4k · Last activity: Oct ’25
How to capture 48MP photos with the Ultra Wide camera on iPhone 16 Pro Max
I am working on capturing 48MP images using the iPhone 16 Pro Max with the Ultra Wide camera. I’ve updated the code to capture the maximum supported dimensions with the following snippet:

if #available(iOS 16.0, *) {
    photoOutput.maxPhotoDimensions = device.activeFormat.supportedMaxPhotoDimensions.last!
    photoSettings.maxPhotoDimensions = .init(width: 5712, height: 4284)
}

However, I’m still not getting the expected results. My goal is to capture 48MP images, and I want to confirm whether the Ultra Wide camera supports this resolution or whether I’m missing some other configuration. Any guidance would be appreciated!
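A defensive variant worth trying (a sketch, not a confirmed fix): log what the active ultra-wide format actually reports and use that instead of hardcoding. Note the arithmetic: 5712 × 4284 is about 24.5 MP, not 48 MP; a 48 MP capture is 8064 × 6048.

    let supported = device.activeFormat.supportedMaxPhotoDimensions
    // If no ~48 MP entry (8064 x 6048) appears here, the selected format
    // simply doesn't offer 48 MP on this camera.
    supported.forEach { print("supported: \($0.width) x \($0.height)") }
    if let largest = supported.max(by: { $0.width * $0.height < $1.width * $1.height }) {
        photoOutput.maxPhotoDimensions = largest
        photoSettings.maxPhotoDimensions = largest
    }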
Replies: 2 · Boosts: 2 · Views: 859 · Last activity: Sep ’25
To say I'm extremely hurt is an understatement! 😔
Voice Control Disabling System Services After Reboot

I recently learned from Apple Accessibility Support that the issue I’m experiencing with Voice Control is now affecting multiple users. When I first reported the problem, I appeared to be the first case, what you might call “patient zero.” I have provided extensive feedback and system logs, but now that the issue is more widespread, I have been told that I will not be informed of the cause or notified directly when a fix is found. Instead, updates will be released as solutions are identified, and support staff will not necessarily know the details of the underlying problem.

To summarize my experience: after enabling Voice Control and rebooting my MacBook Pro (14.2-inch, M4 chip), critical Apple system services, including FaceTime, Apple Music, and News, stop functioning. Dictation remains available, but it is not as accurate or effective for my needs as Voice Control. I rely on these accessibility features daily due to my disability and cerebral palsy, and this issue has persisted for over five months.

I have always valued contributing to the developer program and supporting Apple’s efforts to improve accessibility. However, I find it discouraging that there is no clear communication about the status of this issue or its resolution. My theory is that there may be a hardware interaction, perhaps between the neural engine and the new Wi-Fi chip, rather than a purely software problem.

I understand that some information may not be immediately available, but I believe that users who rely on accessibility features should be kept informed about major issues and their progress toward resolution. I appreciate the dedication of the accessibility and development teams, and I want to continue supporting Apple’s mission of inclusion.

Thank you for your attention to this matter. Sincerely, Donald Spencer Kirby, Dayton, Ohio
Replies: 2 · Boosts: 0 · Views: 178 · Last activity: Jun ’25
Custom tab bar in SwiftUI
I made a (very simple) custom tab bar in SwiftUI. It's simply an HStack containing two buttons. These buttons control the selection of a paged TabView. This works well, but in VoiceOver they don't behave like the bottom tab bar or e.g. a segmented picker. Specifically, VoiceOver does not say something like "tab one of two" when the first button is focused. According to my research, in UIKit this can be accomplished by giving the container view the accessibility trait tabBar, hiding it as an accessibility element and give it the accessibility container type semanticGroup. In SwiftUI, there is also the trait isTabBar, but that does not seem to have any impact for VoiceOver. I don't see an equivalent of semanticGroup in SwiftUI. I tried accessibilityElement(children: .contain) but that also does not seem to have any impact. So, is there any way in SwiftUI to make a button behave like a tab-button in VoiceOver? And how is SwiftUI's isTabBar accessibility trait supposed to be used?
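There doesn't appear to be a SwiftUI equivalent of semanticGroup. One workaround sketch (my own approximation, not an official API): mark the selected button with the .isSelected trait and announce the position manually via a hint:

    import SwiftUI

    struct CustomTabBar: View {
        @Binding var selection: Int
        let titles = ["First", "Second"]

        var body: some View {
            HStack {
                ForEach(titles.indices, id: \.self) { index in
                    Button(titles[index]) { selection = index }
                        // Approximates "tab N of M": VoiceOver reads the hint
                        // after the label, and .isSelected marks the active tab.
                        .accessibilityAddTraits(selection == index ? .isSelected : [])
                        .accessibilityHint("Tab \(index + 1) of \(titles.count)")
                }
            }
            .accessibilityElement(children: .contain)
        }
    }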
Replies: 2 · Boosts: 0 · Views: 334 · Last activity: Aug ’25
Keyboard navigation not working in native iOS Wallet interface
Hi guys, I'm facing an issue with the native interface for adding a card to Wallet. Does someone have ideas on how to fix or work around it?

STEPS TO REPRODUCE:
1. Disable VoiceOver (Settings → Accessibility → VoiceOver → Off).
2. Connect an external keyboard and confirm that you can navigate other iOS interfaces with it.
3. In any app, present a PKAddPassesViewController with a valid .pkpass file.
4. When the Wallet “Add Pass” sheet appears, attempt to navigate using only the external keyboard (Tab/Arrow/Enter).
5. Observe that focus does not move to the Cancel or Add buttons, and no elements receive keyboard focus.

EXPECTED RESULT: All interactive elements in PKAddPassesViewController (e.g., Cancel and Add) should be fully keyboard accessible without requiring VoiceOver. Users should be able to navigate, select, and complete actions using only a hardware keyboard.

ACTUAL RESULT: Keyboard navigation is not possible. No elements receive focus. Users cannot activate the Cancel or Add buttons using keyboard input. The only way to interact is by touch or by enabling VoiceOver, which does not satisfy keyboard accessibility requirements.

IMPACT: Violates WCAG 2.1 Success Criterion 2.1.1 (Keyboard Accessible). Prevents keyboard-only users (including users with motor disabilities) from adding passes to Wallet. Affects users of external keyboards who rely on Tab/arrow navigation. Creates an inconsistent accessibility experience compared to other iOS system modals.
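For anyone trying to reproduce step 3 above, a minimal sketch of presenting the sheet; passData is assumed to be the raw contents of a valid .pkpass file:

    import PassKit
    import UIKit

    func presentAddPass(from presenter: UIViewController, passData: Data) {
        // PKPass(data:) validates the pass bundle; the resulting system sheet
        // shows the Cancel and Add buttons that never receive keyboard focus.
        guard let pass = try? PKPass(data: passData),
              let addPassesVC = PKAddPassesViewController(pass: pass) else { return }
        presenter.present(addPassesVC, animated: true)
    }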
Replies: 2 · Boosts: 0 · Views: 1.4k · Last activity: Aug ’25
Having trouble with the Accessibility API of the ApplicationServices framework
After replacing macOS Big Sur 11.0 with the latest 11.5, my app's AXObserverAddNotification methods fail. Here is sample code I tested from Stack Overflow: https://stackoverflow.com/questions/853833/how-can-my-app-detect-a-change-to-another-apps-window

AXUIElementRef app = AXUIElementCreateApplication(82695); // the pid for the front-running Xcode 12.5.1
CFTypeRef frontWindow = NULL;
AXError err = AXUIElementCopyAttributeValue(app, kAXFocusedWindowAttribute, &frontWindow);
if (err != kAXErrorSuccess) {
    NSLog(@"failed with error: %i", err);
}
NSLog(@"app: %@  frontWindow: %@", app, frontWindow);

The 'frontWindow' reference is never created, and I get error number -25204. It seems as if the latest Big Sur 11.5 has revised the Accessibility API, or perhaps there is some permission switch I am unaware of that would make things work. What am I doing wrong?
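One thing worth ruling out before assuming an API change: macOS updates sometimes reset the Accessibility trust grant, and an untrusted process gets errors back from these copy calls. A minimal Swift sketch of the check; the same AXIsProcessTrusted call is available from Objective-C:

    import ApplicationServices

    // If this prints false, re-add the app under System Preferences >
    // Security & Privacy > Privacy > Accessibility and relaunch it.
    print("Trusted for Accessibility:", AXIsProcessTrusted())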
Replies: 2 · Boosts: 0 · Views: 802 · Last activity: Jun ’25