Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under Accessibility & Inclusion topic

Post

Replies

Boosts

Views

Activity

SwiftUI PhotosPicker accessibility issue
iOS 18.3.1, iPhone 16 Pro. I pick photos from the user's photo library with a connected physical keyboard, using: .photosPicker(isPresented: $viewModel.isImagePickerPresented, selection: $viewModel.selectedImageItem, matching: .images) When the picker appears, accessibility focus moves to the Dynamic Island instead of the Cancel button, and there is no way to navigate the photos picker by keyboard without tapping the view to move focus into it manually. I noticed the same behavior in the Notes app.
1
0
526
Mar ’25
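For reference, a minimal sketch of the picker setup from the post above; the view model is condensed into local state, and everything beyond the modifier call itself is an assumption:

    import SwiftUI
    import PhotosUI

    struct PickerReproView: View {
        // Condensed stand-ins for the post's view model properties.
        @State private var isImagePickerPresented = false
        @State private var selectedImageItem: PhotosPickerItem?

        var body: some View {
            Button("Choose photo") {
                isImagePickerPresented = true
            }
            // With a hardware keyboard attached, accessibility focus reportedly
            // lands on the Dynamic Island instead of the picker's Cancel button.
            .photosPicker(isPresented: $isImagePickerPresented,
                          selection: $selectedImageItem,
                          matching: .images)
        }
    }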
VoiceOver spells word letter by letter
We currently have an odd issue with VoiceOver spelling a word letter by letter while the same word is spoken as a whole for other items. The app is in German. I have a View in SwiftUI whose button traits are removed, then a label "Start Tab 1 von 5" is added. "Tab" is spoken as a whole word here, all fine. If I change the label to "Tab-Schaltfläche" or, for example, "SimplyGo Tab 3 von 5", then "Tab" is spoken as "T A B", letter by letter. Is there a way to force VoiceOver to speak it as a whole word?
4
0
1.2k
Jan ’25
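One avenue worth trying for the post above (hedged, not a confirmed fix): SwiftUI's speechSpellsOutCharacters(_:) modifier (iOS 15 and later) hints to VoiceOver whether text should be read character by character. A minimal sketch:

    import SwiftUI

    struct TabLabel: View {
        var body: some View {
            Text("SimplyGo Tab 3 von 5")
                // Hint that the contents should not be spoken letter by letter.
                // This is a hint; it may not override VoiceOver's own heuristic
                // for short words like "Tab".
                .speechSpellsOutCharacters(false)
        }
    }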
How to Receive Callbacks for UIAccessibilityAction Methods Like accessibilityPerformMagicTap()?
I’ve tried implementing the accessibilityPerformMagicTap() method in a specific UIViewController, its view, and even in AppDelegate, but I am not receiving any callbacks. I directly overrode this method in the mentioned areas, but it never gets triggered when performing a magic tap. How can I properly observe and handle the accessibilityPerformMagicTap() action?
3
0
482
Mar ’25
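For context, a minimal sketch of the override from the post above. As documented, the system delivers the magic tap to the accessibility-focused element first and then walks up the responder chain, and the callback only fires while VoiceOver is running; the controller and action names here are hypothetical:

    import UIKit

    final class PlaybackViewController: UIViewController {
        // Invoked by a two-finger double-tap while VoiceOver is on, provided
        // this controller is in the responder chain of the focused element.
        override func accessibilityPerformMagicTap() -> Bool {
            togglePlayback()
            return true // report the gesture as handled
        }

        private func togglePlayback() {
            // hypothetical app-specific action
        }
    }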
AssistiveTouch pointer cannot move past center of screen in landscape orientation on iPhone
Hello everyone, I’d like to report an issue I’ve encountered when using a Bluetooth mouse together with AssistiveTouch on iPhone running iOS 16.5. This has also been reported via Feedback Assistant with Feedback ID: FB17806167

Description: When using a Bluetooth mouse together with AssistiveTouch on iPhone (iOS), the pointer behaves incorrectly in landscape orientation. Specifically:
- The pointer cannot move past the center of the screen
- Horizontal and vertical (X/Y) movements appear to be swapped or misaligned
- Natural movement of the pointer is not possible
It seems as if the internal coordinate mapping remains locked in portrait orientation, even when the device is physically rotated to landscape. This issue occurs system-wide, regardless of the current app. It is observable in Settings, on the Home screen, and in third-party apps.

Steps to Reproduce:
1. Enable AssistiveTouch
2. Connect a Bluetooth mouse to the iPhone
3. Rotate the device to landscape orientation
4. Try moving the mouse pointer across the screen
Notice that the pointer cannot move past the center, and horizontal/vertical input is interpreted incorrectly (as if still in portrait).

Expected Behavior: The mouse pointer should move across the entire screen correctly, regardless of device orientation.

Actual Behavior: In landscape orientation, the pointer is either restricted to part of the screen or misaligned. It behaves as if the device is still in portrait: horizontal mouse movement causes vertical pointer movement, and vice versa. The user experience feels broken and unintuitive.

Feature Suggestion: Please improve the synchronization between physical device orientation and AssistiveTouch pointer mapping on iOS. I also suggest exposing AssistiveTouch orientation control via a public API, so developers can help maintain consistent pointer behavior.

Thanks in advance for any insights or suggestions. Best regards, Jannis
2
0
288
Jun ’25
PerformAccessibilityAudit and sufficientElementDescription clarification
Hi, I am writing in the hope of receiving some clarification about the rationale of the audit type sufficientElementDescription, in the context of the Accessibility Audit API. Please see my test below, and another example in the context of Xcode, where the strings visible in the UI are also set as the accessibility labels of their respective elements. Thanks for your help!
1
0
513
Jan ’25
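For context, a minimal UI-test sketch of the audit API the post above asks about (Xcode 15 and later); the issue handler is one way to probe what sufficientElementDescription actually flags. Test and class names are hypothetical:

    import XCTest

    final class DescriptionAuditTests: XCTestCase {
        func testSufficientElementDescriptions() throws {
            let app = XCUIApplication()
            app.launch()

            // Restrict the audit to the element-description check and log each issue.
            try app.performAccessibilityAudit(for: .sufficientElementDescription) { issue in
                print("Audit issue: \(issue.compactDescription)")
                return false // false = report the issue; true = ignore it
            }
        }
    }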
Microphone Not Working When Running Unity Vision Pro App Normally
    // Start listening to the microphone
    public void StartListening()
    {
        if (!isListening)
        {
            try
            {
    #if UNITY_IOS || UNITY_TVOS
                microphoneInput = Microphone.Start(null, true, 10, 44100);
    #else
                // Use 16,000 Hz instead of 44,100
                microphoneInput = Microphone.Start(null, true, 10, 16000);
                if (microphoneInput == null)
                {
                    microphoneInput = Microphone.Start(null, true, 10, AudioSettings.outputSampleRate);
                }
    #endif
                isListening = true;
                Debug.Log(Microphone.devices.Length + " - Started listening...");
                debugText.text = Microphone.devices.Length + " - Started listening...";
            }
            catch (System.Exception e)
            {
                Debug.LogError($"Starting microphone failed: {e.Message}");
                debugText.text = $"Starting microphone failed: {e.Message}";
            }
        }
    }

    void Update()
    {
        if (isListening && microphoneInput != null)
        {
            // Analyze the audio for voice activity
            float volume = GetAverageVolume();
            if (volume > detectionThreshold)
            {
                Debug.Log("User is speaking!");
                lastVoiceTime = Time.time;
                SoundDetected = true;
            }
            else if (Time.time - lastVoiceTime > silenceDuration)
            {
                // Silence check sits in the else branch so it can fire when volume is low.
                Debug.Log("User is silent.");
                debugText.text = volume.ToString() + " - User is silent.";
            }
            slider.value = volume;
        }
    }

    private float GetAverageVolume()
    {
        float[] samples = new float[128];
        microphoneInput.GetData(samples, Microphone.GetPosition(null));
        float sum = 0f;
        foreach (float sample in samples)
        {
            sum += Mathf.Abs(sample);
        }
        return sum / samples.Length;
    }

Problem: When I build and run the app from Xcode, the microphone works fine and I receive input. However, when running the app normally (outside of Xcode), I can’t seem to access the microphone; the debug logs indicate no microphone is detected.

Question: Is there any additional configuration needed for the microphone to work in a normal (non-Xcode) run on Vision Pro, or any common issues that could cause microphone access to fail in this scenario? Thanks in advance for any insights! Best, Siddharth
2
0
456
Feb ’25
System clock ANCS notification is not sent on iPhone 14 and later
I'm working on a BLE-connected device that uses ANCS and the system clock to receive alarm notification events for hearing-impaired people. It worked up to the iPhone 13 with the latest iOS 18.x. From the iPhone 14 onward (iOS 18.x), the system clock alarm notification is not sent anymore. Is there any reason for this to happen? Is anyone aware of this behaviour? Any suggestion would be really appreciated. Cheers
2
0
166
Jun ’25
macOS Sequoia support for VoiceOver AppleScript automation
We are unable to programmatically enable AppleScript automation for VoiceOver on macOS 15 (Sequoia). In macOS 15, Apple moved the VoiceOver configuration from:

~/Library/Preferences/com.apple.VoiceOver4/default.plist

to a sandboxed path:

~/Library/Group Containers/group.com.apple.VoiceOver/Library/Preferences/com.apple.VoiceOver4/default.plist

Steps to Reproduce:
1. Use a macOS 15 (ARM64) machine (or a GitHub Actions runner image with macOS 15 ARM).
2. Open VoiceOver: open /System/Library/CoreServices/VoiceOver.app
3. Set the SCREnableAppleScript flag to true in the new sandboxed .plist: plutil -replace SCREnableAppleScript -bool true ~/Library/Group\ Containers/group.com.apple.VoiceOver/Library/Preferences/com.apple.VoiceOver4/default.plist
4. Confirm csrutil status is either disabled or not enforced.
5. Attempt to control VoiceOver via AppleScript (e.g., using osascript voiceOverPerform.applescript).
6. Observe that the AppleScript command fails with no useful output (exit code 1), and VoiceOver does not respond to automation.
3
0
206
Jun ’25
macOS (Tahoe) Beta 26.0 — Overheating & Rapid Battery Drain on MacBook Pro M3
Hi everyone, after installing the macOS beta (Tahoe 26.0) on my MacBook Pro M3, I’ve noticed two issues:
- Significant increase in system temperature: the laptop feels hot even with light usage like Safari and Figma.
- Rapid battery drain: the battery is dropping unusually fast compared to macOS Sonoma.
I’ve tried restarting the device. I’m aware this is a beta, but I’m wondering: is anyone else experiencing this? Is it a known issue? I’d love to hear if others are facing similar problems or if it might be something specific to my setup. Thanks in advance!
3
0
483
Jun ’25
VoiceOver cursor focus tracking
In some places of our app we make use of NSAccessibilityElement subclasses to vend some extra items to accessibility clients. We need to know which item has the VoiceOver focus so we can keep track of it. setAccessibilityFocused: does not get called when accessibility clients focus NSAccessibilityElements; this method is only called when accessibility clients focus view-based accessibility elements (i.e. when an NSView subclass gets focused). At the same time, we need to programmatically move VoiceOver focus to those items when something happens. Those accessibility elements inherit from NSObject, so we can't make them first responder.

Is this the expected behavior? What are our options in terms of reacting to the VoiceOver cursor moving around? What are our options in terms of programmatically moving the VoiceOver cursor to a different element?

Here's a sample project that demonstrates the first part of the issue: https://github.com/vendruscolo/apple-rdars/tree/master/DTS12368714%20-%20NSAccessibilityElement%20focus%20tracking

If you run the app, a window will show up. It contains a button and a red square. If you enable VoiceOver you'll be able to move the cursor over the red square, and a message will be logged. You'll also notice there's an extra element after the red square. That element is available to VoiceOver; however, when it gets focused, no message gets logged.
4
0
513
Mar ’25
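A condensed sketch of the setup the post above describes, plus a commonly suggested (but, as the post notes, unreliable for non-view elements) way to ask for focus; class and function names are hypothetical:

    import AppKit

    final class SquareElement: NSAccessibilityElement {
        // Reportedly called only for view-backed elements, not for
        // NSAccessibilityElement instances like this one.
        override func setAccessibilityFocused(_ focused: Bool) {
            super.setAccessibilityFocused(focused)
            print("Square focused: \(focused)")
        }
    }

    // Hedged attempt at nudging the VoiceOver cursor toward an element;
    // not confirmed to work for NSAccessibilityElement subclasses.
    func moveVoiceOverCursor(to element: NSAccessibilityElement) {
        NSAccessibility.post(element: element, notification: .focusedUIElementChanged)
    }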
To say I'm extremely hurt is an understatement! 😔
Voice Control Disabling System Services After Reboot

I recently learned from Apple Accessibility Support that the issue I’m experiencing with Voice Control is now affecting multiple users. When I first reported the problem, I appeared to be the first case—what you might call “patient zero.” I have provided extensive feedback and system logs, but now that the issue is more widespread, I have been told that I will not be informed of the cause or notified directly when a fix is found. Instead, updates will be released as solutions are identified, and support staff will not necessarily know the details of the underlying problem.

To summarize my experience: after enabling Voice Control and rebooting my MacBook Pro (14.2-inch, M4 chip), critical Apple system services—including FaceTime, Apple Music, and News—stop functioning. Dictation remains available, but it is not as accurate or effective for my needs as Voice Control. I rely on these accessibility features daily due to my disability and cerebral palsy, and this issue has persisted for over five months.

I have always valued contributing to the developer program and supporting Apple’s efforts to improve accessibility. However, I find it discouraging that there is no clear communication about the status of this issue or its resolution. My theory is that there may be a hardware interaction—perhaps between the neural engine and the new Wi-Fi chip—rather than a purely software problem. I understand that some information may not be immediately available, but I believe that users who rely on accessibility features should be kept informed about major issues and their progress toward resolution.

I appreciate the dedication of the accessibility and development teams, and I want to continue supporting Apple’s mission of inclusion. Thank you for your attention to this matter. Sincerely, Donald Spencer Kirby, Dayton, Ohio
2
0
179
Jun ’25
Handling Keyboard Hotkeys and Shortcuts across Multiple Languages
We have a requirement to manage the shortcuts and hotkeys in our application, make them intuitive, and fully support multiple languages. Our current understanding is that most universal shortcuts and hotkeys on macOS/iOS are expressed using English/Latin characters; when a purely non-Latin physical or virtual keyboard is the input device, we are unclear how the user would invoke such a hotkey. In environments where the keyboard has no Latin characters at all, managing shortcuts and hotkeys becomes a rather difficult task. Taking a very simple example, the shortcut for printing a page is Command/Control + 'P'. This can be an issue on non-English keyboards such as Arabic, where not only is there no letter P, there is also no equivalent phonetic character, since the language itself does not have that sound. Also, when we want a hotkey to be customizable by the user, how would the user express which key combination they want for a given action? Given these constraints, in order to provide the most comprehensive and optimal experience for users in their own language, what does Apple recommend for hotkey/shortcut support in non-Latin languages?
1
0
418
2w
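On iOS and iPadOS, at least part of this is handled by the system: UIKeyCommand has an allowsAutomaticLocalization property (iOS 15 and later) that lets the system remap a Latin input such as "p" onto non-Latin keyboard layouts. A hedged sketch follows; whether this fully covers keyboards with no Latin equivalents, as in the Arabic example above, is exactly the open question:

    import UIKit

    final class DocumentViewController: UIViewController {
        override var keyCommands: [UIKeyCommand]? {
            let printCommand = UIKeyCommand(title: "Print",
                                            action: #selector(printPage),
                                            input: "p",
                                            modifierFlags: .command)
            // Ask the system to remap Cmd+P for non-Latin keyboard layouts
            // (this is already the default value).
            printCommand.allowsAutomaticLocalization = true
            return [printCommand]
        }

        @objc private func printPage() {
            // hypothetical print action
        }
    }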
Accessibility full keyboard access issue.
In our application we are using UIAlertController. When accessibility Full Keyboard Access is enabled and we try to dismiss that alert controller with the Esc key on an external keyboard, nothing happens. We are presenting the alert controller as a popover, and we need to dismiss it with an Esc key press from the external keyboard.
2
0
554
Mar ’25
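One hedged workaround idea for the post above, not a confirmed solution: UIAlertController is documented as not supporting subclassing, but addKeyCommand(_:) can be called on an instance, and the action then resolves up the responder chain to the presenting controller. Whether Full Keyboard Access routes Esc this way for popover presentations is the uncertain part:

    import UIKit

    extension UIViewController {
        func presentAlertDismissableByEscape(_ alert: UIAlertController) {
            // Esc resolves via the responder chain to dismissPresentedAlert below.
            alert.addKeyCommand(UIKeyCommand(input: UIKeyCommand.inputEscape,
                                             modifierFlags: [],
                                             action: #selector(dismissPresentedAlert)))
            present(alert, animated: true)
        }

        @objc func dismissPresentedAlert() {
            presentedViewController?.dismiss(animated: true)
        }
    }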
Issue with “Vocal Shortcuts” disabling after one day of use
Hello Apple Community, I'm experiencing an issue with "Vocal Shortcuts" on iOS. I created a trigger in Vocal Shortcuts to run a specific shortcut, and it works perfectly on the first day. However, by the next day, the voice command stops functioning entirely. To make it work again, I have to disable and then re-enable Vocal Shortcuts in the settings. I've tested this on multiple devices (iPhone 11, iPhone 13, and iPhone X), all running the latest iOS version, and the same problem occurs on each one. Is there any additional configuration needed, or could this be a bug? Any advice or insights would be greatly appreciated! Thank you in advance,
2
0
666
Jan ’25
AVSpeechSynthesisProviderVoice audioFileSettings field
Hello, AVSpeechSynthesisVoice has an audioFileSettings attribute:

    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(identifier: voiceSelected!)
    print("- voice \(utterance.voice!.audioFileSettings)")
    // ["AVLinearPCMIsBigEndianKey": 0, "AVLinearPCMIsFloatKey": 1, "AVLinearPCMIsNonInterleaved": 1, "AVNumberOfChannelsKey": 1, "AVSampleRateKey": 22050, "AVFormatIDKey": 1819304813, "AVLinearPCMBitDepthKey": 32]

It is declared in AVSpeechSynthesisVoice:

    @available(iOS 13.0, *)
    open var audioFileSettings: [String : Any] { get }

    @available(iOS 17.0, *)
    open var voiceTraits: AVSpeechSynthesisVoice.Traits { get }

How can we specify the audioFileSettings attributes in an AVSpeechSynthesisProviderVoice? There is no such field in AVSpeechSynthesisProviderVoice:

    open var name: String { get }
    open var identifier: String { get }
    open var primaryLanguages: [String] { get }
    open var supportedLanguages: [String] { get }
    open var voiceSize: Int64
    open var version: String
    open var gender: AVSpeechSynthesisVoiceGender
    open var age: Int

Regards
2
0
139
Mar ’25
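As far as I can tell, a provider voice does not carry audioFileSettings; with a speech synthesis provider, the PCM format is instead advertised by the audio unit's output bus. A sketch of that idea, assuming a typical AUAudioUnit subclass setup; everything beyond the framework types is an assumption:

    import AVFAudio
    import AudioToolbox

    final class MyProviderAudioUnit: AVSpeechSynthesisProviderAudioUnit {
        private var outputBusArray: AUAudioUnitBusArray!

        override init(componentDescription: AudioComponentDescription,
                      options: AudioComponentInstantiationOptions = []) throws {
            try super.init(componentDescription: componentDescription, options: options)
            // The output bus format plays the role audioFileSettings plays for
            // system voices: it declares sample rate, channel count, and so on.
            let format = AVAudioFormat(standardFormatWithSampleRate: 22_050, channels: 1)!
            outputBusArray = AUAudioUnitBusArray(audioUnit: self,
                                                 busType: .output,
                                                 busses: [try AUAudioUnitBus(format: format)])
        }

        override var outputBusses: AUAudioUnitBusArray { outputBusArray }
    }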