Explore best practices for creating inclusive apps that cater to users with diverse abilities


Posts under General subtopic

Post

Replies

Boosts

Views

Activity

Seeking API Support for Marking Substrings as Headings in NSTextView for VoiceOver
I'm developing a document editor for macOS using AppKit, which supports structured content such as titles and multiple heading levels—similar to what you see in the Pages app. I'm looking for a way to programmatically mark a specific substring within an NSTextView as a heading, so that VoiceOver can recognize it and announce it appropriately (e.g., by saying “heading” before reading the text). This would be similar in spirit to how NSAccessibilityLinkTextAttribute works for links. Is there an existing accessibility text attribute or recommended approach to achieve this behavior for headings? If not, I’d appreciate any guidance or suggestions on how best to implement this in a VoiceOver-friendly way. Thank you in advance for your help! Best regards,
0
0
107
May ’25
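As far as I know there is no documented heading counterpart to NSAccessibilityLinkTextAttribute, so the sketch below only covers the app-side bookkeeping such an editor needs anyway: marking heading ranges with a custom attribute in its text storage and styling them. The attribute key, function name, and font choice are hypothetical app-level choices, not AppKit API; how to surface those ranges to VoiceOver remains the open question of the post.

import AppKit

// Hypothetical app-level attribute for heading ranges (not an AppKit accessibility key).
extension NSAttributedString.Key {
    static let editorHeadingLevel = NSAttributedString.Key("com.example.editor.headingLevel")
}

// Record the heading level on a range and give it a distinct visual style,
// similar to what Pages does. Exposing this to VoiceOver is the unanswered part.
func markHeading(level: Int, in range: NSRange, of textView: NSTextView) {
    guard let storage = textView.textStorage else { return }
    storage.addAttribute(.editorHeadingLevel, value: level, range: range)
    storage.addAttribute(.font,
                         value: NSFont.boldSystemFont(ofSize: 24 - CGFloat(level) * 2),
                         range: range)
}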
accessibilityRespondsToUserInteraction returns true on Simulator but false on device
I am seeing a strange issue where NSObject's accessibilityRespondsToUserInteraction returns true on Simulator but false on device. Checking the same object on the Simulator with Accessibility Inspector, I see the object's traits reported as image, so why would it return true in that case? Is there any other way to check whether an item responds to user interaction or is clickable besides that property and the traits? (Or is it just another bug?)
1
0
95
Jun ’25
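A minimal sketch of the kind of fallback check asked about above: treat an element as actionable if either accessibilityRespondsToUserInteraction says so or its traits include an interactive trait. It does not explain the Simulator vs. device discrepancy, and the particular trait set chosen here is an assumption.

import UIKit

// Fallback heuristic: the system property, or an explicitly interactive trait.
func elementLooksInteractive(_ element: NSObject) -> Bool {
    if element.accessibilityRespondsToUserInteraction { return true }
    let interactiveTraits: UIAccessibilityTraits = [.button, .link, .adjustable]
    return !element.accessibilityTraits.intersection(interactiveTraits).isEmpty
}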
Make Accessibility Focus move to UIPickerView when tapping on UITextField (Full Keyboard Access)
I have a UITextField in my application for entering a state. If I tap on it, a UIPickerView pops up and lets the user select a state (though they can still type too). The issue relates to Full Keyboard Access. If the UITextField is selected using an external keyboard, the UIPickerView appears, but in order to reach it the user has to tab through the whole view controller to get to the UIPickerView at the end. What would be nice is to either a) move focus directly to the UIPickerView (have it highlighted in blue and scrollable right away with the keyboard) or b) make the UIPickerView the next view that's reachable when tabbing over or using the arrow keys. I've tried:
- UIAccessibility notifications (both .screenChanged and .layoutChanged, with and without a delay). This ended up only announcing the view, but didn't help with Full Keyboard Access.
- Making the UIPickerView a first responder when it appears.
- Attempting to change the accessibilityElements order (but with so many views, and views within views, this isn't really a viable option either).
Pressing Tab plus the right-arrow key will quickly take the user to the end of the chain of accessibility elements, in other words, to the UIPickerView. But there has to be a cleaner way of automatically setting the focus to the UIPickerView or making it the next element reached by pressing the arrow key.
0
0
397
Mar ’25
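A sketch of the two programmatic nudges discussed in the post above, in one place: post a layoutChanged accessibility notification for assistive technologies, and ask the keyboard focus system to move to the picker. Whether the focus request is honored depends on the picker being focusable in that layout, which is exactly what the post is unsure about; the helper name is illustrative.

import UIKit

// Try to move both assistive-technology focus and keyboard focus to the picker
// when it appears. (Sketch of the approaches above, not a confirmed fix.)
func moveFocus(to picker: UIPickerView, in viewController: UIViewController) {
    // VoiceOver / Switch Control focus:
    UIAccessibility.post(notification: .layoutChanged, argument: picker)

    // Full Keyboard Access uses the focus system; the request only succeeds
    // if the picker (or something inside it) can become focused.
    if let focusSystem = UIFocusSystem.focusSystem(for: viewController) {
        focusSystem.requestFocusUpdate(to: picker)
    }
}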
When using UIScrollView and UITextView together, inserting a link causes the following text to disappear.
Since UITextView does not support zooming on its own, I add the UITextView as a subview of a UIScrollView and use the scroll view's zoom function. However, when the text contains a link, the text that follows it goes missing. For example, with https://appstoreconnect.apple.com/login followed by a line break and "Apple Developer Login", the text "Apple Developer Login" does not appear. If anyone has experienced the same problem or knows a solution, please leave a comment. Note: it works normally on iOS 16, but the text after the link disappears on iOS 18. The text is not visible, yet you can still copy and paste it to recover the missing content.
0
0
245
Feb ’25
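A minimal form of the setup described above, for anyone trying to reproduce the iOS 18 behavior: a non-editable UITextView added as the zoomable subview of a UIScrollView, with link detection enabled. Names and layout are illustrative.

import UIKit

// Repro sketch only: UITextView embedded in a UIScrollView so the text can be pinch-zoomed.
final class ZoomableTextViewController: UIViewController, UIScrollViewDelegate {
    private let scrollView = UIScrollView()
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        scrollView.frame = view.bounds
        scrollView.delegate = self
        scrollView.minimumZoomScale = 1.0
        scrollView.maximumZoomScale = 3.0
        view.addSubview(scrollView)

        textView.frame = scrollView.bounds
        textView.isEditable = false
        textView.dataDetectorTypes = [.link]   // turns the URL into a tappable link
        textView.text = "https://appstoreconnect.apple.com/login\nApple Developer Login"
        scrollView.addSubview(textView)
    }

    func viewForZooming(in scrollView: UIScrollView) -> UIView? {
        textView   // the text view is the zoomed content
    }
}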
App in Unlisted Language
I am building a language learning app for an unlisted primary language. Any suggestions or heads-ups? My plan is to select English and go with it. It's unfortunate that I have to list a language learning app incorrectly, and a tag for that language probably does not exist anywhere across the Apple system.
0
0
173
Jul ’25
Solo Developer User Feedback Avenues
I have a couple of follow-up questions after the "Accessibility technologies group lab". It was briefly mentioned that user feedback is an excellent way to grow inclusivity in the design of an app, and that utilizing these forums is one example. Is inviting folks here on the forum via TestFlight a reasonable approach to this for a solo developer? Are there other strategies, avenues, or examples for promoting user feedback?
2
0
98
Jun ’25
BLE Device Not Appearing in Scan List on iOS After Name Change
I'm encountering an issue related to BLE device discovery on iOS. I have a BLE peripheral device that I initially connected to using an iOS device. After this connection, the BLE device's advertised name was programmatically changed by the peripheral. Now, when I try to scan for this device using other iOS devices, it does not appear in the scan results in most apps — including nRF Connect and our own custom BLE app that uses CoreBluetooth. A few observations:
- The device is definitely powered on and advertising (confirmed via Android).
- The name change is reflected correctly on Android and on the iOS device that originally connected to it.
- Other iOS devices no longer see the device in their scan list.
1
0
338
Jul ’25
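A small CoreBluetooth diagnostic that can help narrow down the situation above: log every discovery with both the cached peripheral.name and the name carried in the advertisement packet, to see whether the affected iPhones receive the advertisement at all and, if so, under which name. The class name and scan options are illustrative.

import CoreBluetooth

// Log each discovery with the cached name and the advertised local name.
final class ScanLogger: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: nil,
                                   options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        let advertisedName = advertisementData[CBAdvertisementDataLocalNameKey] as? String
        print("cached name:", peripheral.name ?? "nil",
              "| advertised name:", advertisedName ?? "nil",
              "| RSSI:", RSSI)
    }
}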
SwiftUI tvOS Accessibility VoiceOver - prevent reading all items in ScrollView over and over
Hi, I'm trying to fix a tvOS view for the VoiceOver accessibility feature:

TabView { // 5 tabs
  Text(title)
  Button(play)
  ScrollView { // Live
    LazyHStack { 200 items }
  }
  ScrollView { // Continue watching
    LazyHStack { 500 items }
  }
}

When the view shows up, VoiceOver reads: "Home tab 1 of 5, Item 2" - not sure why it reads Item 2 of the first cell in the scroll view, maybe because it just got loaded by the LazyHStack. VoiceOver should only read "Home tab 1 of 5".
When moving focus to the scroll view it reads: "Live, Item 1" and after a slight delay "Item 1, Item 2, Item 3, Item 4".
When moving focus to the second item it reads: "Item 2" and after a slight delay "Item 1, Item 2, Item 3, Item 4".
When moving focus to the third item it reads: "Item 3" and after a slight delay "Item 1, Item 2, Item 3, Item 4".
It should just read what is focused, ideally "Live, Item 1, 1 of 200", then after moving focus to item 2 "Item 2, 2 of 200", this time without the word "Live" because we are in the same scroll view (the same horizontal list).
Currently the app is unusable. We have visually impaired testers, and this rotor reading everything on the screen is totally confusing, because users don't know where they are and what is actually focused. This is a video streaming app and we are streaming all the time, even on the home page in the background; it binge-plays one item after another, and usually there is a never-ending live stream playing - the user can switch the TV channel, but we continue to play. VoiceOver should only read what's focused after user interaction. The original Apple TV app does not do this, so it cannot be caused by some verbose accessibility setting; it reads only the focused item in scrolling lists.
How do I disable reading content that is not focused? I tried:
- .accessibilityLabel(isFocused ? title : "")
- .accessibilityHidden(!isFocused)
- .accessibilityHidden(true) - tried on various levels in the view hierarchy
- .accessibilityElement(children: .ignore) - even the focused item is not read back by VoiceOver
- .accessibilityElement(children: .contain) - tried on various levels in the view hierarchy
- .accessibilityElement(children: .combine) - tried on various levels in the view hierarchy
- .accessibilityAddTraits(.isHeader) - tried on various levels in the view hierarchy
- .accessibilityRemoveTraits(.isHeader) - tried on various levels in the view hierarchy (the last two were basically an attempt to hack it)
- .accessibilityRotor("", ranges: []) - another hack that I tried on the ScrollView, the LazyHStack, and also on the top-level view
- 50+ other attempts at configuring accessibility modifiers attached to views.
I have seen all the accessibility videos and tried all the sample code projects; I haven't found a solution anywhere, an internet search didn't find anything, and AI didn't help, as it can only provide code that someone else wrote before. Any idea how to fix this? Thanks.
1
0
132
Apr ’25
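For anyone who wants to reproduce the VoiceOver behavior, here is a minimal, self-contained form of one tab of the hierarchy described above, with item counts and section labels taken from the post. It is a repro sketch only, not a fix.

import SwiftUI

// Minimal "Home" tab content matching the structure in the post.
struct HomeTab: View {
    var body: some View {
        ScrollView(.vertical) {
            VStack(alignment: .leading) {
                Text("Live").font(.headline)
                ScrollView(.horizontal) {
                    LazyHStack {
                        ForEach(0..<200, id: \.self) { index in
                            Button("Item \(index + 1)") { }
                        }
                    }
                }
                Text("Continue watching").font(.headline)
                ScrollView(.horizontal) {
                    LazyHStack {
                        ForEach(0..<500, id: \.self) { index in
                            Button("Item \(index + 1)") { }
                        }
                    }
                }
            }
        }
    }
}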
ARKit Eye Tracking Calibration Issues - Word-Level Reading Tracking Feasibility
Hi Apple Developer Community, I'm developing an eye-tracking application in Xcode using ARKit's ARFaceTrackingConfiguration and ARFaceAnchor.blendShapes for gaze detection. I'm experiencing several calibration and accuracy issues and would appreciate insights from the community.

Current implementation:
- Using ARFaceAnchor.blendShapes (.eyeLookUpLeft, .eyeLookDownLeft, .eyeLookInLeft, .eyeLookOutLeft, etc.)
- Custom sensitivity curves and smoothing algorithms
- Baseline correction and coordinate mapping
- Quadratic regression for calibration point mapping

Issues I'm facing:
1. Calibration mismatch: the red dot doesn't align with where I'm actually looking; there is a significant offset between the intended gaze point and the actual cursor position; calibration seems to drift or become inaccurate over time.
2. Extreme eye-movement requirements: I need to make exaggerated eye movements to reach screen edges and corners; natural eye movements don't translate to proportional cursor movement; some screen regions are hard to reach even after calibration.
3. Sensitivity and stability issues: the cursor jitters or jumps around when looking at the center; there is too much sensitivity to micro-movements; behavior is inconsistent between calibration and normal operation.
4. I also noticed that tracking on the calibration screen, as well as on the reading screen, works better when there is head movement, but I do not want much head movement. I want tracking with normal eye movement while reading an ebook.

Primary question: word-level eye-tracking feasibility. Is word-level eye tracking (tracking gaze as users read through individual words in an ebook) technically feasible with current iPhone/iPad hardware? I understand that Apple's built-in eye tracking is primarily an accessibility feature for UI navigation. However, I'm wondering if the TrueDepth camera and ARKit's eye-tracking capabilities are sufficient for:
- Tracking natural reading patterns (left-to-right, line-by-line progression)
- Detecting which specific words a user is looking at
- Maintaining accuracy for sustained reading sessions (15-30 minutes)
- Working reliably across different users and lighting conditions

Questions for the community:
- Hardware limitations: are iPhone/iPad TrueDepth cameras capable of the precision needed for word-level tracking, or is this beyond current hardware capabilities?
- Calibration best practices: what calibration strategies have worked best for accurate gaze mapping? How many calibration points are typically needed?
- Reading-specific challenges: are there particular challenges when tracking reading behavior vs. general gaze tracking?
- Alternative approaches: are there better approaches than ARKit blend shapes for this use case?

Current setup: iPhone 14 Pro, iOS 18.3, latest available ARKit.

Any insights, experiences, or technical guidance would be greatly appreciated. I'm particularly interested in hearing from developers who have worked on similar eye-tracking applications or have experience with the limitations and capabilities of ARKit's eye tracking features. Thank you for your time and expertise!
0
0
696
Oct ’25
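A minimal sketch of the blend-shape pipeline the post describes: combine the per-eye look blend shapes into a rough gaze vector and low-pass filter it. The sign convention, the averaging of both eyes, and the smoothing factor are illustrative assumptions, and the sketch deliberately ignores head pose, which the post notes has a large effect on accuracy.

import ARKit
import UIKit

// Rough gaze estimate from eye blend shapes, with simple exponential smoothing.
final class GazeEstimator {
    private var smoothed = CGPoint.zero
    private let smoothing: CGFloat = 0.15   // 0 = frozen, 1 = raw (jittery); illustrative value

    func update(with anchor: ARFaceAnchor) -> CGPoint {
        let shapes = anchor.blendShapes
        func value(_ key: ARFaceAnchor.BlendShapeLocation) -> CGFloat {
            CGFloat(shapes[key]?.floatValue ?? 0)
        }

        // Convention: positive x = looking toward the user's right, positive y = looking up.
        let x = (value(.eyeLookInLeft) - value(.eyeLookOutLeft)
               + value(.eyeLookOutRight) - value(.eyeLookInRight)) / 2
        let y = (value(.eyeLookUpLeft) - value(.eyeLookDownLeft)
               + value(.eyeLookUpRight) - value(.eyeLookDownRight)) / 2

        // Low-pass filter to damp the jitter described in the post.
        smoothed.x += (x - smoothed.x) * smoothing
        smoothed.y += (y - smoothed.y) * smoothing
        return smoothed
    }
}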
Apple greets Global Accessibility Awareness Day with severe accessibility violations on macOS
I'm reposting here my FB17602742, submitted yesterday: The strong wording of this message comes from years of Apple ignoring the needs of users who can't tolerate UI animations and convulsions. At this point, it's clear that Apple is either intentionally harming users like me or simply doesn't care about meeting even the most basic accessibility standards on macOS. Yes, many UI animations and convulsions can, fortunately, be disabled - but not through straightforward UI controls. Instead, users are forced to look for obscure Terminal commands found scattered across the Internet. The "Reduce motion" checkbox in System Settings is simply a fake control that doesn't do anything - instead of actually disabling all UI animations and convulsions. What's worse, two of the most offensive UI animations cannot be disabled at all. Apple has consistently dismissed requests to let users disable the following UI animations:
1. The scroll bar rollover highlight effect (introduced in macOS 10.7.3). Every time the cursor passes over a scroll bar, it gets highlighted. This draws the user's attention to random scroll bars for no reason - just because the cursor happened to pass over them. It results in HUNDREDS of unnecessary, annoying events of distraction daily!
2. The expand/collapse animation of NSOutlineView (e.g., when opening/closing folders in the list view in the Finder, or in any other app using outline views). This animation is extremely distracting, irritating, and time-wasting.
Global Accessibility Awareness Day is approaching. Dear Apple, please adhere to the most basic accessibility standards. Stop the needless suffering of countless users like me. Let us disable the two aforementioned UI convulsions. Thank you for your attention to the issue.
0
0
153
May ’25
RTT call option and confirmation dialog missing when dialing emergency numbers
Hello, in our app we provide a button that initiates a phone call using tel://. For normal numbers, tapping the button presents the standard iOS confirmation sheet with Call and Cancel. If RTT is enabled on the device, the sheet instead shows three options: Call, Cancel, and RTT Call. However, when dialing a national emergency number, this confirmation dialog does not appear at all — the call is placed immediately, without giving the user the choice between voice or RTT. Is this the expected system behavior for emergency numbers on iOS? And if so, how does RTT get applied in the emergency-call flow — is it managed entirely by the OS rather than exposed as a user-facing option? Thanks in advance for clarifying.
2
0
629
Sep ’25
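For reference, the call-initiation path described in the post above is typically just a tel:// URL handed to UIApplication; once the URL is opened, whether the Call / RTT Call sheet appears (or the call is placed immediately for an emergency number) is decided entirely by the system. A minimal sketch, with an illustrative helper name and number:

import UIKit

// Hand a tel:// URL to the system. The confirmation sheet, the RTT Call option,
// and the immediate dialing of emergency numbers are all system behavior,
// not something the app controls from here.
func dial(_ number: String) {
    // e.g. dial("5551234567") with an illustrative, non-emergency number
    guard let url = URL(string: "tel://\(number)") else { return }
    UIApplication.shared.open(url)
}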
Apple Developer Program Purchase Not Saving Progress
I’m trying to enroll in the Apple Developer Program as an individual. I’ve gone through the steps on the website and started the purchase process. However, after a couple of days when I return to the site, it doesn’t remember my progress — I have to start the enrollment from scratch every time. Is this expected behavior? Am I missing a step to save my progress or complete the enrollment properly? Any help or guidance would be appreciated. Thank you!
0
0
161
Jun ’25
Apple Vision Pro - Homonymous Hemianopia
Individuals who have had a stroke can end up with vision impairments, specifically homonymous hemianopia, which basically means the individual has lost sight in (for example) the left half of both eyes. I'm interested in understanding whether it would be possible to help individuals with this vision impairment by providing an accessibility configuration within the Apple Vision Pro which would first determine an individual's field of view (possibly by showing a field of dots across the entire "screen" and having the individual look at each dot and click). Based on the results, this measured field of view would determine how the screen is presented to the user moving forward. My mom (82 years old) had a stroke recently and was diagnosed with homonymous hemianopia. She lived on her iPhone and would love to get back the ability to text message, use Facebook, and order items from Amazon. Please advise if you believe the Apple Vision Pro would be capable of helping in this area with the suggested development, or share other thoughts.
1
0
492
Jan ’25
App Store Connect – “Unable to Handle This Request” Error
Hello, I'm currently unable to access App Store Connect. When I try to open https://appstoreconnect.apple.com, I receive the following error message: "appstoreconnect.apple.com is currently unable to handle this request." I've tried the following steps, but the issue persists:
- Cleared browser cache and cookies
- Tried different browsers (Safari, Chrome)
- Attempted from multiple devices and networks
Is this a known issue, or is there any workaround available? I would appreciate any help or an update on the current status. Thank you,
0
0
107
Jun ’25
Verifying braille output in an iOS app without a physical braille device?
I'm developing a calculator app and working to ensure a great experience for both VoiceOver and Braille display users. For expressions like (2+3)×5, I need two different accessibility outputs:
- VoiceOver (spoken): a descriptive string like “left paren two plus three right paren times five,” provided via .accessibilityValue. I'm using a custom spellOut function since VoiceOver doesn't announce parentheses—which are kind of important when doing math!
- Braille (symbolic): the literal math string (2+3)×5, provided using .accessibilityCustomContent("", ...), with an empty label so it’s not spoken aloud.
The issue: I don’t have access to a Braille display device, and Xcode’s Accessibility Inspector doesn’t seem to show the custom content. Is there any way to confirm that custom Braille content is being set correctly in the Simulator or with other tools? Or… is there a "math mode" in VoiceOver that forces it to announce parentheses? Any advice or workarounds would be much appreciated! Thanks, Uhl
8
0
323
Jul ’25
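A minimal sketch of the dual-output setup the post describes, in SwiftUI: the spelled-out string goes to .accessibilityValue and the literal expression goes to .accessibilityCustomContent with an empty label. The view name and the simplified spellOut table are illustrative stand-ins for the poster's own code, not a verified braille-testing technique.

import SwiftUI

// One expression with a spoken value for VoiceOver and the literal symbols as
// custom content (intended for a braille display).
struct ExpressionView: View {
    let expression: String            // e.g. "(2+3)×5"

    var body: some View {
        Text(expression)
            .accessibilityValue(spellOut(expression))        // "left paren two plus three ..."
            .accessibilityCustomContent("", expression)      // literal symbols, empty label
    }

    // Simplified stand-in for the custom spellOut function mentioned in the post.
    private func spellOut(_ expression: String) -> String {
        let words: [Character: String] = ["(": "left paren", ")": "right paren",
                                          "+": "plus", "×": "times",
                                          "2": "two", "3": "three", "5": "five"]
        return expression.map { words[$0] ?? String($0) }.joined(separator: " ")
    }
}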