Explore best practices for creating inclusive apps for users of Apple accessibility features and users from diverse backgrounds.

Posts under Accessibility & Inclusion topic

Post · Replies · Boosts · Views · Activity

Apple Developer Program Purchase Not Saving Progress
I’m trying to enroll in the Apple Developer Program as an individual. I’ve gone through the steps on the website and started the purchase process. However, after a couple of days when I return to the site, it doesn’t remember my progress — I have to start the enrollment from scratch every time. Is this expected behavior? Am I missing a step to save my progress or complete the enrollment properly? Any help or guidance would be appreciated. Thank you!
0 replies · 0 boosts · 161 views · Jun ’25
Significant differences in image-quality parameters when switching between multiple cameras
After switching, the two feeds differ noticeably in brightness, color saturation, contrast, and other image-quality parameters, which fragments the visual experience, breaks the continuity of the livestream, and hurts viewer immersion. Requests:
· Benchmark the Vision Pro's picture brightness and color reproduction against the image quality of a conventional DSLR used for livestreaming.
· Provide custom image-quality adjustment (brightness, contrast, color temperature, etc.) on the device or in companion software, with support for manual calibration before a livestream, so the picture stays consistent with the DSLR camera's look.
0 replies · 0 boosts · 93 views · 1w
Clarification on Color Path Determination in Wallet Provisioning (Green, Yellow, Orange) Path Recommendation
Hi, I’ve been reviewing the Apple Wallet provisioning documentation (Getting Started with Apple Pay In-App Provisioning: Verification, Security, Wallet Extensions) and had a few questions regarding the color path recommendation (Green, Yellow, Orange, Red) returned during the in-app provisioning flow:
Who determines the color path: Apple directly, the Payment Network Operator (PNO), or both?
What criteria are used to determine the color path (e.g., device info, Apple ID reputation, past provisioning attempts)?
At what point in the provisioning flow is the color path recommendation received? Is it included in the response after the PKAddPaymentPassRequest is submitted? Is it accessible through any specific property or callback in the delegate method?
Additionally, for the Orange Path with Reason Code 0G, I understand that in-app verification is not allowed and must be handled via tenured channels (e.g., SMS/email). Can you confirm whether this logic still applies for requests initiated from within the issuer's iOS app?
Would appreciate any clarification or pointers to related documentation.
0 replies · 0 boosts · 121 views · May ’25
Allow Mobile Data switching
There is no way to set the Allow Mobile Data Switching option. I have the latest update, but it still does not work. I noticed this when I went to another country and could not set my mobile data, and even after I came back I still could not.
0 replies · 0 boosts · 788 views · Sep ’25
How to look up keybinding translation across input sources
I have an application that binds a menu item to trigger on ⌘]. When the US input source is set, I press ⌘] to trigger that item. However, when I switch the input source to QWERTZ (German), the OS automatically changes the trigger to ⌘Ä; it seems to translate keystrokes for different input sources. The problem is that I also render the keybindings in a window in my application, and my application is not aware of this translation. Furthermore, I have other key shortcuts in my application that are not bound to menu items, and I want to make sure those get translated too. Does AppKit expose a way to look up what a keystroke will become when macOS translates it, i.e. look up ⌘Ä from ⌘] when the current layout is QWERTZ? I can't find anything in Apple's docs. I tried converting a character to a virtual key code based on the US layout and then mapping it back to a character based on the QWERTZ layout. That doesn't give the same result, because it ends up converting ] to + instead, which seems to be based on physical key location rather than how the menu keybindings are handled. Update: I notice similar behavior in VS Code's menu bar, e.g. in their "Terminal" menu; switching to German changes some bindings. This does not occur at all in iTerm's menu bar, I suspect because their menu items are specified in a different way (xib files with hard-coded key equivalents).
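For reference, a minimal Swift sketch of the character-from-key-code mapping attempted above, assuming the Carbon Text Input Source and UCKeyTranslate APIs; as described in the post, this resolves by physical key position, so it is not the same translation AppKit applies to menu key equivalents:

import Carbon

// Sketch (assumed approach, not a confirmed AppKit facility): map a virtual
// key code to the character it produces under the current keyboard layout.
func character(forKeyCode keyCode: UInt16) -> String? {
    guard let source = TISCopyCurrentKeyboardLayoutInputSource()?.takeRetainedValue(),
          let layoutPointer = TISGetInputSourceProperty(source, kTISPropertyUnicodeKeyLayoutData) else {
        return nil
    }
    let layoutData = Unmanaged<CFData>.fromOpaque(layoutPointer).takeUnretainedValue() as Data
    return layoutData.withUnsafeBytes { (buffer: UnsafeRawBufferPointer) -> String? in
        guard let layout = buffer.baseAddress?.assumingMemoryBound(to: UCKeyboardLayout.self) else {
            return nil
        }
        var deadKeyState: UInt32 = 0
        var chars = [UniChar](repeating: 0, count: 4)
        var length = 0
        let status = UCKeyTranslate(layout,
                                    keyCode,
                                    UInt16(kUCKeyActionDisplay),
                                    0,                       // no modifiers
                                    UInt32(LMGetKbdType()),
                                    OptionBits(kUCKeyTranslateNoDeadKeysMask),
                                    &deadKeyState,
                                    chars.count,
                                    &length,
                                    &chars)
        return status == noErr ? String(utf16CodeUnits: chars, count: length) : nil
    }
}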
0 replies · 0 boosts · 441 views · Jan ’25
VoiceOver is not treating the navigation bar left button as the first focused element
VoiceOver is not treating the navigation bar's left button as the first focused element. If we navigate from A to B, focus goes to the first element inside the B view, not to the back button or B's navigation title. If we post an accessibility notification in onAppear of B, focus does not shift; VoiceOver announces the back button first and then reads B's content item, but it does not actually focus the back button in SwiftUI. How should I do this if I want focus to land on the navigation bar's back button or the navigation title? My understanding is that the system prioritizes the first focusable element in the view hierarchy, but the navigation bar (including the back button and title) is managed separately by the system: it is not part of the main view hierarchy, so it does not automatically receive focus unless explicitly set. If my understanding is right, this seems a little strange. Why was it designed this way? Can you share the thinking behind it? Thanks
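For reference, a minimal sketch of the onAppear notification post described above, using the UIAccessibility screen-changed notification (the view name is illustrative); as the post notes, the system still lands on the first in-hierarchy element rather than the system-managed back button:

import SwiftUI
import UIKit

struct ViewB: View {   // illustrative name for the pushed screen
    var body: some View {
        List {
            Text("First content item")
        }
        .navigationTitle("B")
        .onAppear {
            // Ask VoiceOver to re-evaluate focus for the newly presented screen.
            // Passing nil lets the system choose which element to focus; passing a
            // specific element would request focus on it instead.
            UIAccessibility.post(notification: .screenChanged, argument: nil)
        }
    }
}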
0 replies · 0 boosts · 380 views · Sep ’25
Clarification on Entitlements, Privacy Manifest, and Info.plist for System-Wide Mouse Click Monitoring and Typing Simulation in macOS App
I am currently developing a macOS application that listens for system-wide mouse clicks and simulates typing with user-provided text. The app requires Accessibility permission to function properly, and I want to ensure compliance with Apple's latest privacy and security guidelines. The app listens to global mouse clicks and simulates keyboard input with user-provided text. I would like detailed guidance on the following (a sketch of the two behaviors follows below):
What specific entitlements are required to allow system-wide mouse click monitoring and simulated user input?
Should App Sandbox be enabled or disabled?
What keys are required in the Info.plist to explain global mouse click monitoring and keyboard input simulation?
What should the Privacy Manifest configuration be?
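This does not answer the entitlement or Privacy Manifest questions, but as a minimal sketch of the two behaviors described above (prompting for Accessibility trust, monitoring clicks in other apps, and posting synthetic key events), assuming App Sandbox is not blocking them; all identifiers are standard AppKit/ApplicationServices APIs, and the hard-coded key code is illustrative only:

import Cocoa
import ApplicationServices

// Prompt the user for Accessibility permission if the app is not yet trusted.
let promptKey = kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String
let trusted = AXIsProcessTrustedWithOptions([promptKey: true] as CFDictionary)

var clickMonitor: Any?
if trusted {
    // Observe left mouse clicks delivered to other applications.
    clickMonitor = NSEvent.addGlobalMonitorForEvents(matching: .leftMouseDown) { _ in
        // Simulate typing by posting a key-down/key-up pair for virtual key 0
        // ("A" on ANSI layouts); real code would loop over the user-provided text.
        let source = CGEventSource(stateID: .combinedSessionState)
        CGEvent(keyboardEventSource: source, virtualKey: 0, keyDown: true)?.post(tap: .cghidEventTap)
        CGEvent(keyboardEventSource: source, virtualKey: 0, keyDown: false)?.post(tap: .cghidEventTap)
    }
}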
0 replies · 0 boosts · 476 views · Jan ’25
Why does the macOS window sharing indicator appear for some windows but not others?
On recent versions of macOS, when a window is being shared (via the system screen-capture APIs), the OS sometimes shows a small "shared window" badge in the title bar. I’ve noticed that this indicator is not consistent: for some windows, the badge reliably appears when they are being shared; for other windows, the badge never appears, even though the window is actively shared. In particular, windows that use a standard system title bar seem to show the indicator more often, while windows with custom-drawn or non-standard chrome do not. My questions are:
What are the exact conditions under which macOS decides to draw the "shared window" indicator in a window's title bar?
Is this strictly tied to certain NSWindow styles or masks (e.g. titled vs. borderless)?
Is there any API or flag I can use to detect programmatically whether a given window will display this system indicator when shared?
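No documented predictor of the badge is cited above; purely to illustrate the style-mask suspicion in the post (an assumption, not confirmed behavior), the relevant window property can be inspected like this:

import AppKit

// Illustrative only: check whether a window uses the system-provided title bar,
// which the post suspects correlates with the shared-window badge.
func hasStandardTitleBar(_ window: NSWindow) -> Bool {
    window.styleMask.contains(.titled)
}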
0 replies · 0 boosts · 1.1k views · Sep ’25
Custom prediction panel not working in Google Docs
I’m working on a macOS Accessibility setup for a French-speaking user and I’ve hit a wall. (I'm not a developer; I'm trying to help my kid with dyslexia.) I successfully built a custom word prediction panel using the Panel Editor (Keyboard) in macOS Accessibility > Keyboard > Accessibility Keyboard. Here’s what I have so far:
• The prediction panel works system-wide: I can use it to type in Finder, Safari, Notes, TextEdit, and even browser search bars.
• The panel appears above all applications and suggestions show up correctly.
• However, it does not work inside Google Docs (tested in Chrome, Safari, and Firefox). Selecting a word from the panel does nothing in the Docs editor.
I suspect this is because:
• Google Docs does not use a standard macOS text input field.
• Docs is a web app that relies on custom JavaScript editors, contentEditable elements, and canvas rendering, so macOS Accessibility APIs (AXTextField, AXInsertText, etc.) don't register or inject text events.
• Accessibility tools like the Accessibility Keyboard rely on native macOS text input methods, which don't hook into Google Docs' custom editor.
Important: I'm not a programmer. I'd like to know if there is an easy fix or option in macOS, Google Chrome, or Google Docs that would make my custom prediction panel work, before going into custom development.
Technical setup:
• MacBook Air (M2, 2022)
• RAM: 8 GB
• macOS: Sequoia 15.3.1
• Language: French (system and keyboard)
• Accessibility Keyboard: enabled via Settings > Accessibility > Keyboard
• Custom panel: built using Panel Editor (Keyboard), named “Philemon Prédiction”
• Browsers tested: Chrome, Safari, Firefox (same issue)
• Behavior: panel is visible, suggestions appear, but inserting text does nothing in Google Docs
Has anyone worked around this limitation? Is there a simple setting, workaround, or accessibility option to bridge macOS Accessibility input with Google Docs' editor? Thanks a lot!
0 replies · 1 boost · 934 views · Aug ’25
SwiftUI Full keyboard access doesn't navigate through every button on screen
I have a screen in my app that can be represented by the following layout. I would like this screen to be navigable with Full Keyboard Access, but there is unexpected behavior. Path: press Tab on the keyboard, and the whole scroll view is targeted with button1 selected inside it. Arrow down: selection changes to button3. Arrow up: selection changes back to button1. So button2 is always skipped, and there is no way to navigate to it with the left/right arrows. Using Tab+F and searching for "button2", button2 is correctly selected, so it is selectable but for some reason not reachable by stepping through elements. Putting empty text in the Text views causes the buttons to be vertically aligned, and then everything works correctly, but that is not an option.

public struct BugReportView: View {
    public var body: some View {
        ScrollView {
            VStack(spacing: .zero) {
                Button("button1", action: { })
                HStack {
                    Text("some text")
                    Text("some text2")
                    Button("button2", action: { })
                }
                Button("button3", action: { })
            }
        }
    }
}
0 replies · 3 boosts · 198 views · May ’25
VoiceOver is not respecting lang in HTML option
I have an HTML select that has Spanish text in its options. When VoiceOver reads the selected option (unopened), it switches to Spanish as expected. However, when you open the select box and browse through the options, it uses the English voice to read the Spanish text. I have tried adding lang to both the select tag and the option tags, but neither helps. https://codepen.io/grahamfowles/pen/VYYRxMK
0 replies · 0 boosts · 137 views · May ’25
ARKit Eye Tracking Calibration Issues - Word-Level Reading Tracking Feasibility
Hi Apple Developer Community, I'm developing an eye-tracking application using ARKit's ARFaceTrackingConfiguration and ARFaceAnchor.blendShapes for gaze detection, built in Xcode. I'm experiencing several calibration and accuracy issues and would appreciate insights from the community.
Current implementation:
• Using ARFaceAnchor.blendShapes (.eyeLookUpLeft, .eyeLookDownLeft, .eyeLookInLeft, .eyeLookOutLeft, etc.)
• Implementing custom sensitivity curves and smoothing algorithms
• Applying baseline correction and coordinate mapping
• Using quadratic regression for calibration point mapping
Issues I'm facing:
1. Calibration mismatch: the red dot position doesn't align with where I'm actually looking; there is a significant offset between the intended gaze point and the actual cursor position; calibration seems to drift or become inaccurate over time.
2. Extreme eye movement requirements: I need to make exaggerated eye movements to reach screen edges/corners; natural eye movements don't translate to proportional cursor movement; certain screen regions are difficult to reach even with calibration.
3. Sensitivity and stability issues: the cursor jitters or jumps around when looking at the center; there is too much sensitivity to micro-movements; behavior is inconsistent between calibration and normal operation.
4. I also noticed that tracking on the calibration screen, as well as on the reading screen, works better (as expected) when there is head movement, but I do not want much head movement. I want tracking with normal eye movement while reading an ebook.
Primary question: word-level eye tracking feasibility. Is word-level eye tracking (tracking gaze as users read through individual words in an ebook) technically feasible with current iPhone/iPad hardware? I understand that Apple's built-in eye tracking is primarily an accessibility feature for UI navigation. However, I'm wondering if the TrueDepth camera and ARKit's eye tracking capabilities are sufficient for:
• Tracking natural reading patterns (left-to-right, line-by-line progression)
• Detecting which specific words a user is looking at
• Maintaining accuracy for sustained reading sessions (15-30 minutes)
• Working reliably across different users and lighting conditions
Questions for the community:
• Hardware limitations: are iPhone/iPad TrueDepth cameras capable of the precision needed for word-level tracking, or is this beyond current hardware capabilities?
• Calibration best practices: what calibration strategies have worked best for accurate gaze mapping? How many calibration points are typically needed?
• Reading-specific challenges: are there particular challenges when tracking reading behavior vs. general gaze tracking?
• Alternative approaches: are there better approaches than ARKit blend shapes for this use case?
Current setup: iPhone 14 Pro, iOS 18.3, latest available ARKit version.
Any insights, experiences, or technical guidance would be greatly appreciated. I'm particularly interested in hearing from developers who have worked on similar eye-tracking applications or have experience with the limitations and capabilities of ARKit's eye tracking features. Thank you for your time and expertise!
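As context for the setup above, a minimal sketch of reading the eye-related blend shapes from an ARFaceAnchor and combining them into a rough gaze estimate; the GazeSample type is illustrative, and the smoothing, calibration mapping, and screen projection discussed above are omitted:

import ARKit

// Rough per-frame gaze estimate in [-1, 1] from the eye blend shapes.
// "In" means toward the nose and "out" away from it, so the left and right
// eyes combine with opposite signs on the horizontal axis.
struct GazeSample {          // illustrative type, not part of ARKit
    var horizontal: Float    // negative = user's left, positive = user's right (approximate)
    var vertical: Float      // negative = down, positive = up (approximate)
}

func gazeSample(from anchor: ARFaceAnchor) -> GazeSample {
    func shape(_ key: ARFaceAnchor.BlendShapeLocation) -> Float {
        anchor.blendShapes[key]?.floatValue ?? 0
    }
    let leftHorizontal  = shape(.eyeLookInLeft)   - shape(.eyeLookOutLeft)
    let rightHorizontal = shape(.eyeLookOutRight) - shape(.eyeLookInRight)
    let leftVertical    = shape(.eyeLookUpLeft)   - shape(.eyeLookDownLeft)
    let rightVertical   = shape(.eyeLookUpRight)  - shape(.eyeLookDownRight)
    return GazeSample(horizontal: (leftHorizontal + rightHorizontal) / 2,
                      vertical:   (leftVertical + rightVertical) / 2)
}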
0 replies · 0 boosts · 704 views · Oct ’25
Unstable capture brightness (dynamic fluctuation)
The picture brightness fluctuates irregularly (alternating brighter and darker) and there is no manual control available, so product colors are rendered inaccurately and the host's face is exposed abnormally (overexposed or underexposed), seriously degrading the livestream presentation. Requests:
· Optimize the auto-exposure algorithm for livestream mode to improve brightness stability in complex lighting conditions.
· Add a dedicated brightness-lock feature for a "livestream mode" that supports manually setting and locking brightness parameters, so image quality stays controllable in livestream scenarios.
0 replies · 0 boosts · 134 views · 1w
Focus issues with ScrollView in iOS 18
When using an app via an external keyboard, @FocusState and .focused used to work just fine up through iOS 17. Vertical-axis text fields were also accessible without any issues. But after the iOS 18 update, adding the .focused modifier removes elements from the external keyboard's focus order. One such example: when a button using the .focused modifier and @FocusState is inside a ScrollView, and that view is opened via a NavigationLink, the button is not reachable via a Bluetooth (external) keyboard. TextEditor and vertical-axis TextFields also seem to be affected in the external keyboard focus order when placed inside a ScrollView. Is this a known iOS 18 issue with ScrollView, and is there any tip to get it fixed? Sample code that reproduces the issue:

struct ContentView: View {
    @State private var showBottomSheet: Bool = false
    @State private var goToNextView: Bool = false
    @FocusState private var focused: Bool
    @AccessibilityFocusState private var voFocused: Bool

    var body: some View {
        NavigationView {
            VStack {
                Text("Hello, world!")
                // This button works fine with a Bluetooth keyboard in all versions
                Button("Trigger a bottomsheet") {
                    showBottomSheet = true
                }
                .focused($focused)
                .accessibilityFocused($voFocused)
                Button("Goto another view") {
                    goToNextView = true
                }
                NavigationLink(
                    destination: View2(),
                    isActive: $goToNextView
                ) {
                    EmptyView()
                }
                .accessibility(hidden: true)
            }
            .sheet(isPresented: $showBottomSheet, onDismiss: {
                focused = true
                voFocused = true
            }, content: {
                VStack() {
                    Text("Hello World ! I'm in a bottomsheet")
                    Button("Close me") {
                        showBottomSheet = false
                    }
                }
            })
            .padding()
        }
    }
}

#Preview {
    ContentView()
}

struct View2: View {
    @FocusState private var focused: Bool
    @AccessibilityFocusState private var voFocused: Bool
    @State private var showBottomSheet: Bool = false

    var body: some View {
        ScrollView {
            VStack {
                Text("check")
                // In iOS 18, this button doesn't get focused with a Bluetooth / external keyboard.
                // This issue occurs when these 3 combine in iOS 18 - a button using FocusState
                // inside a view that has a ScrollView & it is opened via NavigationLink
                Button("Trigger a bottomsheet") {
                    showBottomSheet = true
                }
                .focused($focused)
                .accessibilityFocused($voFocused)
                Button("Test button") { }
            }
            .sheet(isPresented: $showBottomSheet, onDismiss: {
                focused = true
                voFocused = true
            }, content: {
                VStack() {
                    Text("Hello World ! I'm in a bottomsheet")
                    Button("Close me") {
                        showBottomSheet = false
                    }
                }
            })
            .padding()
        }
    }
}
0 replies · 1 boost · 592 views · Feb ’25