Explore best practices for creating inclusive apps that cater to users with diverse abilities

Posts under General subtopic

Post

Replies

Boosts

Views

Activity

Is there any way to force users to update?
Hi, I'm planning to build a macOS app and distribute it through the Mac App Store. My question is whether I should force an update when one is needed; I don't want users to keep running a previous version of the app. My plan is this: when the app needs an update, send the user to a dedicated page that explains why the update is needed, with a button that downloads the new version. The download would happen automatically in the background, without visiting the App Store. I've searched several forums and asked GPT, but found no positive answer, so I'm posting here to find out whether there is really no way to do this. Thank you!
1
0
472
Mar ’25
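A minimal version-check sketch for the question above, assuming the app ships through the Mac App Store and using a hypothetical bundle identifier. Mac App Store apps cannot download and install their own updates in the background, so the realistic flow is to detect that the store has a newer version, show the "update required" page described above, and send the user to the app's App Store product page.

import Foundation

// Minimal sketch, assuming the app is on the Mac App Store; "com.example.MyApp"
// is a placeholder bundle identifier. It compares the running version with the
// version reported by the iTunes Lookup endpoint; installing the update itself
// still has to happen through the App Store.
func updateIsRequired() async throws -> Bool {
    let bundleID = "com.example.MyApp" // hypothetical
    let url = URL(string: "https://itunes.apple.com/lookup?bundleId=\(bundleID)")!
    let (data, _) = try await URLSession.shared.data(from: url)

    struct Lookup: Decodable {
        struct Result: Decodable { let version: String }
        let results: [Result]
    }
    let storeVersion = try JSONDecoder().decode(Lookup.self, from: data).results.first?.version
    let localVersion = Bundle.main.object(forInfoDictionaryKey: "CFBundleShortVersionString") as? String

    guard let storeVersion, let localVersion else { return false }
    // Any case where the store version is numerically newer counts as "update required".
    return localVersion.compare(storeVersion, options: .numeric) == .orderedAscending
}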
Making PhotoLibrary UIImagePickerController a11y compliant
I am invoking the UIImagePickerController of type UIImagePickerControllerSourceTypePhotoLibrary from my view controller. I want to shift the keyboard focus to the Cancel button, which is the first interactive element on the photo library picker. When a user has Full Keyboard Access turned on, they should be able to press Tab and interact with the picker modal. How do I achieve this?
1
0
153
May ’25
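For the focus question above, a hedged sketch: there is no public API that directly moves Full Keyboard Access focus onto a system-owned control like the picker's Cancel button, but posting an accessibility notification with a target element after presentation is the usual way to suggest where focus should land. Whether Full Keyboard Access honors it is not guaranteed.

import UIKit

// Minimal sketch. Posting .screenChanged with a target element asks assistive
// technologies to move focus there; the Cancel button is a system-owned subview,
// so we can only point at the picker's view and let the system choose the first
// focusable element inside it.
func presentPhotoPicker(from presenter: UIViewController) {
    let picker = UIImagePickerController()
    picker.sourceType = .photoLibrary
    presenter.present(picker, animated: true) {
        // Nudge accessibility/keyboard focus toward the newly presented modal.
        UIAccessibility.post(notification: .screenChanged, argument: picker.view)
    }
}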
VoiceOver Accessibility Tree out of sync with WKWebView contents
Hey, we've run into an issue where WKWebView contents are not always available to VoiceOver users. It seems to occur when WKWebView contents are loaded asynchronously. I have a sample project where this can be reproduced and a video showing the issue; see FB21257352. The only solution we currently see is forcing an update continuously using UIAccessibility.post(notification: .layoutChanged, argument: nil), but this is of course a last resort, as it may have other unintended side effects.
1
0
453
3w
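A minimal sketch for the WKWebView post above, assuming the page can signal when its asynchronous content has finished rendering (the "contentReady" message name is an assumption, not an Apple API). Posting .layoutChanged once at that point may avoid the continuous posting described as a last resort.

import UIKit
import WebKit

// Minimal sketch, assuming the page posts a "contentReady" message once its
// asynchronous content is in the DOM. The handler then posts a single
// .layoutChanged so VoiceOver rebuilds its view of the web view, instead of
// posting continuously.
final class WebViewController: UIViewController, WKScriptMessageHandler {
    private var webView: WKWebView!

    override func viewDidLoad() {
        super.viewDidLoad()
        let config = WKWebViewConfiguration()
        config.userContentController.add(self, name: "contentReady") // name is an assumption
        webView = WKWebView(frame: view.bounds, configuration: config)
        view.addSubview(webView)
    }

    func userContentController(_ userContentController: WKUserContentController,
                               didReceive message: WKScriptMessage) {
        guard message.name == "contentReady" else { return }
        // Ask VoiceOver to re-read the layout once, now that async content exists.
        UIAccessibility.post(notification: .layoutChanged, argument: webView)
    }
}

On the web side, the page would call window.webkit.messageHandlers.contentReady.postMessage("done") once its asynchronous rendering completes.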
Why is VoiceOver’s "Content Chooser" rotor empty in my macOS app?
I'm developing a macOS app using NSView and trying to make my content navigable via VoiceOver. I'm expecting the built-in rotor category "Content Chooser" (accessed via VO + U) to list my accessible elements, just like it shows message items in the Mail app. However, in my app this rotor appears empty, even though:
- My views return proper accessibilityChildren() or accessibilityContents() with valid NSAccessibilityElements
- Each child has a correct AXRole, AXLabel, etc.
- The window is key and visible
- VoiceOver navigation works for the elements
I've also tried:
- Using both accessibilityChildren() and accessibilityContents() in container views
- Setting roles like .group, .staticText, .button, etc.
- Avoiding hidden elements
- Ensuring all elements are visible and labeled
Still, the "Content Chooser" rotor is empty. What exact conditions must be met for an element to appear in the "Content Chooser" rotor in a macOS app? Any Apple-specific guidance, hidden requirements, or sample code would be appreciated.
1
0
175
May ’25
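Apple does not document the exact criteria for the "Content Chooser" rotor, so the sketch below only shows the baseline wiring the post describes: a container NSView vending NSAccessibilityElement children with a role, label, and parent. It is a reference point under those assumptions, not a confirmed way to populate that rotor.

import AppKit

// Minimal sketch of a container view vending accessibility children. The item
// data and frames are illustrative; real code should cache the elements rather
// than recreate them on every call.
final class MessageListView: NSView {
    private let items = ["First message", "Second message"]

    override func accessibilityChildren() -> [Any]? {
        items.enumerated().map { index, title in
            // Frame is expressed in the parent's coordinate space.
            NSAccessibilityElement.element(
                withRole: .staticText,
                frame: NSRect(x: 0, y: CGFloat(index) * 24, width: bounds.width, height: 24),
                label: title,
                parent: self
            )
        }
    }

    override func accessibilityRole() -> NSAccessibility.Role? { .group }
    override func isAccessibilityElement() -> Bool { true }
}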
Handling Keyboard Hotkeys and Shortcuts across Multiple Languages
We have a requirement to manage the shortcuts and hotkeys in our application, and to make them intuitive with full multilingual support. Our current understanding is that most universal shortcuts and hotkeys on macOS/iOS are expressed using English/Latin characters, so when a purely foreign-language physical or virtual keyboard is the input device, we are unclear how the user would invoke such a hotkey. In environments where the keyboard has no Latin characters at all, managing shortcuts and hotkeys becomes a rather difficult task. Taking a very simple example, the shortcut for printing a page is Command/Control + 'P'. This can be an issue on non-English keyboards such as Arabic, where there is no letter P and no phonetic equivalent either, since the language itself does not have that sound. Also, when we want hotkeys to be user-customizable, how would the user express which key combination they want for a given action? Given these conditions, and in order to provide the most comprehensive and optimal experience for users in their own language, what does Apple recommend we do for hotkey/shortcut support in non-Latin languages?
1
0
417
2w
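One relevant, hedged direction for the post above: on macOS 12 and later, AppKit can automatically remap Latin key equivalents for keyboard layouts that cannot type them, which speaks to the Arabic Command-P example at the menu level. A minimal sketch:

import AppKit

// Minimal sketch, assuming an AppKit menu-based app. With automatic key
// equivalent localization enabled, AppKit may substitute a key that is typable
// on the active (e.g. non-Latin) keyboard layout for the declared "p".
func makePrintMenuItem() -> NSMenuItem {
    let item = NSMenuItem(title: NSLocalizedString("Print…", comment: "Print menu item"),
                          action: #selector(NSDocument.printDocument(_:)),
                          keyEquivalent: "p")
    item.keyEquivalentModifierMask = [.command]
    // Let AppKit localize/mirror the key equivalent for the current layout.
    item.allowsAutomaticKeyEquivalentLocalization = true
    item.allowsAutomaticKeyEquivalentMirroring = true
    return item
}

User-customizable shortcuts would still need the app to record the pressed combination itself (for example from NSEvent's charactersIgnoringModifiers and modifierFlags) and display it per layout.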
kAXSelectedTextChangedNotification not received after restart, until launching Accessibility Inspector
I'm facing a bizarre issue with Apple's Accessibility APIs. I am registering an AXObserver that listens for, among other things, the kAXSelectedTextChangedNotification. For many new users, the kAXSelectedTextChangedNotification is not triggered, even though they have enabled Accessibility permission for the app. Other notifications are getting through (kAXWindowMovedNotification, kAXWindowResizedNotification, kAXValueChangedNotification, etc.; full list here), just not the kAXSelectedTextChangedNotification!

We've found that we can reproduce the error by removing Accessibility permission for the app and rebooting our computers. After restarting and re-enabling Accessibility permission, the kAXSelectedTextChangedNotification was not received, even though other notifications were fine. Strangely, the issue can be resolved by launching Apple's Accessibility Inspector app on an affected computer. Once Accessibility Inspector is loaded, the kAXSelectedTextChangedNotifications start coming through as expected. This implies to me that either:
- We are missing some needed setup when starting the observers, and Accessibility Inspector gets it right, thus 'starting' the system properly; or
- Accessibility Inspector is using some Apple private APIs that we don't have access to.

Things I've tried:
- Subscribing to the kAXSelectedTextChangedNotification on different AXUIElements, including the system-wide element, the application element, and child elements of the application. None of these received the kAXSelectedTextChangedNotification until Accessibility Inspector was booted up. No surprises here, as Apple's documentation confirms that you should add the notification to the root application AXUIElement if you want to receive notifications for all of its children.
- I had a theory that the issue might be due to my code calling AXUIElementCreateApplication multiple times, possibly creating multiple "applications" in Apple's Accessibility implementation; if that were the case, the notifications might be sent to the wrong application AXUIElement. However, refactoring my code to call AXUIElementCreateApplication only once didn't resolve the issue.
- I thought the issue might be caused by subscribing to the kAXSelectedTextChangedNotification on the high-level application element (at odds with Apple's documentation). I tried traversing the child AXUIElements until I found one with the kAXSelectedTextAttribute and subscribing to that. This did not resolve the issue, and I don't think it's the correct path to continue exploring, given that the notifications are received correctly after Accessibility Inspector is launched.

There is one exception to the above: if I add the kAXSelectedTextChangedNotification listener to a specific text field AXUIElement, I do receive the notification on that text field. However, this is not practical; I need a solution that works for all text fields within an app. Accessibility Inspector appears to be doing something that causes the selected-text-changed notifications to be correctly passed up to the high-level application AXUIElement. Another thought is that I could traverse the entire accessibility hierarchy and add listeners to every subview that has the kAXSelectedTextAttribute, but I don't like that as a long-term solution: it would be slow and incomplete, since new elements get added and removed frequently. I just want the kAXSelectedTextChangedNotification to be received by the high-level application AXUIElement, as the documentation suggests it should be.

I also have evidence that this can work, since notifications start coming through after Accessibility Inspector is launched; it's just a matter of discovering how to replicate whatever Accessibility Inspector is doing. An interesting wrinkle: I implemented the 'traverse' strategy above, but was surprised by how few elements were in the hierarchy. Most apps only go down about 2-3 levels, which didn't seem right to me. Perhaps the accessibility tree isn't fully initialized? I tried adding a 5-second delay to allow more initialization time, but it didn't change anything. Does anyone have any ideas? Here's our file.
1
1
147
May ’25
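For reference against the setup described above, a baseline observer registration sketch. It does not explain the Accessibility Inspector behavior; it only shows the documented AXObserver wiring on the application element.

import ApplicationServices

// Baseline AXObserver setup (a sketch, not a fix for the Inspector-dependent
// behavior described above). `pid` is the target app's process identifier, and
// the calling app must already have Accessibility permission.
func observeSelectedTextChanges(pid: pid_t) -> AXObserver? {
    var observer: AXObserver?
    let callback: AXObserverCallback = { _, element, notification, _ in
        print("Received \(notification) from \(element)")
    }
    guard AXObserverCreate(pid, callback, &observer) == .success, let observer else { return nil }

    // Subscribe on the root application element so child text fields are covered.
    let appElement = AXUIElementCreateApplication(pid)
    AXObserverAddNotification(observer, appElement,
                              kAXSelectedTextChangedNotification as CFString, nil)
    CFRunLoopAddSource(CFRunLoopGetCurrent(),
                       AXObserverGetRunLoopSource(observer),
                       .defaultMode)
    return observer
}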
AR Quick Look suddenly grayed out on iPhone 15 Pro
AR Quick Look is suddenly grayed out on my iPhone 15 Pro. I bought the phone new recently and it was working great. Two days ago I updated to iOS 18.1.4; AR mode kept opening, but I started getting a "move iPhone over surface" message and the object wouldn't detect surfaces correctly. I then updated to iOS 18.5, and now when I open Quick Look models, AR is completely grayed out. Can someone help me fix or diagnose the issue? Thank you.
1
1
180
May ’25
Accessibility IDs showing up in Accessibility Inspector, but automated testing script is unable to find them
In the app I'm working on, I have a SwiftUI View embedded in a UIKit Storyboard. The SwiftUI View holds a menu with a list of payment tools, and the ForEach loop looks like this:

ForEach(self.paymentToolsVM.paymentToolsItems, id: \.self) { paymentTool in
    Button {
        navigationCallback(paymentTool.segueID)
    } label: {
        PaymentToolsRow(paymentToolName: paymentTool.title, imageName: paymentTool.imageName)
            .accessibilityElement()
            .accessibilityIdentifier("Billing_\(paymentTool.title.replacingOccurrences(of: " ", with: ""))")
    }

    if paymentTool != self.paymentToolsVM.paymentToolsItems.last {
        Divider()
    }
}

So you can see the accessibility ID is there, and it shows up properly when I open up Accessibility Inspector with the simulator, but the testing script isn't picking up on it, and it doesn't show up when the view is inspected in Appium. I have other SwiftUI views embedded in the UIKit view, and the script picks up the buttons on those, so I'm not sure what's different about this one. If it helps, the script is written in Java with the BDD framework. I can try to get the relevant part of the script if anyone thinks that would be helpful. Otherwise, is there anything else I can try?
1
0
160
May ’25
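A variation worth trying for the post above (an assumption, not a confirmed fix): .accessibilityElement() with its default children behavior can leave the row without a hittable, identifiable element for automation, so combining children and moving the identifier onto the Button itself sometimes helps. This slots into the same ForEach shown above and reuses its names:

// Hedged variation: combine the row's children into one accessibility element
// and attach the identifier to the Button, so XCUITest/Appium sees a single,
// identifiable element.
Button {
    navigationCallback(paymentTool.segueID)
} label: {
    PaymentToolsRow(paymentToolName: paymentTool.title, imageName: paymentTool.imageName)
}
.accessibilityElement(children: .combine)
.accessibilityIdentifier("Billing_\(paymentTool.title.replacingOccurrences(of: " ", with: ""))")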
Siri misreads local currency in notifications (Bug reported, still unresolved)
I’m experiencing an issue where Siri incorrectly announces currency values in notifications. Instead of reading the local currency correctly, it always reads amounts as US dollars.

Issue details:
- My iPhone is set to Region: Chile and Language: Spanish (Chile).
- In Chile, the currency symbol $ represents Chilean Pesos (CLP), not US dollars.
- A notification with the text:

let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por $5.000!"

is read aloud by Siri as "¡Has recibido un pago por 5.000 dólares!" (English: "You have received a payment of five thousand dollars!") instead of the correct "¡Has recibido un pago por 5.000 pesos!" (English: "You have received a payment of five thousand pesos!").

Another developer already reported the same issue back in 2023, and it remains unresolved: https://developer.apple.com/forums/thread/723177

This incorrect behavior is not limited to iOS notifications; it also occurs in other Apple services:
- watchOS, iPadOS, and macOS (Siri misreads currency values in various system interactions).
- Siri's currency conversion feature misinterprets $ as USD even when the device is set to a region where $ represents a different currency.
- Announce Notifications on AirPods also exhibits this issue, making it confusing when Siri announces transaction amounts incorrectly.
- Apple Intelligence interactions are also affected; for example, asking Siri to "read my latest emails" when one of them contains a monetary value results in Siri misreading the currency.

I have submitted a bug report via Feedback Assistant; the Feedback ID is FB16561348. This issue significantly impacts accessibility and localization for users in regions where the currency symbol $ is not associated with US dollars. Has anyone found a workaround, or is there any update from Apple on this?
1
1
677
Feb ’25
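Until the underlying Siri behavior changes, one hedged workaround is to avoid the bare "$" in the notification body and spell the currency out in words; the locale and wording below are assumptions for the Chilean case.

import Foundation
import UserNotifications

// Hedged workaround sketch: format the digits with the es_CL locale and append
// the currency word instead of relying on the "$" symbol that Siri misreads.
func makePaymentNotification(amount: Decimal) -> UNMutableNotificationContent {
    let formatter = NumberFormatter()
    formatter.locale = Locale(identifier: "es_CL")
    formatter.numberStyle = .decimal // produces "5.000" with the Chilean thousands separator
    let digits = formatter.string(from: amount as NSDecimalNumber) ?? "\(amount)"

    let content = UNMutableNotificationContent()
    content.body = "¡Has recibido un pago por \(digits) pesos!"
    return content
}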
Problem with the Notes application
I have more than 1,000 notes organized in parent/child folders up to 5 levels deep. From the 5th level of folders I can no longer share a note: the note itself is not shared; the note from the parent folder is shared instead. Thank you very much. Best regards, Christophe
1
1
240
May ’25
USB-C Cable for Galaxy Watch 7 not recognized by MacBook Pro M4
I get it, "Why don't you just get an Apple Watch?" Regardless, my MacBook Pro M4 doesn't recognize my charging cable in any of its USB-C ports. The cable works with any other power supply I plug it into, but the MacBook doesn't even register that a cable is connected.
- Running Sequoia Beta 15.4, thinking it may have been software related. No change.
- Settings > Privacy & Security > Accessories: changed between all available options. No change.
- Option + Apple logo > System Information > Thunderbolt/USB4: none of the ports show that the cable is connected.
- Any other USB-C cable is recognized in any of the ports on the MacBook, just not the cable for the Galaxy Watch.
Again, the cable charges from ANY other USB-C port on ANY other device I connect it to. Am I missing something? Or is this an intended jab at Samsung from Apple? lol
1
0
732
Mar ’25
VoiceOver mode issues
I added a view controller in the storyboard, added a table view to this view, and added a cell under the table. When I run the app and navigate to that page with VoiceOver on, I find that when I swipe up or down with three fingers, a sentence is announced in English. Without changing the accessibility of the cell itself, how should I implement things so that the announcement for the three-finger swipe up/down is spoken in Chinese?
1
0
75
Jun ’25
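One hedged direction for the post above: the three-finger scroll can be handled explicitly via accessibilityScroll(_:), and posting .pageScrolled with your own string lets you control the wording and language of the announcement. The paging math and the Chinese wording below are assumptions, not the only possible behavior.

import UIKit

// Sketch of supplying a custom (here, Chinese) announcement for VoiceOver's
// three-finger scroll, using the documented accessibilityScroll(_:) plus
// .pageScrolled pattern on a table view subclass.
final class AnnouncingTableView: UITableView {
    override func accessibilityScroll(_ direction: UIAccessibilityScrollDirection) -> Bool {
        let pageHeight = max(bounds.height, 1)
        var offset = contentOffset
        switch direction {
        case .up:   offset.y += pageHeight
        case .down: offset.y -= pageHeight
        default:    return false
        }
        offset.y = min(max(0, offset.y), max(0, contentSize.height - pageHeight))
        setContentOffset(offset, animated: true)

        let page = Int(offset.y / pageHeight) + 1
        let total = max(Int(ceil(contentSize.height / pageHeight)), 1)
        // Announced in place of the default English scroll status.
        UIAccessibility.post(notification: .pageScrolled, argument: "第 \(page) 页，共 \(total) 页")
        return true
    }
}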
Accessibility for detents behaves differently in a fullscreen cover
The only way I found to make the accessibility focus work correctly in the detent in a fullscreen cover is to apply the focus manually. The issue is that in the ContentView the grabber works, while in the fullscreen cover it does not. Is there something I am missing, or is this a bug? I also don't understand why I need to apply focus in the fullscreen cover while in the ContentView I do not.

struct ContentView: View {
    @State private var buttonClicked = false
    @State private var bottomSheetShowing = false

    var body: some View {
        NavigationView {
            VStack {
                Button(action: {
                    buttonClicked = true
                }, label: {
                    Text("First Page Button")
                        .padding()
                        .background(Color.blue)
                        .foregroundColor(.white)
                        .cornerRadius(8)
                })
                .accessibilityLabel("First Page Button")

                FullscreenView2()
            }
            .navigationTitle("Welcome")
            .fullScreenCover(isPresented: $buttonClicked) {
                FullscreenView(buttonClicked: $buttonClicked, bottomSheetShowing: $bottomSheetShowing)
            }
        }
    }
}

struct FullscreenView: View {
    @Binding var buttonClicked: Bool
    @Binding var bottomSheetShowing: Bool

    var body: some View {
        NavigationView {
            VStack {
                Button(action: {
                    bottomSheetShowing = true
                }, label: {
                    Text("Show Bottom Sheet")
                        .padding()
                        .background(Color.green)
                        .foregroundColor(.white)
                        .cornerRadius(8)
                })
            }
            .accessibilityHidden(bottomSheetShowing)
            .navigationTitle("Fullscreen View")
            .toolbar {
                ToolbarItem(placement: .navigationBarLeading) {
                    Button(action: {
                        buttonClicked = false
                    }, label: {
                        Text("Close")
                    })
                    .accessibilityLabel("Close Fullscreen View Button")
                }
            }
            .accessibilityHidden(bottomSheetShowing)
            .onChange(of: bottomSheetShowing, perform: { _ in })
            .sheet(isPresented: $bottomSheetShowing) {
                if #available(iOS 16.0, *) {
                    BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                        .presentationDetents([.medium, .large])
                } else {
                    BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                }
            }
        }
    }
}

struct FullscreenView2: View {
    @State var bottomSheetShowing = false

    var body: some View {
        VStack {
            Button(action: {
                bottomSheetShowing = true
            }, label: {
                Text("Show Bottom Sheet")
                    .padding()
                    .background(Color.green)
                    .foregroundColor(.white)
                    .cornerRadius(8)
            })
        }
        .accessibilityHidden(bottomSheetShowing)
        .navigationTitle("Fullscreen View")
        //.accessibilityHidden(bottomSheetShowing)
        .onChange(of: bottomSheetShowing, perform: { _ in })
        .sheet(isPresented: $bottomSheetShowing) {
            if #available(iOS 16.0, *) {
                BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
                    .presentationDetents([.medium, .large])
            } else {
                BottomSheetView(bottomSheetShowing: $bottomSheetShowing)
            }
        }
    }
}

struct BottomSheetView: View {
    @Binding var bottomSheetShowing: Bool
    // @AccessibilityFocusState var isFocused: Bool

    var body: some View {
        VStack(spacing: 20) {
            Text("Bottom Sheet")
                .font(.headline)
                .accessibilityAddTraits(.isHeader)

            Button(action: {
                bottomSheetShowing = false
            }, label: {
                Text("Dismiss")
                    .padding()
                    .background(Color.red)
                    .foregroundColor(.white)
                    .cornerRadius(8)
            })
            .accessibilityLabel("Dismiss Bottom Sheet Button")
        }
        .padding()
        .frame(maxWidth: .infinity, maxHeight: .infinity)
        .background(
            Color(UIColor.systemBackground)
                .edgesIgnoringSafeArea(.all)
        )
        .accessibilityAddTraits(.isModal) // Indicates that this view is a modal
        // .onAppear {
        //     // Set initial accessibility focus when the sheet appears
        //     DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
        //         isFocused = true
        //     }
        // }
        // .accessibilityFocused($isFocused)
    }
}
1
1
614
Feb ’25
VoiceOver accessibility issue in UIKit for line granularity
Context: We are using UIKit to provide accessibility in our app for our iOS users. Our app mainly contains documents/books that the user can read.

Issue: VoiceOver is skipping lines given to it when they have leading spaces. We have observed this issue in different languages. It only happens for line granularity; other granularities seem to work as expected.

Implementation: We are using the APIs below to provide line content to VoiceOver:
- UIAccessibilityReadingContent
- accessibilityPageContent
- accessibilityFrameForLineNumber
- accessibilityContentForLineNumber

We create UIAccessibilityElement objects to pass to VoiceOver, and each UIAccessibilityElement implements UIAccessibilityReadingContent to provide readable content. We also use the APIs below to cross element boundaries for all granular navigations:
- accessibilityNextTextNavigationElement
- accessibilityPreviousTextNavigationElement

We want to know whether skipping a line that has leading spaces is expected behavior or a bug in UIKit.
1
0
294
Nov ’25
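Whether the skipping is a UIKit bug is unconfirmed. As a hedged workaround, stripping leading whitespace from the string returned for each line (while keeping the full line frame) can be tried. A sketch of a reading-content element doing that, with placeholder line data:

import UIKit

// Sketch of a reading-content element that drops leading spaces from each line
// before handing it to VoiceOver, as a possible workaround for skipped lines.
// `lines` and `lineFrames` stand in for the real layout data.
final class PageAccessibilityElement: UIAccessibilityElement, UIAccessibilityReadingContent {
    var lines: [String] = []
    var lineFrames: [CGRect] = []

    func accessibilityLineNumber(for point: CGPoint) -> Int {
        lineFrames.firstIndex(where: { $0.contains(point) }) ?? NSNotFound
    }

    func accessibilityContent(forLineNumber lineNumber: Int) -> String? {
        guard lines.indices.contains(lineNumber) else { return nil }
        // Leading spaces appear to trip up line granularity; drop them here.
        return String(lines[lineNumber].drop(while: { $0 == " " }))
    }

    func accessibilityFrame(forLineNumber lineNumber: Int) -> CGRect {
        lineFrames.indices.contains(lineNumber) ? lineFrames[lineNumber] : .zero
    }

    func accessibilityPageContent() -> String? {
        lines.joined(separator: "\n")
    }
}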
Double Payment Issue — Apple Developer Account Still Not Activated
Hi everyone, I completed my Apple Developer registration on November 8, 2025, and paid the fee. Afterwards, I was charged another €99, so I basically paid twice. But nothing has changed on my account so far — it doesn’t even show as “pending” or “waiting” anymore. I also can’t reach Apple Support, since I’m unable to get a call or send an email. Has anyone else experienced something like this or knows what I can do?
1
1
262
Nov ’25
AccessibilityHint for UIAlertAction
Hi, I am setting an accessibilityLabel and accessibilityHint property of a UIAlertAction. However, VoiceOver is only reading the label out. Usually, the label is read out, followed by a short pause and then the hint. Is this a known issue, where hints do not work for this element? I can append the hint to the label, but interested to know if there's something I'm doing wrong. Regards.
1
0
334
Mar ’25
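It is unclear whether hints are intentionally ignored for UIAlertAction. Below is a sketch of the append workaround mentioned in the post, with placeholder label and hint text.

import UIKit

// Hedged workaround sketch: fold the hint into the accessibility label so
// VoiceOver reads both, assuming accessibilityHint really is not spoken for
// UIAlertAction. Title and hint text are placeholders.
func makeDeleteAction(handler: @escaping (UIAlertAction) -> Void) -> UIAlertAction {
    let action = UIAlertAction(title: "Delete", style: .destructive, handler: handler)
    let hint = "Removes the item permanently."
    action.accessibilityLabel = "Delete. \(hint)"
    return action
}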
Not able to receive silent pushes in background
I’ve developed the Pro Talkie app, a walkie-talkie solution designed to keep you connected with family and friends.
App Store: https://apps.apple.com/in/app/pro-talkie/id6742051063
Play Store: https://play.google.com/store/apps/details?id=com.protalkie.app
While the app works flawlessly on Android and in the foreground on iOS, I'm facing issues with establishing connections when the app is in the background or terminated on iOS. Specifically, I've attempted the following:
- Silent pushes and alert payloads: these are intended to wake the app in the background, but they often fail; notifications may not be received or can be delayed by 20–30 minutes, leading to a poor user experience.
- VoIP pushes: these reliably wake the app, but they trigger the incoming-call UI, which isn't suitable for a walkie-talkie app that should connect directly without displaying a call screen.
I've enabled all the necessary background modes (audio, remote notifications, VoIP, background fetch, processing), but the challenge remains. How can I ensure a consistent background connection on iOS without triggering the call UI?
1
0
445
Mar ’25
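For reference alongside the post above, a sketch of the background (silent) push path. Delivery of content-available pushes is best-effort and throttled by the system, which is consistent with the delays described; the payload shape is shown in comments and the "room" key is hypothetical.

import UIKit

// Reference sketch of the background push handler.
//
// Expected payload (sent with apns-push-type: background and apns-priority: 5):
//   { "aps": { "content-available": 1 }, "room": "family-channel" }
// "room" is a hypothetical custom key.
class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                     fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
        // Do the minimum work needed (e.g. refresh connection state), then report
        // the result so the system can schedule future background wakes.
        let room = userInfo["room"] as? String
        print("Silent push for room: \(room ?? "unknown")")
        completionHandler(.newData)
    }
}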