Explore best practices for creating inclusive apps that cater to users with diverse abilities


Posts under General subtopic

Post

Replies

Boosts

Views

Activity

iPod comeback with 5G and more
Hello everybody, I have an idea for a future product: a new iPod with an integrated mobile router and additional functions, along with other products in this category. You can imagine more functions for gaming; one special function is working with handheld controllers. For example, the Nintendo Switch and other consoles have a problem with Wi-Fi when you must log in through a website; with this product, that would no longer be a problem. This is one idea out of a thousand. The best part is the software: you could use it for work, or as a first smartphone for children that cannot call everyone, only 911 or the family; the main way to communicate on this iPod would be FaceTime. This would be a new category, and it could also stream music.
2
0
297
1w
DUNS Number
Hello, I’m in the process of enrolling my business (Carzo Rent A Car, Prishtine, Kosovo) in the Apple Developer Program, but I have been waiting for my D-U-N-S number to be issued. I submitted the request to Dun & Bradstreet on July 28, 2025 (Case #9142648) and have only received a system-generated email with a tracking ID (#9086421). There has been no further update.

My questions are:

Is there a way for Apple to expedite or provisionally approve my enrollment while the D-U-N-S number is pending?
How long does Apple typically wait for D&B updates before the enrollment is affected?
Are there any alternative steps I can take to avoid further delays?

Thank you for your guidance.
2
0
175
Jul ’25
Developer Program enrollment still pending after payment
Hi everyone, I enrolled in the Apple Developer Program on the evening of December 26, 2025, and the membership fee has already been successfully charged to my bank account. However, my account is still showing a “Pending” status with the message “Subscribe your membership.” At this point, some time has passed and I haven’t received a confirmation email or any follow-up requesting additional information.
1
0
328
3d
SwiftUI Accessibility Inspector?
Please excuse me if this is obvious. I'm new to Apple development. Is there a SwiftUI Accessibility Inspector? I run the standard one, in Xcode 26b3, and it shows me warnings for things that I didn't create in SwiftUI. I presume that "SwiftUI" is primarily implemented using macros and that these things are either generated or boilerplate lower-level things. But if so, then why would they trip Accessibility Inspector warnings? Is there something I can do from SwiftUI to clear them? Or... is there a demangler somewhere that will translate these names into something this human might recognize? I'm targeting macOS, btw, if that makes any difference.
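On the demangler question: Xcode ships a Swift demangler that can translate generated symbol names back into readable ones. A minimal example (the symbol here is made up for illustration):

```
xcrun swift-demangle '$s7SwiftUI4TextV'
```

This prints SwiftUI.Text; warnings attributed to generated SwiftUI machinery often become recognizable once demangled.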
1
0
1.3k
Jul ’25
.
I have a Twitter account that I registered with my Apple ID, and I still don't know the PIN. I'm having a problem with it because I don't know the PIN, and I need help. The account was registered through privaterelay.appleid.com.
1
0
259
Feb ’25
pairedUUIDsDidChangeNotification never fires, even with MFi hearing aids paired
Hi everyone — I’m implementing the new Hearing Device Support API described here: https://developer.apple.com/documentation/accessibility/hearing-device-support

I have MFi hearing aids paired and visible under Settings → Accessibility → Hearing Devices, and I’ve added the com.apple.developer.hearing.aid.app entitlement (and also tested with Wireless Accessory Configuration: https://developer.apple.com/documentation/bundleresources/entitlements/com.apple.external-accessory.wireless-configuration), but the app won't even compile with the hearing aid entitlement.

Problem: NotificationCenter.default.addObserver(...) for pairedUUIDsDidChangeNotification never fires — not on app launch, not after pairing/unpairing, and not after reconnecting the hearing aids. Because the notification never triggers, calls like HearingDeviceSession.shared.pairedDevices always return an empty list.

What I expected: According to the docs, the notification should be posted whenever paired device UUIDs change, and the session should expose those devices — but nothing happens.

Questions:

Does the hearing.aid.app entitlement require special approval from Apple beyond adding it to the entitlements file?
Is there a way to verify that iOS is actually honoring this entitlement?
Has anyone successfully received this notification on a real device?

Any help or confirmation would be greatly appreciated.
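For reference, a minimal sketch of the observer setup described above. The type and notification names are taken from the post itself; I haven't verified the framework's exact module name or signatures, so treat all of them as assumptions:

```swift
import Foundation
// import HearingDeviceSupport  // hypothetical module name for the API above

final class HearingDeviceObserver {
    private var token: NSObjectProtocol?

    func start() {
        // Notification and session names below come from the post; the real
        // constants in the framework may differ.
        token = NotificationCenter.default.addObserver(
            forName: .pairedUUIDsDidChangeNotification,  // assumed Notification.Name
            object: nil,
            queue: .main
        ) { _ in
            // Re-query paired devices whenever the UUID set changes.
            let devices = HearingDeviceSession.shared.pairedDevices
            print("Paired hearing devices: \(devices)")
        }
    }

    deinit {
        if let token { NotificationCenter.default.removeObserver(token) }
    }
}
```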
1
0
511
3w
iOS 18.3.1+ widget: loading a local color picture crashes the widget
Environment: Xcode 16.2, WidgetKit

Image(uiImage: UIImage(named: "jp_jump")!)
    .resizable()
    .scaledToFit()
    .frame(width: 58, height: 16)
    .padding(EdgeInsets(top: 0, leading: 16, bottom: 0, trailing: 0))

Loading the local color picture "jp_jump" crashes the widget. Crash info: Thread 4: EXC_RESOURCE (RESOURCE_TYPE_MEMORY: high watermark memory limit exceeded) (limit=30 MB)
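Widget extensions run under a tight memory ceiling (the crash above shows a 30 MB limit), and UIImage(named:) decodes the asset at its full pixel size regardless of the displayed frame. One common mitigation is to downsample the image to the target size before handing it to SwiftUI. A sketch, assuming "jp_jump" ships as a loose PNG in the bundle (asset-catalog images would need a different loading path):

```swift
import ImageIO
import UIKit

/// Downsamples a bundled image to the given point size so the widget only
/// decodes the pixels it actually displays, instead of the full-size asset.
func downsampledImage(named name: String, to size: CGSize, scale: CGFloat) -> UIImage? {
    guard let url = Bundle.main.url(forResource: name, withExtension: "png"),
          let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }
    let options: [CFString: Any] = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: max(size.width, size.height) * scale
    ]
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, options as CFDictionary)
    else { return nil }
    return UIImage(cgImage: cgImage, scale: scale, orientation: .up)
}

// Hypothetical usage in the widget view:
// Image(uiImage: downsampledImage(named: "jp_jump",
//                                 to: CGSize(width: 58, height: 16),
//                                 scale: 3) ?? UIImage())
```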
3
0
113
Mar ’25
Proposal: Using ARKit Body Tracking & LiDAR for Sign Language Education (Real-time Feedback)
Hi everyone, I’ve been analyzing the current state of Sign Language accessibility tools, and I noticed a significant gap in learning tools: we lack real-time feedback for students (e.g., "Is my hand position correct?"). Most current solutions rely on 2D video processing, which struggles with depth perception and occlusion (hand-over-hand or hand-over-face gestures), which are critical in Sign Language grammar. I'd like to propose/discuss an architecture leveraging the current LiDAR + Neural Engine capabilities found in iPhone devices to solve this.

The Concept: Skeleton-based Normalization

Instead of training ML models on raw video frames (which introduces noise from lighting, skin tone, and clothing), we could use ARKit's Body Tracking to abstract the input.

Capture: Use ARKit/LiDAR to track the user's upper body and hand joints in 3D space.
Data Normalization: Extract only the vector coordinates (X, Y, Z of joints). This creates a "clean" dataset, effectively normalizing the user regardless of physical appearance.
Comparison: Feed these vectors into a CoreML model trained on "Reference Skeletons" (recorded by native signers).
Feedback Loop: The app calculates the geometric distance between the user's pose and the reference pose to provide specific corrections (e.g., "Raise your elbow 10 degrees").

Why this approach?

Solves Occlusion: LiDAR handles depth much better than standard RGB cameras when hands cross the body.
Privacy: We are processing coordinates, not video streams.
Efficiency: Comparing vector sequences is computationally cheaper than video analysis, preserving battery life.

Has anyone experimented with using ARKit Body Anchors specifically for comparing complex gesture sequences against a stored "correct" database? I believe this "Skeleton First" approach is the key to scalable Sign Language education apps. Looking forward to hearing your thoughts.
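As a rough illustration of the feedback-loop step, here is a minimal sketch of the geometric comparison, assuming the joint positions have already been extracted from an ARBodyAnchor into simd_float3 arrays (the joint arrays and the 0.05 m threshold are illustrative, not from a real dataset):

```swift
import simd

/// Mean Euclidean distance between corresponding joints of a user pose and a
/// reference pose, after translating both so their root joints sit at the origin.
func poseDistance(user: [simd_float3], reference: [simd_float3]) -> Float? {
    guard user.count == reference.count,
          let userRoot = user.first, let referenceRoot = reference.first else { return nil }
    let jointErrors = zip(user, reference).map { u, r in
        simd_distance(u - userRoot, r - referenceRoot)
    }
    return jointErrors.reduce(0, +) / Float(jointErrors.count)
}

// Hypothetical usage: `userJoints` and `referenceJoints` would come from the
// live ARKit body anchor and the stored "Reference Skeleton", respectively.
// if let error = poseDistance(user: userJoints, reference: referenceJoints),
//    error > 0.05 {
//     print("Adjust your pose - mean joint error \(error) m")
// }
```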
1
0
557
Dec ’25
SwiftUI tvOS Accessibility VoiceOver - prevent reading all items in ScrollView over and over
Hi, I'm trying to fix a tvOS view for the VoiceOver accessibility feature:

TabView { // 5 tabs
    Text(title)
    Button(play)
    ScrollView { // Live
        LazyHStack { 200 items }
    }
    ScrollView { // Continue watching
        LazyHStack { 500 items }
    }
}

When the view shows up, VoiceOver reads: "Home tab 1 of 5, Item 2" - not sure why it reads Item 2 of the first cell in the scroll view, maybe because it just got loaded by the LazyHStack. VoiceOver should only read "Home tab 1 of 5".

When moving focus to the scroll view it reads: "Live, Item 1" and after a slight delay "Item 1, Item 2, Item 3, Item 4".
When moving focus to the second item it reads: "Item 2" and after a slight delay "Item 1, Item 2, Item 3, Item 4".
When moving focus to the third item it reads: "Item 3" and after a slight delay "Item 1, Item 2, Item 3, Item 4".

It should just read what is focused, ideally "Live, Item 1, 1 of 200", then after moving focus to item 2, "Item 2, 2 of 200", this time without the word "Live" because we are on the same scroll view (the same horizontal list).

Currently the app is unusable. We have visually impaired testers, and this rotor reading everything on the screen is totally confusing, because users don't know where they are and what is actually focused. This is a video streaming app and we are streaming all the time, even on the home page in the background; it binge-plays one item after another, there is usually a never-ending Live stream playing, and the user can switch the TV channel, but we continue to play. VoiceOver should only read what's focused after user interaction. The original Apple TV app does not do that, so it cannot be caused by some verbose accessibility setting; it reads correctly only the focused item in scrolling lists.

How do I disable reading content that is not focused? I tried:

.accessibilityLabel(isFocused ? title : "")
.accessibilityHidden(!isFocused)
.accessibilityHidden(true) - tried on various levels in the view hierarchy
.accessibilityElement(children: .ignore) - even the focused item is not read back by VoiceOver
.accessibilityElement(children: .contain) - tried on various levels in the view hierarchy
.accessibilityElement(children: .combine) - tried on various levels in the view hierarchy
.accessibilityAddTraits(.isHeader) - tried on various levels in the view hierarchy
.accessibilityRemoveTraits(.isHeader) - tried on various levels in the view hierarchy (the last two were basically an attempt to hack it)
.accessibilityRotor("", ranges: []) - another hack that I tried on the ScrollView, the LazyHStack, and the top-level view
...plus 50+ other attempts at configuring accessibility tags attached to views.

I have seen all the accessibility videos and tried all the sample code projects. I haven't found a solution anywhere; an internet search didn't find anything, and AI didn't help, as it can only provide code that someone else wrote before. Any idea how to fix this? Thanks.
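For anyone who wants to reproduce this, a minimal compilable sketch of the structure described above (item counts, titles, and the single tab are placeholders for the real five-tab app):

```swift
import SwiftUI

struct HomeView: View {
    var body: some View {
        TabView {
            VStack {
                Text("Home")
                Button("Play") {}
                shelf(title: "Live", count: 200)
                shelf(title: "Continue watching", count: 500)
            }
            .tabItem { Text("Home") } // 1 of 5 tabs in the real app
        }
    }

    // One horizontal shelf; with VoiceOver on tvOS, sibling items in this
    // LazyHStack are read back after a short delay, as described above.
    private func shelf(title: String, count: Int) -> some View {
        ScrollView(.horizontal) {
            LazyHStack {
                ForEach(0..<count, id: \.self) { index in
                    Button("Item \(index + 1)") {}
                }
            }
        }
        .accessibilityLabel(title)
    }
}
```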
1
0
134
Apr ’25
IOHIDCheckAccess(kIOHIDRequestTypeListenEvent) does not work
I have an app that needs Input Monitoring permissions to get keyboard access in the background. I've attempted to use both IOHIDCheckAccess(kIOHIDRequestTypeListenEvent) and IOHIDRequestAccess(kIOHIDRequestTypeListenEvent), but they always return denied, even though I have given the permission for Input Monitoring to the app in Settings. Is there something I need to put in my Info.plist to enable this permission to work?
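For comparison, a minimal sketch of the check-then-request flow. These IOKit calls exist as written; as far as I know there is no Info.plist key for Input Monitoring, and a persistent denial during development is often about which responsible process TCC attributes the request to (e.g. Xcode or a terminal launching the debug build), though I can't confirm that's the cause here:

```swift
import IOKit.hid

// Check the current Input Monitoring status without prompting the user.
let status = IOHIDCheckAccess(kIOHIDRequestTypeListenEvent)
if status == kIOHIDAccessTypeGranted {
    print("Input Monitoring granted")
} else if status == kIOHIDAccessTypeDenied {
    // Can stay denied if TCC attributes the check to a different responsible
    // process or bundle ID than the one toggled on in System Settings.
    print("Input Monitoring denied")
} else {
    // kIOHIDAccessTypeUnknown: not yet determined. This call shows the
    // system prompt and returns the user's choice.
    let granted = IOHIDRequestAccess(kIOHIDRequestTypeListenEvent)
    print("Requested Input Monitoring, granted: \(granted)")
}
```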
0
0
463
3w
BLE Device Not Appearing in Scan List on iOS After Name Change
I'm encountering an issue related to BLE device discovery on iOS. I have a BLE peripheral device that I initially connected to using an iOS device. After this connection, the BLE device's advertised name was programmatically changed by the peripheral. Now, when I try to scan for this device using other iOS devices, it does not appear in the scan results in most apps — including nRF Connect and our own custom BLE app that uses CoreBluetooth.

A few observations:

The device is definitely powered on and advertising (confirmed via Android).
The name change is reflected correctly on Android and on the iOS device that originally connected to it.
Other iOS devices no longer see the device in their scan list.
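One thing worth checking: iOS caches a peripheral's GAP name, so peripheral.name can go stale after a rename, and filtering on it can hide the device even though it is still advertising. A minimal CoreBluetooth sketch that logs the name from the advertisement payload instead of the cache (scanning with a specific service UUID is generally more reliable than name matching; it is omitted here for brevity):

```swift
import CoreBluetooth

final class BLEScanner: NSObject, CBCentralManagerDelegate {
    private var central: CBCentralManager!

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: nil, options: nil)
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        // The advertised local name comes from the live advertisement packet;
        // peripheral.name may still show the old, cached GAP name.
        let advertisedName = advertisementData[CBAdvertisementDataLocalNameKey] as? String
        print("\(peripheral.identifier): advertised = \(advertisedName ?? "nil"), cached = \(peripheral.name ?? "nil")")
    }
}
```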
1
0
338
Jul ’25
Add VoiceOver touch gesture guidance for frames/iframes in webView and Safari web
Please update Accessibility OS Settings for VoiceOver in iPhone iOS and iPadOS to include frames on the Rotor, and to make web navigation and component gestures easier to find and assign. Please add content to the iPhone and iPad Apple User Guide on using VoiceOver for web navigation with touch gestures. Specifically: iframes.

There is no clear guidance in Apple documentation for VoiceOver users on iPhone or iPadOS for accessing iframes with touch gestures. A common belief, as written on AppleVis, other blogs, and internet searches, is that iframes in Safari or a webView in an app are only available with explore by touch. If explore by touch is the only option for some interactions, that needs to be included in the Apple User Guides. If not, the equivalent VoiceOver touch gestures for interactions that have keyboard commands on Mac need to be clearly documented for users.

VoiceOver for Mac includes a default keyboard interaction of VO-Command-F in its extensive User Guide (https://support.apple.com/guide/voiceover/by-images-or-frames-mchlp2740/mac), and a user can add a rotor option for web navigation by frames. VoiceOver for iPhone and iPad does not include a default swipe gesture assigned to frames, and the option is not available for the Rotor.

While the iPhone User Guide notes that gestures can be customized (https://support.apple.com/guide/iphone/customize-gestures-and-keyboard-shortcuts-iph59a8e6fd2/18.0/ios/18.0), it is not clear that, to add this gesture, "Move to the next frame" is tucked into the advanced navigation commands in the VoiceOver Accessibility Settings. At least on my phone, the word "frame" was not searchable despite the All Commands screen having a search bar.
1
0
133
Apr ’25
VoiceOver accessibility issue in UIKit for line granularity
Context: We are using UIKit to provide accessibility in our app for our iOS users. Our app mainly contains documents/books that the user can read.

Issue: VoiceOver is skipping lines given to it when they contain leading spaces. We have observed this issue in different languages. It only happens for line granularity; other granularities seem to work as expected.

Implementation: We use the APIs below to provide line content to VoiceOver:

UIAccessibilityReadingContent
- accessibilityPageContent
- accessibilityFrameForLineNumber
- accessibilityContentForLineNumber

We create UIAccessibilityElement objects to pass to VoiceOver, and each UIAccessibilityElement implements UIAccessibilityReadingContent to provide readable content. We also use the APIs below to cross element boundaries for all granular navigations:

accessibilityNextTextNavigationElement
accessibilityPreviousTextNavigationElement

We want to know whether skipping a line that has leading spaces is expected behavior or a bug in UIKit.
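For concreteness, a minimal sketch of the setup described above: one accessibility element vending line-level content through UIAccessibilityReadingContent. The lines and lineFrames stores are hypothetical, and the leading-space trim is only a possible workaround, not verified against the UIKit behavior:

```swift
import UIKit

final class PageElement: UIAccessibilityElement, UIAccessibilityReadingContent {
    var lines: [String] = []
    var lineFrames: [CGRect] = []  // screen-coordinate frame for each line

    func accessibilityLineNumber(for point: CGPoint) -> Int {
        lineFrames.firstIndex { $0.contains(point) } ?? NSNotFound
    }

    func accessibilityContent(forLineNumber lineNumber: Int) -> String? {
        guard lines.indices.contains(lineNumber) else { return nil }
        // Possible workaround for the skipping: hand VoiceOver the line with
        // leading spaces stripped, keeping the original if nothing remains.
        let trimmed = lines[lineNumber].drop(while: { $0 == " " })
        return trimmed.isEmpty ? lines[lineNumber] : String(trimmed)
    }

    func accessibilityFrame(forLineNumber lineNumber: Int) -> CGRect {
        lineFrames.indices.contains(lineNumber) ? lineFrames[lineNumber] : .zero
    }

    func accessibilityPageContent() -> String? {
        lines.joined(separator: "\n")
    }
}
```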
1
0
295
Nov ’25
SwiftUI List Accessibility VoiceOver
I have been working on a feature where a List in SwiftUI loads both previous and next pages of data: the user can scroll up and down to load the previous/next page. Recently I hit an accessibility issue while testing VoiceOver: when the user lands on the listing screen and swipes across the screen from the navigation, focus reaching the list should highlight the first visible item. But when the user swipes back:

Should it load the previous data and announce the previous item, or should it go back to the navigation items?
If it loads the previous item, what if the user wants to go to the navigation to switch to other actions, and vice versa?

Did anyone come across this kind of issue? What is the standard expected behavior when a list scrolls to both previous and next pages? I tried different gestures (https://support.apple.com/en-in/guide/iphone/iph3e2e2281/ios), but it isn't working.
2
0
124
Apr ’25
Is it possible to animate the accessibility frame on iOS and macOS?
Say I have a UI element that moves on the screen. Is it possible to update its accessibility frame as it moves while VoiceOver is focused on it? From my tests, VoiceOver ignores UIAccessibilityLayoutChangedNotification if it's sent repeatedly in a short period of time on iOS, while sending NSAccessibilityLayoutChangedNotification on macOS triggers VoiceOver to reannounce the focused element repeatedly.
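For context, the pattern under test looks roughly like this on iOS (the view and target point are placeholders):

```swift
import UIKit

// Move a view, then update its accessibility frame and notify VoiceOver.
// Posting .layoutChanged on every animation step is the repeated posting
// that VoiceOver appears to ignore on iOS.
func slide(_ movingView: UIView, to target: CGPoint) {
    UIView.animate(withDuration: 0.25) {
        movingView.center = target
    } completion: { _ in
        movingView.accessibilityFrame =
            UIAccessibility.convertToScreenCoordinates(movingView.bounds, in: movingView)
        UIAccessibility.post(notification: .layoutChanged, argument: movingView)
    }
}
```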
2
0
254
Jul ’25
Critical Bug: Children Can Disable Screen Time Apps Like Choreio Without Parental Approval
Dear Apple Support,

I am reporting a critical issue affecting parental control apps like my app, Choreio, which is live on the App Store.

When Screen Time settings are configured to require a parent’s password for changes, parents must log in on their child’s device to make any adjustments. This restriction is expected to extend to apps using the Screen Time API, such as Choreio. However, I’ve discovered a significant bug: children can bypass this restriction by simply toggling off Choreio in the Screen Time settings—without needing the parent’s password. This effectively disables the app and defeats its purpose as a parental control tool.

Please address this issue as soon as possible to ensure the intended functionality of parental controls. Let me know if you need any additional information to assist with resolving this. Thank you for your attention to this matter.

Best regards,
Jeff Houston

STEPS TO REPRODUCE

1. Install Choreio from the App Store on the child’s phone.
2. Enable parental controls in Screen Time and set it to require the parent’s password for any changes to Screen Time settings.
3. Go to the Screen Time settings on the child’s phone.
4. Observe that the child can simply toggle off Choreio, effectively deactivating the app, without needing the parent’s password.

Expected behavior: Toggling off Choreio should require the parent’s password, just like it does for other Screen Time settings.

Let me know if additional details are needed!
4
0
358
Feb ’25