After installing iOS 26 beta 2 on my iPhone 13, I can't take a screenshot using AssistiveTouch or Back Tap.
Hey all — hoping someone here has dealt with this before.
I’m testing an iOS app via TestFlight, and when I originally got access, I didn’t have an iPhone. So I signed in with my Apple ID on my girlfriend’s iPhone and used TestFlight there. Everything worked fine.
Now I finally have my own iPhone (iPhone 16), downloaded TestFlight, signed in with the same Apple ID, and had the developer resend the invite. But when I tap "Open in TestFlight" from the invite email, I get this error:
“Couldn’t load app because your Apple account has already been associated to this app.”
The dev tried removing me as a tester and re-adding me, I’ve deleted TestFlight from both phones, rebooted, reinstalled, waited in between — still no luck. Even tried opening the invite link in Safari instead of Mail.
Is there any way to get Apple to fully reset the association with the old device so I can use TestFlight on my new iPhone? Or do I really need to make a new Apple ID just to get around this?
Any help would be huge — thanks!
I am working on capturing 48MP images using the iPhone 16 Pro Max with the Ultra-wide camera. I’ve updated the code to capture the maximum supported dimensions with the following snippet:
if #available(iOS 16.0, *) {
    photoOutput.maxPhotoDimensions = device.activeFormat.supportedMaxPhotoDimensions.last!
    photoSettings.maxPhotoDimensions = .init(width: 5712, height: 4284)
}
However, the captured photos are still not coming out at 48MP. My goal is to capture 48MP images, and I want to confirm whether the Ultra Wide camera supports this resolution or whether I'm missing some other configuration.
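For reference, here is the fuller configuration I believe should be needed (a minimal sketch; session setup and error handling are omitted, and the 48MP threshold check and function name are my own):

import AVFoundation

func configureFor48MP(device: AVCaptureDevice, photoOutput: AVCapturePhotoOutput) throws {
    // Look for a format whose maximum photo dimensions reach ~48MP.
    // If none exists, this camera cannot deliver 48MP stills at all.
    guard let format = device.formats.first(where: { candidate in
        candidate.supportedMaxPhotoDimensions.contains { dims in
            Int(dims.width) * Int(dims.height) >= 48_000_000
        }
    }) else {
        print("No 48MP photo format on this camera")
        return
    }

    try device.lockForConfiguration()
    device.activeFormat = format
    device.unlockForConfiguration()

    // Raise the output's ceiling before creating per-capture AVCapturePhotoSettings;
    // the settings' maxPhotoDimensions must not exceed this value.
    if let largest = format.supportedMaxPhotoDimensions.last {
        photoOutput.maxPhotoDimensions = largest
    }
}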
Any guidance would be appreciated!
The home button cannot be pressed.
Attempting to enable Developer Mode says "press home to continue".
The usual "AssistiveTouch" accessibility switch shown on iOS is not available on this screen,
so Developer Mode cannot be enabled.
This is an accessibility issue.
I have a question about Developer Mode on iPhone.
Currently, the home button on my iPhone SE (2nd generation) is broken, so I use AssistiveTouch to display a virtual home button. However, on the screen that asks you to press home to enable Developer Mode, the virtual home button does not appear, making it impossible to enable Developer Mode.
Is there any way to enable Developer Mode in this situation?
I want to understand which component types are intended to have an associated hint text, haptic feedback, or earcon for VoiceOver screen reader users. Is there a list somewhere, or an HIG guideline, for which transition types should have a sound?
Some transitions in Apple apps generally include distinct sounds, such as:
opening a new screen
screen dimming
swiping (as a VoiceOver user) from the header / navbar to the body
swiping up or down a page (a scraping sound)
reaching the beginning or end of the body section
swiping from one row to the next in Calculator
opening a pop-up menu
I would also appreciate any direction on which code strings are associated with these sounds, and on how custom components can adopt these sounds, haptics, or hints where they are expected. On the other hand, I don't want to take that info and then dictate that every component needs a specific beep type, since these sounds appear to be used for specific purposes.
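For custom components, my working assumption is that you opt into the standard feedback by describing the element with standard traits and hints rather than playing sounds yourself. A minimal SwiftUI sketch (the component name is made up):

import SwiftUI

struct MoreOptionsChip: View {
    var body: some View {
        Text("•••")
            .accessibilityLabel("More")
            .accessibilityHint("Shows additional options")
            // The .isButton trait lets VoiceOver apply its standard
            // button behavior, including any associated sounds.
            .accessibilityAddTraits(.isButton)
    }
}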
I can no longer specify more than 10 items to be shown in the Recent Items menu. The control to set "Recent documents, applications and servers" seems to have vanished from System Settings.
Could there be a haptic or sound cue, to make it accessible to the blind (sound) and deaf (haptic) populations, for knowing when Location Services and the camera were last used?
Also, the grey notification, rather than the purple notification, for Location Services should appear for the full 24 hours after an application has used location, if the description in the Settings copy is correct.
The green light that tells users an application has switched to the camera, and the fading orange microphone light, could both have subtle click sounds, like a shutter: big haptic, softer sound, but editable in Settings, of course.
Hey,
We've run into an issue where WKWebView contents are not always available for VoiceOver users. It seems to occur when WKWebView contents are loaded asynchronously.
I have a sample project where this can be reproduced and a video showing the issue. See FB21257352
The only solution we currently see is forcing an update continuously using UIAccessibility.post(notification: .layoutChanged, argument: nil), but this is of course a last resort, as it may have other unintended side effects.
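For reference, a less aggressive variant would post the notification once per load instead of continuously (a sketch; it assumes the async content is ready by didFinish, which may not hold for script-driven pages):

import UIKit
import WebKit

final class WebViewController: UIViewController, WKNavigationDelegate {
    private let webView = WKWebView()

    override func viewDidLoad() {
        super.viewDidLoad()
        webView.navigationDelegate = self
        webView.frame = view.bounds
        view.addSubview(webView)
        webView.load(URLRequest(url: URL(string: "https://example.com")!))
    }

    func webView(_ webView: WKWebView, didFinish navigation: WKNavigation!) {
        // Nudge VoiceOver into re-reading the layout once the page has loaded.
        UIAccessibility.post(notification: .layoutChanged, argument: webView)
    }
}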
I'm facing a bizarre issue with Apple's Accessibility APIs. I am registering an AXObserver that listens for, among other things, the kAXSelectedTextChangedNotification. For many new users, the kAXSelectedTextChangedNotification is not triggered, even though they have enabled Accessibility permission for the app. Other notifications are getting through (kAXWindowMovedNotification, kAXWindowResizedNotification, kAXValueChangedNotification etc - full list here), just not the kAXSelectedTextChangedNotification!
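The registration is essentially this (a simplified sketch; error handling and observer teardown are omitted, and the pid comes from the target application):

import ApplicationServices

// Called whenever a subscribed notification fires.
private func axCallback(_ observer: AXObserver, _ element: AXUIElement,
                        _ notification: CFString, _ refcon: UnsafeMutableRawPointer?) {
    print("Received:", notification)
}

func observeSelectedTextChanges(pid: pid_t) {
    var observer: AXObserver?
    guard AXObserverCreate(pid, axCallback, &observer) == .success, let observer else { return }

    // Subscribe on the root application element so all child text fields are covered.
    let appElement = AXUIElementCreateApplication(pid)
    AXObserverAddNotification(observer, appElement,
                              kAXSelectedTextChangedNotification as CFString, nil)

    // Callbacks are delivered via the observer's run-loop source.
    CFRunLoopAddSource(CFRunLoopGetCurrent(),
                       AXObserverGetRunLoopSource(observer), .defaultMode)
}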
We've found that we can reproduce the error by removing accessibility permission for the app and rebooting our computers. After restarting and reenabling accessibility permissions, the kAXSelectedTextChangedNotification was not received, even though other notifications were fine.
Strangely, the issue can be resolved by launching Apple's Accessibility Inspector app on an impacted computer. Once the Accessibility Inspector is loaded, the kAXSelectedTextChangedNotifications start coming through as expected. This implies to me that either:
We are missing some needed setup when starting the observers. Accessibility Inspector gets it right, thus ‘starting’ the system properly.
Accessibility Inspector is using some Apple private APIs that we don’t have access to.
Things I’ve tried:
I've tried subscribing to the kAXSelectedTextChangedNotification on different AXUIElements, including the SystemWide element, the Application element, and children elements from the AXApplication. None of these received the kAXSelectedTextChangedNotification until Accessibility Inspector is booted up. No surprises here, as Apple's documentation confirms that you should add the notification to the root Application AXUIElement if you want to receive notifications for all its children.
I had a theory that the issue might be due to my code calling AXUIElementCreateApplication multiple times, possibly creating multiple "Applications" in Apple's Accessibility implementation. If that’s the case, the notifications might be sent to the wrong application AXUIElement. However, refactoring my code to only call AXUIElementCreateApplication once didn't resolve the issue.
I thought the issue may be caused by subscribing the AXSelectedTextChangedNotification on the high-level application element (at odds with Apple's documentation). I've tried traversing the child AXUIElements until we find one with the kAXSelectedTextAttribute and then subscribing to that. This did not resolve the issue. I don’t think it's the correct path to continue exploring, given that the notifications are received correctly after AccessibilityInspector is launched.
There is one exception to the above: if I add the kAXSelectedTextChangedNotification listener to a specific text field AXUIElement, I do receive the notification on that text field. However, this is not practical; I need a solution that will work for all text fields within an app. The Accessibility Inspector appears to be doing something that causes the selected-text-changed notifications to be correctly passed up to the high-level application AXUIElement.
Another thought is that I could traverse the entire Accessibility hierarchy and add listeners to every subview that has the kAXSelectedTextAttribute. However, I don’t like this long-term solution. It will be slow and incomplete: new elements get added and removed frequently. I just want the kAXSelectedTextChangedNotification to be received by the high-level Application AXUIElement, which the documentation suggests it should be. I also have evidence that this can work, since notifications start coming through after Accessibility Inspector is launched. It’s just a matter of discovering how to replicate whatever Accessibility Inspector is doing.
An interesting wrinkle: I implemented the 'traverse' strategy above, but was surprised by how few elements were in the hierarchy. Most apps only go down ~2-3 levels, which didn't seem right to me. Perhaps the Accessibility tree isn't fully initialized? I tried adding a 5-second delay to allow more initialization time, but it didn't change anything.
Does anyone have any ideas? Here's our file.
We have an iOS app built in .NET MAUI (Multi-platform App UI).
This is a web view app.
We wish to integrate App Clips into this app.
But we are unable to do it, as there are few resources available online on such an implementation.
We do not wish to share code between the .NET MAUI app and the App Clip.
We understand it is not possible to add an App Clip without a parent Swift/Xcode app.
As an alternative solution, we were thinking of creating a new app in App Store Connect using Xcode/Swift and integrating the App Clip with it.
This parent app, when downloaded by users, would only redirect them to our main .NET MAUI app on the App Store.
We need to know whether such an app would be approved for the App Store.
Please guide us on this.
Also, please let us know if you have any other solution for integrating App Clips with a .NET MAUI app.
A common UI idiom in Apple's first-party iOS apps is a circle icon with three dots in the upper right of the screen. This serves as a pop-up menu of more options. Some examples include:
Apple Music, Library tab
Photos, Album view
Reminders
In all these cases, VoiceOver reads this element as "More, Button".
In my SwiftUI app, I've implemented a visually identical button.
Menu {
    // Button for Menu Item 1
    // Button for Menu Item 2
    // ...
} label: {
    Image(systemName: "ellipsis.circle")
        .accessibilityHidden(true)
}
.accessibilityLabel("More")
However, the VoiceOver output in my app is much more verbose. It speaks "More, Button, Pop Up Button, Double Tap To Activate The Picker". Any guidance on how to make this more concise in line with the apps mentioned above?
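For what it's worth, the only workaround I can think of is explicitly overriding the hint, since the trailing phrase sounds like a default hint. This is an unverified sketch, not a documented fix:

Menu {
    // menu items ...
} label: {
    Image(systemName: "ellipsis.circle")
        .accessibilityHidden(true)
}
.accessibilityLabel("More")
// Assumption: an explicit empty hint may suppress "Double Tap To Activate The Picker".
.accessibilityHint("")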
Hey everyone!
I am developing a screen time limit app to help people spend less time in distracting apps.
It works this way: people choose the apps that are distracting and unhealthy for them, and on the other side, productivity apps. In the app you can exchange time spent on healthy habits for time to scroll or use the distracting apps.
This idea was loved on social media, and the app already has 100k followers without even being launched yet.
So I am waiting for just one feature permission from Apple, and they have not given me any answer since I applied 3 weeks ago.
There are a lot of similar apps on the market, and this feature exists in other screen time limit apps.
Why is app blocking permission needed?
Time Exchange Functionality:
Users independently select which apps are productive and which are distracting for them.
The system blocks the "negative" apps until the user accumulates enough time in the "positive" ones. This encourages healthy device usage.
Full User Control:
All apps to be blocked are manually selected by the user in the settings.
The extension does not impose any restrictions without explicit permission.
Transparency and Security:
Blocking happens locally, with no data collected about app usage.
We adhere to Apple’s privacy policy.
Compliance with App Store Guidelines:
We understand that app blocking is a sensitive feature, but in our case it:
Is used for the benefit of the user (digital detox, productivity improvement).
Does not interfere with system processes or other developers’ apps.
Does not misuse access to APIs.
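For reference, app blocking of this kind is typically built on the ManagedSettings shield under the Family Controls entitlement. A minimal sketch, assuming the selection comes from a FamilyActivityPicker the user fills in:

import FamilyControls
import ManagedSettings

let store = ManagedSettingsStore()

// Shield the user's self-selected "negative" apps.
func applyBlock(for selection: FamilyActivitySelection) {
    store.shield.applications = selection.applicationTokens
}

// Lift the shield once enough "positive" time has been earned.
func liftBlock() {
    store.shield.applications = nil
}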
My questions to the forum are:
Did you have similar problems, and how did you resolve them?
Are there any ways to speed up the process or contact someone from the approval team directly?
Should I give up and release it on Android?
I am very disappointed and frustrated. Hope to get some useful tips.
Thank you very much!
I have more than 1,000 notes organized in parent/child folders up to 5 levels deep. From the 5th level of folders I can no longer share a note. The note itself is not shared; it is the one from the parent folder that is shared.
Thank you very much
Best regards
Christophe
In iOS 18, when a button using @FocusState is inside a ScrollView and the view is opened via NavigationLink,
the button is not accessible via a Bluetooth (external) keyboard.
Is this a known issue in iOS 18?
@Apple Developer Support
Hello,
It has now been more than 72 hours since my Apple Developer Program purchase.
The payment was completed and the invoice was issued.
Order Number: W1302770460
I have not received any response to my previous support requests, and my membership is still not active.
Please escalate this case and complete the manual activation of my membership as soon as possible.
Thank you.
Hey folks, I would like to ask for help on this topic:
I think this is exactly the same problem as "Combobox not working with VoiceOver after…" - Apple Community.
VoiceOver also breaks the combobox from the official W3C ARIA Authoring Practices site https://www.w3.org/WAI/ARIA/apg/patterns/combobox/examples/combobox-autocomplete-list/. When VO is turned off, I can use the up/down arrows to go through the menu items in the dropdown, but when VO is turned on, the up/down arrows cannot reach the dropdown menu items.
Is there an official tutorial on how to control a combobox using VoiceOver?
Kind regards,
Jakub
Hope it's okay to post here; I haven't gotten resolution anywhere else. Apple's iOS Live Captions is supposed to turn speech into written text either on the phone (works like a charm!) or via the microphone (think meeting in a conference room). Microphone mode doesn't work anywhere, anytime on a new iPhone 14 purchased November 2024. Anyone out there want to fix this and help a lot of people who have trouble hearing? I'm part of an entire generation that didn't know we were supposed to protect our hearing at concerts and clubs and, worse, thought it was cool to snag a spot by the speakers...
When using an app via an external keyboard, FocusState and .focused used to work just fine until iOS 17. Vertical-axis text fields were also accessible without any issues. But after the iOS 18 update, adding the focused modifier removes elements from the focus order of the external keyboard.
One such example: when a button using the focused modifier and @FocusState is inside a ScrollView, and the view is opened via NavigationLink, that button is not accessible via a Bluetooth (external) keyboard.
TextEditor / vertical-axis TextFields also seem to be impacted in the external keyboard focus order when added inside a ScrollView.
Is this a known iOS 18 issue with ScrollView, or is there any tip to get this fixed?
Sample code that can reproduce this issue:
struct ContentView: View {
    @State private var showBottomSheet: Bool = false
    @State private var goToNextView: Bool = false
    @FocusState private var focused: Bool
    @AccessibilityFocusState private var voFocused: Bool

    var body: some View {
        NavigationView {
            VStack {
                Text("Hello, world!")
                // This button works fine with a Bluetooth keyboard in all versions
                Button("Trigger a bottomsheet") {
                    showBottomSheet = true
                }
                .focused($focused)
                .accessibilityFocused($voFocused)
                Button("Goto another view") {
                    goToNextView = true
                }
                NavigationLink(
                    destination: View2(),
                    isActive: $goToNextView
                ) { EmptyView() }
                .accessibility(hidden: true)
            }
            .sheet(isPresented: $showBottomSheet,
                   onDismiss: {
                       focused = true
                       voFocused = true
                   }, content: {
                       VStack {
                           Text("Hello World! I'm in a bottomsheet")
                           Button("Close me") {
                               showBottomSheet = false
                           }
                       }
                   })
            .padding()
        }
    }
}

#Preview {
    ContentView()
}

struct View2: View {
    @FocusState private var focused: Bool
    @AccessibilityFocusState private var voFocused: Bool
    @State private var showBottomSheet: Bool = false

    var body: some View {
        ScrollView {
            VStack {
                Text("check")
                // In iOS 18, this button doesn't get focused with a Bluetooth / external keyboard.
                // The issue occurs when these 3 combine in iOS 18: a button using FocusState,
                // inside a view that has a ScrollView, opened via NavigationLink.
                Button("Trigger a bottomsheet") {
                    showBottomSheet = true
                }
                .focused($focused)
                .accessibilityFocused($voFocused)
                Button("Test button") { }
            }
            .sheet(isPresented: $showBottomSheet,
                   onDismiss: {
                       focused = true
                       voFocused = true
                   }, content: {
                       VStack {
                           Text("Hello World! I'm in a bottomsheet")
                           Button("Close me") {
                               showBottomSheet = false
                           }
                       }
                   })
            .padding()
        }
    }
}
Hi,
Our app has a section where we show users how to activate "Silence Unknown Callers", because it is a crucial feature for our app. But we saw that 30% of users drop out at this step, because we can't open that setting directly in the Phone settings.
We are using this URL scheme to open the Phone settings in iOS 18:
if let url = URL(string: "App-prefs:com.apple.mobilephone") {
    UIApplication.shared.open(url)
}
But we don't see any other way to open the "Silence" path directly, like in iOS 17 with this URL scheme: prefs:root=Phone&path=SILENCE_CALLS
So, do you know if it is possible to open that option directly? We want to improve our accessibility.
Thank you!
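In case it helps, the only fallback we can think of uses the documented openSettingsURLString, which opens our app's own settings page rather than the Phone pane (a sketch; note that canOpenURL only reports true for App-prefs if the scheme is declared under LSApplicationQueriesSchemes):

import UIKit

func openPhoneSettingsOrFallback() {
    // Try the undocumented Phone settings scheme first.
    if let url = URL(string: "App-prefs:com.apple.mobilephone"),
       UIApplication.shared.canOpenURL(url) {
        UIApplication.shared.open(url)
    } else if let fallback = URL(string: UIApplication.openSettingsURLString) {
        // Documented fallback: opens this app's own settings, not the Phone pane.
        UIApplication.shared.open(fallback)
    }
}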