Accessibility: VoiceOver is not treating the navigation bar's left button as the first focused element.
When we navigate from A -> B, focus goes to the first element inside the B view, not to the back button or B's navigation title.
If we post an accessibility notification in B's onAppear, focus does not shift; VoiceOver reads the back button first and then reads B's content item, but it never actually moves focus to the back button in SwiftUI.
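Here is a simplified sketch of what I am trying (the view name and its contents are hypothetical):
import SwiftUI
import UIKit

// Simplified sketch of the pushed view B; names and content are hypothetical.
struct DetailViewB: View {
    var body: some View {
        List {
            Text("First row") // VoiceOver focuses here instead of the back button.
        }
        .navigationTitle("Screen B")
        .onAppear {
            // Attempt to move VoiceOver focus toward the navigation bar;
            // in my testing this does not shift focus to the back button.
            UIAccessibility.post(notification: .screenChanged, argument: nil)
        }
    }
}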
What should I do if I want focus to land on the navigation bar's back button or the navigation title?
My understanding is that the system prioritizes the first focusable element in the view hierarchy, but the navigation bar (including the close button and title) is managed separately by the system. It is not part of the main view hierarchy, so it does not automatically receive focus unless focus is set explicitly. If my understanding is right, that seems a little strange.
Why did you design it this way? Can you tell me your thinking?
Thanks
I'm developing a calculator app and working to ensure a great experience for both VoiceOver and Braille display users.
For expressions like (2+3)×5, I need two different accessibility outputs:
VoiceOver (spoken): A descriptive string like “left paren two plus three right paren times five,” provided via .accessibilityValue. I'm using a custom spellOut function since VoiceOver doesn't announce parentheses—which are kind of important when doing math!
Braille (symbolic): The literal math string (2+3)×5, provided using .accessibilityCustomContent("", ...), with an empty label so it’s not spoken aloud.
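Here is a stripped-down sketch of my current setup (ExpressionRow and spellOut are hypothetical stand-ins for my real view and helper):
import SwiftUI

struct ExpressionRow: View {
    let expression = "(2+3)×5"

    var body: some View {
        Text(expression)
            // Spoken output for VoiceOver.
            .accessibilityValue(spellOut(expression))
            // Symbolic output intended for Braille displays; the empty label keeps it from being spoken.
            .accessibilityCustomContent("", expression)
    }

    // Hypothetical stand-in for my real spellOut function.
    private func spellOut(_ s: String) -> String {
        s.map { ch -> String in
            switch ch {
            case "(": return "left paren"
            case ")": return "right paren"
            case "+": return "plus"
            case "×": return "times"
            default: return String(ch)
            }
        }.joined(separator: " ")
    }
}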
The issue: I don’t have access to a Braille display device and Xcode’s Accessibility Inspector doesn’t seem to show the custom content.
Is there any way to confirm that custom Braille content is being set correctly in Simulator or with other tools?
Or…is there a "math mode" in VoiceOver that forces it to announce parentheses?
Any advice or workarounds would be much appreciated!
Thanks,
Uhl
Tags: External Accessory, iOS, Accessibility, SwiftUI
I created a desktop app for Mac using Xojo. The app has a controller in the main window and displays advertisements and notices on a connected external display.
I'm currently connecting my 24-inch iMac to a REGZA-55M550M via AirPlay and displaying video from the iMac on the REGZA, but the connection occasionally drops. Yesterday, it dropped about 3.5 hours after connecting. I do have other apps running on the iMac, but nothing that should put a strain on the network or memory.
Does an AirPlay connection to a non-Apple product become unstable over long periods of time?
I watched videos and a blog post and downloaded their sample projects, and there Core Spotlight works as expected.
I copied the code into an empty project and did the same things they did, but it still isn't working.
OS: macOS and iOS
On my Core Data entity I set an attribute to be indexed for Spotlight, and in the entity itself I put that attribute's name in the "Display Name" field for Spotlight.
import CoreData
import CoreSpotlight

class PersistenceController {
    static let shared = PersistenceController()

    var spotlightDelegate: NSCoreDataCoreSpotlightDelegate?

    @MainActor
    static let preview: PersistenceController = {
        let result = PersistenceController(inMemory: true)
        let viewContext = result.container.viewContext
        for _ in 0..<10 {
            let newItem = Item(context: viewContext)
            newItem.timestamp = Date()
        }
        do {
            try viewContext.save()
        } catch {
            let nsError = error as NSError
            fatalError("Unresolved error \(nsError), \(nsError.userInfo)")
        }
        return result
    }()

    let container: NSPersistentContainer

    init(inMemory: Bool = false) {
        container = NSPersistentContainer(name: "SpotLightSearchTest")
        if inMemory {
            container.persistentStoreDescriptions.first!.url = URL(fileURLWithPath: "/dev/null")
        }
        container.loadPersistentStores(completionHandler: { [weak self] (storeDescription, error) in
            if let error = error as NSError? {
                fatalError("Unresolved error \(error), \(error.userInfo)")
            }
            // Enable history tracking and start the Spotlight indexer once the store has loaded.
            if let description = self?.container.persistentStoreDescriptions.first {
                description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
                description.type = NSSQLiteStoreType
                if let coordinator = self?.container.persistentStoreCoordinator {
                    self?.spotlightDelegate = NSCoreDataCoreSpotlightDelegate(
                        forStoreWith: description,
                        coordinator: coordinator
                    )
                    self?.spotlightDelegate?.startSpotlightIndexing()
                }
            }
        })
        container.viewContext.automaticallyMergesChangesFromParent = true
    }
}
In my @main App struct:
import SwiftUI
import CoreSpotlight

@main
struct SpotLightSearchTestApp: App {
    let persistenceController = PersistenceController.shared

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environment(\.managedObjectContext, persistenceController.container.viewContext)
                .onContinueUserActivity(CSSearchableItemActionType) { _ in
                    print("")
                }
        }
    }
}
.onContinueUserActivity(CSSearchableItemActionType) { _ in
    print("")
}
never gets triggered. So what am I missing that they don't explain in the blog post or videos?
Hi! I have noticed a few glitches, as well as some overall unfortunate cons, with Assistive Access mode.
Alarms, timers, the stopwatch, etc. do not sound or alert. However, I have an infant monitor app whose sound alerts do come through, so I know it is possible. Do I need to download a separate alarm app for alerts to work?
Cannot make FaceTime calls with favorite contacts.
Find My iPhone cannot jump to the maps app.
Camera cannot zoom in or out.
Photos cannot be deleted, edited, or shared in a shared album in the photos app.
Photos/videos cannot be sent in messages.
Spotify cannot be accessed from the lock screen.
Apps do not stay open if you lock the phone screen or leave it untouched for too long (it auto-locks).
There is no flashlight option. I downloaded an app to get this feature, but if the screen isn't touched it locks, which shuts off the flashlight in that app until I unlock the phone again.
Hello,
I'm currently unable to access App Store Connect. When I try to open https://appstoreconnect.apple.com, I receive the following error message:
“appstoreconnect.apple.com is currently unable to handle this request.”
I’ve tried the following steps, but the issue persists:
Cleared browser cache and cookies
Tried different browsers (Safari, Chrome)
Attempted from multiple devices and networks
Is this a known issue or is there any workaround available?
Would appreciate any help or update on the current status.
Thank you,
I am building a language learning app for an unlisted primary language. Any suggestions or heads-ups? My plan is to select English and go with it.
It's unfortunate that I have to list a language learning app incorrectly, and a tag for that language probably does not exist anywhere in the Apple system.
I have just purchased an Apple Developer account from Bangladesh. It sent the login verification codes perfectly the first 2-3 times, but after that, no matter what I do, it doesn't send the verification code and I am stuck. I now cannot log in, and this is extremely frustrating.
Since UITextView does not support zooming on its own, I add the UITextView as a subview of a UIScrollView and use the scroll view's zoom function.
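Here is roughly how I set it up (simplified; names are hypothetical):
import UIKit

class ZoomableTextViewController: UIViewController, UIScrollViewDelegate {
    private let scrollView = UIScrollView()
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()

        scrollView.frame = view.bounds
        scrollView.delegate = self
        scrollView.minimumZoomScale = 1.0
        scrollView.maximumZoomScale = 4.0
        view.addSubview(scrollView)

        textView.frame = scrollView.bounds
        textView.isEditable = false
        // The attributed text contains a link followed by plain text, e.g.
        // "https://appstoreconnect.apple.com/login\nApple Developer Login".
        scrollView.addSubview(textView)
    }

    // Let the scroll view zoom the embedded text view.
    func viewForZooming(in scrollView: UIScrollView) -> UIView? {
        return textView
    }
}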
However, when the text contains a link like the one below, the text that follows the link goes missing.
Ex) https://appstoreconnect.apple.com/login\nApple Developer Login
-> The text “Apple Developer Login” does not appear.
If anyone has experienced the same problem as me or knows a solution, please leave a comment.
Note)
This works normally on iOS 16, but the text after the link disappears on iOS 18.
The text is not visible, but you can copy and paste it to retrieve the missing text.
When VoiceOver reads decimal numbers with six or more digits after the decimal, it stops announcing the decimal separator and also adds pauses between each digit.
Text("0.12345") // VoiceOver: "zero **point** one two three four five"
Text("0.123456") // VoiceOver: "zero one, two, three, four, five, six"
How can I force VoiceOver to announce the decimal separator ("point") and not insert pauses regardless of the number of decimal digits?
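The only workaround I have found so far is to build the spoken string myself, which I would rather avoid (a minimal sketch; DecimalText is hypothetical):
import SwiftUI

struct DecimalText: View {
    let value = "0.123456"

    var body: some View {
        Text(value)
            // Produces "0 point 1 2 3 4 5 6", which VoiceOver reads as
            // "zero point one two three four five six".
            .accessibilityLabel(value.map { $0 == "." ? "point" : String($0) }.joined(separator: " "))
    }
}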
"Double-tap three fingers and drag to change zoom" should suppress "Three Finger to Drag". Currently these gestures are triggered simultaneously, for no real reason. I have seen different behaviors in different environments, but none of them is the desired one.
Current and desired behavior:
This seems like a bug, so I filed a Feedback.
Say I have a UI element that moves on the screen. Is it possible to update its accessibility frame as it moves while VoiceOver is focused on it? From my tests, VoiceOver ignores UIAccessibilityLayoutChangedNotification if it's sent repeatedly in a short period of time on iOS, while sending NSAccessibilityLayoutChangedNotification on macOS triggers VoiceOver to reannounce the focused element repeatedly.
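For context, here is a simplified sketch of what I mean on iOS (the view and method names are hypothetical):
import UIKit

class MovingBadgeView: UIView {
    // Called by my layout code whenever the view is moved; hypothetical method.
    func didMove(to newFrame: CGRect) {
        frame = newFrame
        // Keep the accessibility frame in sync (it must be in screen coordinates).
        accessibilityFrame = UIAccessibility.convertToScreenCoordinates(bounds, in: self)
        // Posting this repeatedly in a short period seems to be ignored by VoiceOver on iOS.
        UIAccessibility.post(notification: .layoutChanged, argument: self)
    }
}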
I have a TabView with a sample tabItem as follows:
.tabItem {
    Label("Import", systemImage: "doc.on.doc")
        .accessibilityLabel("Import Text")
}
But turning on the accessibility setting for larger display sizes does not seem to have any effect, nor do dynamic font sizes:
.tabItem {
    Label("Import", systemImage: "doc.on.doc")
        .font(.largeTitle)
        .accessibilityLabel("Import Text")
}
The tab items appear at a fixed size. The tab contents scale well, so the mismatch does not look pleasant at all.
Is this a known bug in SwiftUI?
I have lost access to my account. I found out via an email stating that my account was closed because I had not agreed to new terms and conditions, which I had no idea existed. I have been trying desperately to regain access, and nobody has contacted me after the multiple emails I've sent asking for help.
Hi everyone,
I enrolled in the Apple Developer Program on the evening of December 26, 2025, and the membership fee has already been successfully charged to my bank account. However, my account is still showing a “Pending” status with the message “Subscribe your membership.”
At this point, some time has passed and I haven’t received a confirmation email or any follow-up requesting additional information.
Should I allow the CIJSULAgent to find devices on the local network?
I have an iOS application that adopts UITextInput to enter text. I have also overridden pressesBegan() and pressesEnded() in order to do some extra keyboard management (auto-repeat, special actions for arrow keys, function keys, ...).
That works well for single-character languages (most Roman-alphabet languages, such as English, French, etc.).
For multi-character languages (Chinese, Japanese, Korean, Hindi), I can detect that the keyboard has been set to that language and switch back to the default behavior of pressesBegan():
if let keyboardLanguage = self.textInputMode?.primaryLanguage {
    if keyboardLanguage.hasPrefix("hi") || keyboardLanguage.hasPrefix("zh") || keyboardLanguage.hasPrefix("ja") || keyboardLanguage.hasPrefix("ko") {
        super.pressesBegan(presses, with: event)
    }
}
But that strategy fails with the Tiếng Việt Telex keyboard (for Vietnamese language input). The way that keyboard works (as you can see if you open a document in Pages) is that you type as you go: T-i-e-n-g V-i-e-t T-e-l-e-x and the system adds the relevant diacritics once you've finished a word, so typing "Tieng Viet Telex" gives you "Tiếng Việt Telex" on the screen.
Is there any documentation on the inner workings of this specific keyboard? What should I do (or not do) in order to make my application compatible with Tiếng Việt Telex?
I have a UIImageView as the background of a custom UIView subclass. The image itself does not contain any text. On top of this image view, I have added two UILabels.
To improve accessibility, I converted the entire view into a single accessibility element and set a proper accessibilityLabel. Additionally, I disabled accessibility for the UIImageView and the labels by setting isAccessibilityElement = false.
However, when VoiceOver Recognition's Text Recognition feature is enabled, VoiceOver still detects and announces the text inside the UILabels at the end, after reading my custom accessibility properties. This text should not be announced.
It seems that VoiceOver treats the UILabel content as part of the UIImageView. Additionally, when using the Explore Image rotor action, the entire subview is recognized as a single image.
Is this the expected behavior? If so, is there a way to disable VoiceOver’s text recognition for this view while keeping custom accessibility intact?
import UIKit

class BackgroundLabelView: UIView {
    private let backgroundImageView = UIImageView()
    private let backgroundImageView2 = UIImageView()
    private let titleLabel = UILabel()
    private let subtitleLabel = UILabel()

    override init(frame: CGRect) {
        super.init(frame: frame)
        setupView()
        configureAccessibility()
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        setupView()
        configureAccessibility()
    }

    // Hide the subviews from VoiceOver and expose the whole view as one button element.
    private func configureAccessibility() {
        backgroundImageView.isAccessibilityElement = false
        backgroundImageView2.isAccessibilityElement = false
        titleLabel.isAccessibilityElement = false
        subtitleLabel.isAccessibilityElement = false

        isAccessibilityElement = true
        accessibilityTraits = .button
    }

    func configure(backgroundImage: UIImage?, title: String, subtitle: String) {
        backgroundImageView.image = backgroundImage
        titleLabel.text = title
        subtitleLabel.text = subtitle
        accessibilityLabel = "Holiday Offer ," + title + "," + subtitle
    }

    private func setupView() {
        backgroundImageView2.contentMode = .scaleAspectFill
        backgroundImageView2.clipsToBounds = true
        backgroundImageView2.translatesAutoresizingMaskIntoConstraints = false
        backgroundImageView2.image = UIImage(resource: .bannerfestival)
        addSubview(backgroundImageView2)

        backgroundImageView.contentMode = .scaleAspectFit
        backgroundImageView.clipsToBounds = true
        backgroundImageView.translatesAutoresizingMaskIntoConstraints = false
        addSubview(backgroundImageView)

        titleLabel.font = UIFont.systemFont(ofSize: 18, weight: .bold)
        titleLabel.textColor = .white
        titleLabel.translatesAutoresizingMaskIntoConstraints = false
        titleLabel.numberOfLines = 0
        addSubview(titleLabel)

        subtitleLabel.font = UIFont.systemFont(ofSize: 14, weight: .regular)
        subtitleLabel.textColor = .white.withAlphaComponent(0.8)
        subtitleLabel.translatesAutoresizingMaskIntoConstraints = false
        subtitleLabel.numberOfLines = 0
        addSubview(subtitleLabel)

        NSLayoutConstraint.activate([
            backgroundImageView2.leadingAnchor.constraint(equalTo: leadingAnchor),
            backgroundImageView2.trailingAnchor.constraint(equalTo: trailingAnchor),
            backgroundImageView2.heightAnchor.constraint(equalToConstant: 200),

            backgroundImageView.centerYAnchor.constraint(equalTo: centerYAnchor),
            backgroundImageView.topAnchor.constraint(equalTo: topAnchor),
            backgroundImageView.leadingAnchor.constraint(greaterThanOrEqualTo: leadingAnchor),
            backgroundImageView.trailingAnchor.constraint(equalTo: trailingAnchor),
            backgroundImageView.bottomAnchor.constraint(equalTo: bottomAnchor),

            titleLabel.leadingAnchor.constraint(equalTo: leadingAnchor, constant: 16),
            titleLabel.trailingAnchor.constraint(lessThanOrEqualTo: centerXAnchor),
            titleLabel.bottomAnchor.constraint(equalTo: centerYAnchor, constant: -4),

            subtitleLabel.leadingAnchor.constraint(equalTo: leadingAnchor, constant: 16),
            subtitleLabel.trailingAnchor.constraint(lessThanOrEqualTo: centerXAnchor),
            subtitleLabel.topAnchor.constraint(equalTo: centerYAnchor, constant: 4)
        ])
    }

    override func layoutSubviews() {
        super.layoutSubviews()
        backgroundImageView.layer.cornerRadius = layer.cornerRadius
    }
}
Has the NEHotspotNetwork framework removed the fetchAvailableNetworksWithCompletionHandler API? Xcode version: 16.2, iOS SDK: 18.2.
I want to get the list of Wi-Fi networks with the NEHotspotNetwork framework, but I didn't find any useful information about it.
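For what it's worth, the only related call I can find in the current SDK is NEHotspotNetwork.fetchCurrent, which (as far as I can tell) only returns the network the device is currently joined to, not a list of available networks, and requires the appropriate entitlements or location permission:
import NetworkExtension

// Sketch of the closest API I can find; it does not list available networks.
NEHotspotNetwork.fetchCurrent { network in
    if let network {
        print("Currently joined SSID: \(network.ssid)")
    } else {
        print("No current Wi-Fi network information available.")
    }
}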
Hi everybody,
I'm trying to build a QR-Code Scanner and Generator App for IOS.
Whenever I try to implement the camera the app crashes with this comment:
This app has crashed because it attempted to access privacy-sensitive data without a usage description. The app's Info.plist must contain an NSCameraUsageDescription key with a string value explaining to the user how the app uses this data.
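If I understand the message correctly, the app's Info.plist needs an entry along these lines (the description string below is just a placeholder):
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to scan QR codes.</string>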
I tried reducing the app to the bare minimum of nothing but the camera, with the same result.
Any ideas?
Thank you and
best regards,
Horst Schippers