I’m currently focused on an element at the bottom of the screen. What is the proper way to quickly navigate to the top element? By default, there’s a four-finger single tap to move to the first element, but should I use the Rotor action instead to focus on the element I need?
For example, in the Contacts app while adding a new contact, if I enter a value in a field at the bottom, there’s no quick way to directly save the contact. I have to manually navigate all the way to the top to tap the Done button, which feels a bit inconvenient.
Is there a better way to handle this using VoiceOver?
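For developers who face the same pattern in their own apps, one option is VoiceOver's Magic Tap (two-finger double tap), which can trigger a screen's primary action without moving focus at all. A minimal sketch, assuming a UIKit screen whose primary action is a Done button (the class and method names are illustrative):

import UIKit

class NewContactViewController: UIViewController {
    // Hypothetical primary action for this screen.
    @objc func doneTapped() {
        // Save the contact here.
    }

    // VoiceOver calls this on a two-finger double tap (Magic Tap),
    // letting the user save without navigating back to the top element.
    override func accessibilityPerformMagicTap() -> Bool {
        doneTapped()
        return true
    }
}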
In VoiceOver, when using Group Navigation style, the cursor first focuses on the semantic group. To navigate inside the group, a two-finger swipe (left or right) can be used. This behavior works for default containers like the Navigation Bar, Tab Bar, and Tool Bar.
How can I achieve the same behavior for a custom view?
I tried setting accessibilityContainerType = .semanticGroup, but it only works for Mac Catalyst. Is there an equivalent approach for iOS?
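For reference, here is roughly what I tried on iOS (UIKit). It compiles, but it has no effect on VoiceOver group navigation there, while the same property works under Mac Catalyst:

import UIKit

let groupView = UIView()
// The container itself should not be an element; its children are.
groupView.isAccessibilityElement = false
// Works under Mac Catalyst; appears to be ignored by VoiceOver on iOS.
groupView.accessibilityContainerType = .semanticGroup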
VoiceOver reads out all visible content on the screen, which is essential for visually challenged users. However, this raises a privacy concern—what if a user accidentally focuses on sensitive information, like a bank account password, and it gets read aloud?
How can developers prevent VoiceOver from exposing confidential data while still maintaining accessibility? Are there best practices or recommended approaches to handle such scenarios effectively?
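One pattern I know of, sketched below with illustrative views: for text input, secure entry already makes VoiceOver announce typed characters as "bullet" rather than speaking them; for display-only views, the value can be hidden from assistive technologies entirely, at the cost of VoiceOver users losing access to it:

import UIKit

let passwordField = UITextField()
// VoiceOver speaks characters typed into a secure field as "bullet".
passwordField.isSecureTextEntry = true

let accountNumberLabel = UILabel()
accountNumberLabel.text = "1234 5678" // illustrative sensitive value
// Removes the value from the accessibility tree entirely; use sparingly,
// since VoiceOver users then cannot read it at all.
accountNumberLabel.isAccessibilityElement = false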
I have implemented a SwiftUI view containing a grid of TextField elements, where focus moves automatically to the next field upon input. This behavior works well on iOS 16 and 17, maintaining proper focus highlighting when keyboard full access is enabled.
However, on iOS 18 and above, the Full Keyboard Access focus behaves differently: it always lags behind the actual focus state, causing a mismatch between the visually highlighted field and the active text input. This leads to usability issues, especially for users navigating with an external keyboard.
Below is the SwiftUI code for reference:
import SwiftUI

struct AutoFocusGridTextFieldsView: View {
    private let fieldCount: Int
    private let columns: Int

    @State private var textFields: [String]
    @FocusState private var focusedField: Int?

    init(fieldCount: Int = 17, columns: Int = 5) {
        self.fieldCount = fieldCount
        self.columns = columns
        _textFields = State(initialValue: Array(repeating: "", count: fieldCount))
    }

    var body: some View {
        let rows = (fieldCount / columns) + (fieldCount % columns == 0 ? 0 : 1)

        VStack(spacing: 10) {
            ForEach(0..<rows, id: \.self) { row in
                HStack(spacing: 10) {
                    ForEach(0..<columns, id: \.self) { col in
                        let index = row * columns + col
                        if index < fieldCount {
                            TextField("", text: $textFields[index])
                                .frame(width: 40, height: 40)
                                .multilineTextAlignment(.center)
                                .textFieldStyle(RoundedBorderTextFieldStyle())
                                .focused($focusedField, equals: index)
                                .onChange(of: textFields[index]) { newValue in
                                    // Limit each field to a single character.
                                    if newValue.count > 1 {
                                        textFields[index] = String(newValue.prefix(1))
                                    }
                                    // Advance focus once the field is filled.
                                    if !textFields[index].isEmpty {
                                        moveToNextField(from: index)
                                    }
                                }
                        }
                    }
                }
            }
        }
        .padding()
        .onAppear {
            focusedField = 0
        }
    }

    private func moveToNextField(from index: Int) {
        if index + 1 < fieldCount {
            focusedField = index + 1
        }
    }
}

struct AutoFocusGridTextFieldsView_Previews: PreviewProvider {
    static var previews: some View {
        AutoFocusGridTextFieldsView(fieldCount: 10, columns: 5)
    }
}
Has anyone else encountered this issue with FocusState in iOS 18?
I believe this is a bug specifically tied to keyboard navigation, since I experienced a similar problem with the UIKit equivalent of this view.
Any insights or suggestions would be greatly appreciated!
Hello,
When I listen to a title in my app with VoiceOver, it makes a strange sound.
The title combines Korean characters, numbers, and Latin letters.
Does this combination cause VoiceOver to produce a strange sound?
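In the meantime, a mitigation I am considering, sketched here with an illustrative string: pinning the speech language for the label so VoiceOver does not switch voices mid-string:

import UIKit

let title = NSMutableAttributedString(string: "앨범 3A") // illustrative title
// Ask VoiceOver to speak the whole label with the Korean voice.
title.addAttribute(.accessibilitySpeechLanguage,
                   value: "ko-KR",
                   range: NSRange(location: 0, length: title.length))

let label = UILabel()
label.accessibilityAttributedLabel = title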
I would like to ask if Apple can fix this issue.
Thank you.
I’m trying to add the .header accessibility trait to a UISegmentedControl so that VoiceOver recognizes it accordingly. However, setting the trait using the following code doesn’t seem to have any effect:
segmentControl.accessibilityTraits = segmentControl.accessibilityTraits.union(.header)
Even after applying this, VoiceOver doesn’t announce it as a header. Is there any workaround or recommended approach to achieve this?
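One workaround I can think of (a sketch, not an official API): since VoiceOver focuses the individual segments rather than the control itself, expose a separate header element above the control:

import UIKit

let sectionTitle = UILabel()
sectionTitle.text = "View mode" // illustrative title
// VoiceOver announces this label as a header; the segments keep their
// own traits and remain individually focusable below it.
sectionTitle.accessibilityTraits.insert(.header)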
In SwiftUI, the date picker component fails colour contrast accessibility checks. The code below is used to create the date picker:
import SwiftUI

struct ContentView: View {
    @State private var date = Date()
    @State private var selectedDate: Date = .init()

    var body: some View {
        let min = Calendar.current.date(byAdding: .day, value: 14, to: Date()) ?? Date()
        let max = Calendar.current.date(byAdding: .year, value: 4, to: Date()) ?? Date()

        DatePicker(
            "Start Date",
            selection: $date,
            in: min ... max,
            displayedComponents: [.date]
        )
        .datePickerStyle(.graphical)
        .frame(alignment: .topLeading)
        .onAppear {
            selectedDate = Calendar.current.date(byAdding: .day, value: 14, to: Date()) ?? Date()
        }
    }
}

#Preview {
    ContentView()
}
Attaching a screenshot of the accessibility failure.
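One mitigation I have tried, with mixed results (a sketch; whether it passes depends entirely on the chosen colour): forcing a darker tint so the selected-day text sits on a darker fill:

DatePicker("Start Date", selection: $date, displayedComponents: [.date])
    .datePickerStyle(.graphical)
    // Hypothetical higher-contrast accent; verify the resulting ratio.
    .tint(Color(red: 0.0, green: 0.27, blue: 0.68))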
In some places of our app we make use of NSAccessibilityElement subclasses to vend some extra items to accessibility clients.
We need to know which item has the VoiceOver focus so we can keep track of it.
setAccessibilityFocused: does not get called when accessibility clients focus NSAccessibilityElements. This method is only called when accessibility clients focus view-based accessibility elements (i.e. when a NSView subclass gets focused).
At the same time we need to programmatically move VoiceOver focus to those items when something happens. Those accessibility elements inherit from NSObject so we can't make them first responder.
Is this the expected behavior? What are our options in terms of reacting to VoiceOver cursor moving around? What are our options in terms of programmatically moving the VoiceOver cursor to a different element?
Here's a sample project that demonstrates the first part of the issue: https://github.com/vendruscolo/apple-rdars/tree/master/DTS12368714%20-%20NSAccessibilityElement%20focus%20tracking
If you run the app, a window will show up. It contains a button and a red square. If you enable VoiceOver, you'll be able to move the cursor over the red square, and a message will be logged. You'll also notice there's an extra element after the red square. That element is available to VoiceOver; however, when it gets focused, no message is logged.
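For completeness, here is the shape of what I expected to work, condensed (both the override and the posted notification are attempts, not a documented recipe):

import AppKit

final class VendedElement: NSAccessibilityElement {
    // Never called for non-view elements in my testing, which is the
    // behavior described above.
    override func setAccessibilityFocused(_ focused: Bool) {
        super.setAccessibilityFocused(focused)
        NSLog("element focused: \(focused)")
    }
}

// Attempt at programmatically moving the VoiceOver cursor; unclear
// whether this is supported for NSAccessibilityElement instances:
func moveFocus(to element: NSAccessibilityElement) {
    NSAccessibility.post(element: element, notification: .focusedUIElementChanged)
}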
Hope it's okay to post here - I haven't gotten resolution anywhere else. Apple's iOS Live Captions is supposed to translate speech into written text, either on the phone (works like a charm!) or via microphone (think meeting in a conference room). The microphone mode doesn't work anywhere, anytime on a new iPhone 14 purchased in November 2024. Anyone out there want to fix this and help a lot of people who have trouble hearing? I'm part of an entire generation that didn't know we were supposed to protect our hearing at concerts and clubs and, worse, thought it was cool to snag a spot by the speakers...
I use AttributedString to create a string containing a link, and I set the AttributedString on a UILabel. How should I set up the accessibility features to make sure that:
I can move keyboard focus to the linked substring and open the link with a keyboard action
I can VoiceOver the whole string, and VoiceOver the linked substring to open the link
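One direction I am exploring (a sketch, assuming iOS 15+ for the AttributedString bridge; the string and URL are illustrative): replacing the UILabel with a non-editable UITextView, which gives links both keyboard focusability and VoiceOver link activation:

import UIKit

let textView = UITextView()
textView.isEditable = false
textView.isScrollEnabled = false
textView.isSelectable = true // links are only interactive when selectable

var text = AttributedString("Read our terms for details.")
if let range = text.range(of: "terms") {
    text[range].link = URL(string: "https://example.com/terms")
}
textView.attributedText = NSAttributedString(text)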
Thanks a lot.
I watched the videos and blog posts and downloaded their projects, and there Core Spotlight works as expected.
I copied the code into an empty project and did exactly what they did, but it still isn't working.
OS: macOS and iOS.
On the Core Data object I set an attribute to be indexed for Spotlight, and on the object itself I put the attribute name in "Display Name" for Spotlight.
import CoreData
import CoreSpotlight

final class PersistenceController {
    static let shared = PersistenceController()

    var spotlightDelegate: NSCoreDataCoreSpotlightDelegate?

    @MainActor
    static let preview: PersistenceController = {
        let result = PersistenceController(inMemory: true)
        let viewContext = result.container.viewContext
        for _ in 0..<10 {
            let newItem = Item(context: viewContext)
            newItem.timestamp = Date()
        }
        do {
            try viewContext.save()
        } catch {
            let nsError = error as NSError
            fatalError("Unresolved error \(nsError), \(nsError.userInfo)")
        }
        return result
    }()

    let container: NSPersistentContainer

    init(inMemory: Bool = false) {
        container = NSPersistentContainer(name: "SpotLightSearchTest")
        if inMemory {
            container.persistentStoreDescriptions.first!.url = URL(fileURLWithPath: "/dev/null")
        }
        container.loadPersistentStores(completionHandler: { [weak self] (storeDescription, error) in
            if let error = error as NSError? {
                fatalError("Unresolved error \(error), \(error.userInfo)")
            }
            if let description = self?.container.persistentStoreDescriptions.first {
                description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
                description.type = NSSQLiteStoreType
                if let coordinator = self?.container.persistentStoreCoordinator {
                    self?.spotlightDelegate = NSCoreDataCoreSpotlightDelegate(
                        forStoreWith: description,
                        coordinator: coordinator
                    )
                    self?.spotlightDelegate?.startSpotlightIndexing()
                }
            }
        })
        container.viewContext.automaticallyMergesChangesFromParent = true
    }
}
In my @main App:
import SwiftUI
import CoreSpotlight

@main
struct SpotLightSearchTestApp: App {
    let persistenceController = PersistenceController.shared

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environment(\.managedObjectContext, persistenceController.container.viewContext)
                .onContinueUserActivity(CSSearchableItemActionType) { _ in
                    print("")
                }
        }
    }
}
The handler

.onContinueUserActivity(CSSearchableItemActionType) { _ in
    print("")
}

never gets triggered. So what am I missing that they don't explain in the blog posts or videos?
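For comparison, one difference I noticed afterwards: the sample projects configure the store description before calling loadPersistentStores, while my code sets the options inside the completion handler. A condensed sketch of the samples' ordering (same "SpotLightSearchTest" model assumed; I have not yet confirmed this is the fix):

import CoreData
import CoreSpotlight

var spotlightDelegate: NSCoreDataCoreSpotlightDelegate?

func makeContainer() -> NSPersistentContainer {
    let container = NSPersistentContainer(name: "SpotLightSearchTest")
    if let description = container.persistentStoreDescriptions.first {
        // Set options *before* loading; options applied in the completion
        // handler arrive after the store is already loaded.
        description.setOption(true as NSNumber, forKey: NSPersistentHistoryTrackingKey)
    }
    container.loadPersistentStores { _, error in
        if let error { fatalError("Unresolved error \(error)") }
    }
    spotlightDelegate = NSCoreDataCoreSpotlightDelegate(
        forStoreWith: container.persistentStoreDescriptions.first!,
        coordinator: container.persistentStoreCoordinator
    )
    spotlightDelegate?.startSpotlightIndexing()
    return container
}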
In our application we use a UITableView for data population, and each cell contains a button. When Full Keyboard Access is enabled, only the cell receives focus, not the button. We need the cell and the button to receive focus separately; a property I plan to try is sketched below.
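A sketch of that attempt (I have not verified that it fixes this): hinting to Full Keyboard Access that the button is separately interactive:

import UIKit

let cellButton = UIButton(type: .system)
cellButton.setTitle("More", for: .normal) // illustrative
// Hints to assistive technologies (Full Keyboard Access, Switch Control)
// that this subview responds to user interaction on its own.
cellButton.accessibilityRespondsToUserInteraction = true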
In our application we use a search bar inside a popover view, with Full Keyboard Access enabled and an external keyboard. When focus is on the search bar and the Tab key is pressed, the search bar is dismissed, and focus then needs to shift to the next UI element.
In our application we use OTP login. With Full Keyboard Access enabled, when entering the OTP, on iOS 17 focus moves to the next text field as expected, but on iOS 18 focus stays on the first OTP field and does not move to the next text field.
In our application we use UIAlertController, presented as a popover. When Full Keyboard Access is enabled and we try to dismiss the alert with the Esc key on an external keyboard, nothing happens. We need the UIAlertController to dismiss on an Esc key press from the external keyboard.
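What I would try first (a sketch; unverified on iOS 18): giving the alert a .cancel action, since UIKit maps the Escape key on a hardware keyboard to an alert's cancel-style action:

import UIKit

final class DemoViewController: UIViewController {
    func showAlert() {
        let alert = UIAlertController(title: "Session expired",
                                      message: nil,
                                      preferredStyle: .alert)
        // Esc should trigger the cancel-style action and dismiss the alert.
        alert.addAction(UIAlertAction(title: "OK", style: .cancel))
        present(alert, animated: true)
    }
}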
This issue keeps coming back again and again. I restarted my laptop and I have free storage; I don't know why this issue keeps occurring!
Hi! I have noticed a few glitches, as well as some overall unfortunate cons, with Assistive Access mode.
Alarms, timers, stopwatch, etc. do not sound or alert. However, I have an infant monitor app and I do get that sound alert, so I know it is possible. Do I need to download a separate alarm app for it to work?
Cannot make FaceTime calls with favorite contacts.
Find My iPhone cannot jump to the maps app.
Camera cannot zoom in or out.
Photos cannot be deleted, edited, or shared in a shared album in the photos app.
Photos/videos cannot be sent in messages.
Spotify cannot be accessed from the lock screen.
Apps do not stay open if you lock the phone screen or leave it on too long without touching the screen (auto locks).
There is no flashlight option. I downloaded an app to add this feature, but without being touched the screen auto-locks, which shuts off the flashlight feature in the app until I unlock the phone again.
Hello,
the AVSpeechSynthesisVoice class has an audioFileSettings attribute:

let utterance = AVSpeechUtterance(string: text)
utterance.voice = AVSpeechSynthesisVoice(identifier: voiceSelected!)
print("- voice \(utterance.voice!.audioFileSettings)")
// ["AVLinearPCMIsBigEndianKey": 0, "AVLinearPCMIsFloatKey": 1, "AVLinearPCMIsNonInterleaved": 1, "AVNumberOfChannelsKey": 1, "AVSampleRateKey": 22050, "AVFormatIDKey": 1819304813, "AVLinearPCMBitDepthKey": 32]
This is declared in:

AVSpeechSynthesisVoice {
    ...
    @available(iOS 13.0, *)
    open var audioFileSettings: [String : Any] { get }

    @available(iOS 17.0, *)
    open var voiceTraits: AVSpeechSynthesisVoice.Traits { get }
}
How can we specify the audioFileSettings attributes for an AVSpeechSynthesisProviderVoice? In AVSpeechSynthesisProviderVoice there is no such field:
AVSpeechSynthesisProviderVoice {
    open var name: String { get }
    open var identifier: String { get }
    open var primaryLanguages: [String] { get }
    open var supportedLanguages: [String] { get }
    open var voiceSize: Int64
    open var version: String
    open var gender: AVSpeechSynthesisVoiceGender
    open var age: Int
}
Regards
My team is designing an app for retail associates that need to share managed iPads. We keep the app in Guided Access mode on our login app until an auth token is obtained; then the iPad is opened for general use. Upon sign-out we need to re-enter Guided Access mode, and we can do this easily via manual sign-out. But with idle sign-out, i.e. after 60 minutes of inactivity, we need to be able to make a call from the background (even in a locked state), sign out the user, clear the PIN code, and enter Single App Mode before restarting, so that once the device restarts, the app is in a locked state again until the next user provides credentials that can obtain a new auth token.
We are struggling to see if this is even possible. Our bosses will be displeased if we tell them it isn't. So anybody with any tips would be very appreciated.
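The only API I know of in this area is the guided-access session request, sketched below; whether it can be invoked from the background or a locked state is exactly what we cannot confirm (it also requires a supervised device whose MDM profile allowlists the app):

import UIKit

func lockIntoApp() {
    // Requests autonomous Single App Mode; only honored on supervised,
    // MDM-managed devices.
    UIAccessibility.requestGuidedAccessSession(enabled: true) { succeeded in
        print("Single App Mode request succeeded:", succeeded)
    }
}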
SwiftUI provides the accessibilityCustomContent(_:_:) modifier to add additional accessibility information for an element. However, I couldn’t find a similar approach in UIKit.
Is there a way to achieve this in UIKit?
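One counterpart I believe exists (iOS 14+), sketched here with a hypothetical cell: the AXCustomContentProvider protocol from the Accessibility framework:

import UIKit
import Accessibility

final class PhotoCell: UITableViewCell, AXCustomContentProvider {
    // VoiceOver reads these on demand via "more content", roughly like
    // SwiftUI's accessibilityCustomContent(_:_:).
    var accessibilityCustomContent: [AXCustomContent]! {
        get { [AXCustomContent(label: "Orientation", value: "Landscape")] }
        set { } // required by the protocol; unused in this sketch
    }
}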