Explore best practices for creating inclusive apps that cater to users with diverse abilities

Posts under General subtopic

Post · Replies · Boosts · Views · Activity

Apple Vision Pro - Homonymous Hemianopia
A stroke can leave individuals with vision impairments, specifically homonymous hemianopia, which means the person has lost sight in (for example) the left half of both eyes. I'm interested in understanding whether it would be possible to help individuals with this vision impairment through an accessibility configuration on the Apple Vision Pro that would first determine an individual's field of view (possibly by showing a field of dots across the entire "screen" and having the individual look at each dot and click). The results of this field-of-view test would then determine how the screen is presented to the user from that point on. My mom (82 years old) had a stroke recently and was diagnosed with homonymous hemianopia. She lived on her iPhone and would love to get back the ability to text message, use Facebook, and order items from Amazon. Please advise whether you believe the Apple Vision Pro, with the suggested development, would be capable of helping in this area, or share any other thoughts.
1 reply · 0 boosts · 492 views · Jan ’25
I can’t create an iTunes Connect account.
I learned that I need to create an iTunes Connect account to publish my book translations on Apple Books, following the instructions on the Apple Support page. I was then directed to the iTunes Connect website. Despite trying multiple Apple accounts with different credit cards on my Mac, iPad, and iPhone, I kept getting the error “This Apple Account does not have a valid credit card on file.” In the end, I began to wonder if iTunes Connect is unavailable in Turkey. What do I need to do to publish content on Apple Books from Turkey? Do I need to obtain a developer account, or is this service not available in Turkey? The Apple customer service representative I contacted in Turkey said they didn’t have information on the matter and directed me here.
1 reply · 0 boosts · 467 views · Mar ’25
Proposal: Using ARKit Body Tracking & LiDAR for Sign Language Education (Real-time Feedback)
Hi everyone, I've been analyzing the current state of Sign Language accessibility tools, and I noticed a significant gap in learning tools: we lack real-time feedback for students (e.g., "Is my hand position correct?"). Most current solutions rely on 2D video processing, which struggles with depth perception and occlusion (hand-over-hand or hand-over-face gestures), both of which are critical in Sign Language grammar. I'd like to propose/discuss an architecture leveraging the LiDAR + Neural Engine capabilities found in current iPhones to solve this.

The Concept: Skeleton-based Normalization
Instead of training ML models on raw video frames (which introduces noise from lighting, skin tone, and clothing), we could use ARKit's Body Tracking to abstract the input.
1. Capture: Use ARKit/LiDAR to track the user's upper body and hand joints in 3D space.
2. Data Normalization: Extract only the vector coordinates (X, Y, Z of joints). This creates a "clean" dataset, effectively normalizing the user regardless of physical appearance.
3. Comparison: Feed these vectors into a Core ML model trained on "Reference Skeletons" (recorded by native signers).
4. Feedback Loop: The app calculates the geometric distance between the user's pose and the reference pose to provide specific corrections (e.g., "Raise your elbow 10 degrees").

Why this approach?
- Solves Occlusion: LiDAR handles depth much better than standard RGB cameras when hands cross the body.
- Privacy: We are processing coordinates, not video streams.
- Efficiency: Comparing vector sequences is computationally cheaper than video analysis, preserving battery life.

Has anyone experimented with using ARKit Body Anchors specifically for comparing complex gesture sequences against a stored "correct" database? I believe this "Skeleton First" approach is the key to scalable Sign Language education apps. Looking forward to hearing your thoughts.
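A minimal sketch of the comparison step described above, assuming a session running ARBodyTrackingConfiguration; the joint subset and the class name are illustrative, not an established API:

import ARKit
import simd

// Compares a tracked body pose against a pre-recorded reference pose
// by averaging the Euclidean distance between selected joints.
final class PoseComparator {
    // Illustrative subset; a production app would track many more joints.
    private let joints: [ARSkeleton.JointName] = [
        .head, .leftShoulder, .rightShoulder, .leftHand, .rightHand
    ]

    // Extracts joint positions in the body anchor's coordinate space.
    func positions(from anchor: ARBodyAnchor) -> [simd_float3] {
        joints.compactMap { name in
            guard let transform = anchor.skeleton.modelTransform(for: name) else { return nil }
            let t = transform.columns.3
            return simd_float3(t.x, t.y, t.z)
        }
    }

    // Mean joint distance in meters; smaller means a closer match.
    func meanDistance(user: [simd_float3], reference: [simd_float3]) -> Float {
        guard user.count == reference.count, !user.isEmpty else { return .infinity }
        let total = zip(user, reference).reduce(Float(0)) { $0 + simd_distance($1.0, $1.1) }
        return total / Float(user.count)
    }
}

In a live session you would call positions(from:) for each ARBodyAnchor delivered to session(_:didUpdate:) and compare the result against pre-recorded reference frames from native signers.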
1 reply · 0 boosts · 557 views · Dec ’25
iMessage and FaceTime error
Yesterday I installed iOS 26 on my iPhone as a beta tester. At first there was no problem, but during the afternoon I noticed that neither FaceTime nor iMessage worked... I tried going through the settings as described by Apple Support, but my phone number would not activate. Sometimes I was even asked to activate iCloud. I always get a REG-RESP message. Does anyone have any ideas about what the problem could be?
1 reply · 1 boost · 150 views · Jun ’25
Size of stylus mesh tip
Hello community, we're designing an app that can optionally be controlled by a stylus with a mesh tip. In this case, the mesh tip we're using is 5 mm in diameter. Contact detection seems unstable at this size, although it works better with a larger diameter. Is it possible to access a setting in iOS that defines the minimum contact area needed to register a touch on the screen? That would enable us to use this 5 mm stylus. Best regards, Edwin
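As far as I know, iOS exposes no public setting for the minimum contact area, but UIKit does report its estimate of each touch's contact size, which can at least help quantify how the 5 mm tip registers compared to a finger. A small diagnostic sketch (the view subclass is illustrative):

import UIKit

final class TouchProbeView: UIView {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            // majorRadius is the system's estimated contact radius in points;
            // majorRadiusTolerance says how far off that estimate may be.
            print("contact radius: \(touch.majorRadius) ± \(touch.majorRadiusTolerance) pt")
        }
    }
}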
1 reply · 0 boosts · 382 views · Feb ’25
Verification error: unable to get local issuer certificate
C:\Users\xjc>openssl s_client -connect gateway.push.apple.com:2195 -showcerts
Connecting to 17.188.183.32
CONNECTED(000000AC)
depth=1 C=US, O=Entrust, Inc., OU=See www/legal-terms, OU=(c) 2012 Entrust, Inc. - for authorized use only, CN=Entrust Certification Authority - L1K
verify error:num=20:unable to get local issuer certificate
verify return:1
depth=0 C=US, ST=California, L=Cupertino, O=Apple Inc., CN=gateway.push.apple.com
verify return:1
B0640000:error:0A000410:SSL routines:ssl3_read_bytes:ssl/tls alert handshake failure:ssl\record\rec_layer_s3.c:908:SSL alert number 40
Certificate chain
 0 s:C=US, ST=California, L=Cupertino, O=Apple Inc., CN=gateway.push.apple.com
   i:C=US, O=Entrust, Inc., OU=See www/legal-terms, OU=(c) 2012 Entrust, Inc. - for authorized use only, CN=Entrust Certification Authority - L1K
   a:PKEY: rsaEncryption, 2048 (bit); sigalg: RSA-SHA256
   v:NotBefore: Aug 16 21:34:09 2024 GMT; NotAfter: Aug 15 21:34:07 2025 GMT
-----BEGIN CERTIFICATE-----
MIIGqDCCBZCgAwIBAgIQCUjuxVwL1mhSlrjSSk/+BzANBgkqhkiG9w0BAQsFADCB
WnKd+td/wZ6Ej6EB
mDF8JCSKz/ck+NnLfGM0jFdcTCl8dKuqM9XetP4ls1sVyUuLM7sJiQvMVDzluZ22
LA9EMc5ZcbdV96ZpKS3ETk5n7355fyVX+jZ24ZvfhtdyPvdUGuHzcrK/YfB0AsjY
hIhXgkxMfqJDjj7Af1CDPSAv9cylGI5b9v5QX93pM8uGxSRZTGS5m4qJG0Jj4UpV
QlzppFg+qE41yDrdy4rLxROW4bp/HPvEjo1YoAle3K208UMffVPBqGfZqbZ01+hP
gHCeamBb6QlV2Zq6q/VEKUO6p6oFQnI0phQiAQ==
-----END CERTIFICATE-----
 1 s:C=US, O=Entrust, Inc., OU=See www/legal-terms, OU=(c) 2012 Entrust, Inc. - for authorized use only, CN=Entrust Certification Authority - L1K
   i:C=US, O=Entrust, Inc., OU=See www/legal-terms, OU=(c) 2009 Entrust, Inc. - for authorized use only, CN=Entrust Root Certification Authority - G2
   a:PKEY: rsaEncryption, 2048 (bit); sigalg: RSA-SHA256
   v:NotBefore: Oct 5 19:13:56 2015 GMT; NotAfter: Dec 5 19:43:56 2030 GMT
-----BEGIN CERTIFICATE-----
MIIFDjCCA/agAwIBAgIMDulMwwAAAABR03eFMA0GCSqGSIb3DQEBCwUAMIG+MQsw
CQYDVQQGEwJVUzEWMBQGA1UEChMNRW50cnVzdCwgSW5jLjEoMCYGA1UECxMfU2Vl
IHd3dy5lbnRydXN0Lm5ldC9sZWdhbC10ZXJtczE5MDcGA1UECxMwKGMpIDIwMDkg
RW50cnVzdCwgSW5jLiAtIGZvciBhdXRob3JpemVkIHVzZSBvbmx5MTIwMAYDVQQD
EylFbnRydXN0IFJvb3QgQ2VydGlmaWNhdGlvbiBBdXRob3JpdHkgLSBHMjAeFw0x
NTEwMDUxOTEzNTZaFw0zMDEyMDUxOTQzNTZaMIG6MQswCQYDVQQGEwJVUzEWMBQG
A1UEChMNRW50cnVzdCwgSW5jLjEoMCYGA1UECxMfU2VlIHd3dy5lbnRydXN0Lm5l
dC9sZWdhbC10ZXJtczE5MDcGA1UECxMwKGMpIDIwMTIgRW50cnVzdCwgSW5jLiAt
IGZvciBhdXRob3JpemVkIHVzZSBvbmx5MS4wLAYDVQQDEyVFbnRydXN0IENlcnRp
ZmljYXRpb24gQXV0aG9yaXR5IC0gTDFLMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8A
MIIBCgKCAQEA2j+W0E25L0Tn2zlem1DuXKVh2kFnUwmqAJqOV38pa9vH4SEkqjrQ
jUcj0u1yFvCRIdJdt7hLqIOPt5EyaM/OJZMssn2XyP7BtBe6CZ4DkJN7fEmDImiK
m95HwzGYei59QAvS7z7Tsoyqj0ip/wDoKVgG97aTWpRzJiatWA7lQrjV6nN5ZGhT
JbiEz5R6rgZFDKNrTdDGvuoYpDbwkrK6HIiPOlJ/915tgxyd8B/lw9bdpXiSPbBt
LOrJz5RBGXFEaLpHPATpXbo+8DX3Fbae8i4VHj9HyMg4p3NFXU2wO7GOFyk36t0F
ASK7lDYqjVs1/lMZLwhGwSqzGmIdTivZGwIDAQABo4IBDDCCAQgwDgYDVR0PAQH/
BAQDAgEGMBIGA1UdEwEB/wQIMAYBAf8CAQAwMwYIKwYBBQUHAQEEJzAlMCMGCCsG
AQUFBzABhhdodHRwOi8vb2NzcC5lbnRydXN0Lm5ldDAwBgNVHR8EKTAnMCWgI6Ah
hh9odHRwOi8vY3JsLmVudHJ1c3QubmV0L2cyY2EuY3JsMDsGA1UdIAQ0MDIwMAYE
VR0gADAoMCYGCCsGAQUFBwIBFhpodHRwOi8vd3d3LmVudHJ1c3QubmV0L3JwYTAd
BgNVHQ4EFgQUgqJwdN28Uz/Pe9T3zX+nYMYKTL8wHwYDVR0jBBgwFoAUanImetAe
733nO2lR1GyNn5ASZqswDQYJKoZIhvcNAQELBQADggEBADnVjpiDYcgsY9NwHRkw
y/YJrMxp1cncN0HyMg/vdMNY9ngnCTQIlZIv19+4o/0OgemknNM/TWgrFTEKFcxS
BJPok1DD2bHi4Wi3Ogl08TRYCj93mEC45mj/XeTIRsXsgdfJghhcg85x2Ly/rJkC
k9uUmITSnKa1/ly78EqvIazCP0kkZ9Yujs+szGQVGHLlbHfTUqi53Y2sAEo1GdRv
c6N172tkw+CNgxKhiucOhk3YtCAbvmqljEtoZuMrx1gL+1YQ1JH7HdMxWBCMRON1
exCdtTix9qrKgWRs6PLigVWXUX/hwidQosk8WwBD9lu51aX8/wdQQGcHsFXwt35u
Lcw=
-----END CERTIFICATE-----
Server certificate
subject=C=US, ST=California, L=Cupertino, O=Apple Inc., CN=gateway.push.apple.com
issuer=C=US, O=Entrust, Inc., OU=See www/legal-terms, OU=(c) 2012 Entrust, Inc. - for authorized use only, CN=Entrust Certification Authority - L1K
Acceptable client certificate CA names
C=US, O=Apple Inc., OU=Apple Certification Authority, CN=Apple Root CA
CN=Apple Worldwide Developer Relations Certification Authority, OU=G4, O=Apple Inc., C=US
CN=Apple Application Integration 2 Certification Authority, OU=Apple Certification Authority, O=Apple Inc., C=US
CN=Apple Corporate Authentication CA 1, OU=Certification Authority, O=Apple Inc., C=US
C=US, O=Apple Inc., OU=Apple Worldwide Developer Relations, CN=Apple Worldwide Developer Relations Certification Authority
CN=Apple Corporate Root CA, OU=Certification Authority, O=Apple Inc., C=US
C=US, O=Apple Inc., OU=Apple Certification Authority, CN=Apple Application Integration Certification Authority
C=US, ST=California, L=Cupertino, O=Apple Inc., CN=gateway.push.apple.com
Client Certificate Types: RSA sign, ECDSA sign
Requested Signature Algorithms: ECDSA+SHA256:RSA-PSS+SHA256:RSA+SHA256:ECDSA+SHA384:RSA-PSS+SHA384:RSA+SHA384:RSA-PSS+SHA512:RSA+SHA512:RSA+SHA1
Shared Requested Signature Algorithms: ECDSA+SHA256:RSA-PSS+SHA256:RSA+SHA256:ECDSA+SHA384:RSA-PSS+SHA384:RSA+SHA384:RSA-PSS+SHA512:RSA+SHA512
SSL handshake has read 4138 bytes and written 687 bytes
Verification error: unable to get local issuer certificate
New, SSLv3, Cipher is AES128-SHA
Protocol: TLSv1.2
Server public key is 2048 bit
Secure Renegotiation IS supported
Compression: NONE
Expansion: NONE
No ALPN negotiated
SSL-Session:
    Protocol  : TLSv1.2
    Cipher    : AES128-SHA
    Session-ID:
    Session-ID-ctx:
    Master-Key: D504C13BDBC59CDF3B883D1B626FA2B59000754DED57CD77A72F761A52AEED719DA06C100FBA1430BB9D8DECFC7C9307
    PSK identity: None
    PSK identity hint: None
    SRP username: None
    Start Time: 1741092949
    Timeout   : 7200 (sec)
    Verify return code: 20 (unable to get local issuer certificate)
    Extended master secret: yes
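Verify return code 20 means OpenSSL could not find the issuing root CA in its local trust store; it is not a statement about the server certificate itself. The separate "SSL alert number 40" handshake failure is sent by the server, typically because no client certificate was presented (the "Acceptable client certificate CA names" list shows the gateway requests one). To clear just the verification error, the root can be supplied explicitly; a sketch, where entrust_root_g2.pem stands in for the Entrust Root G2 certificate you would download:

openssl s_client -connect gateway.push.apple.com:2195 -showcerts -CAfile entrust_root_g2.pem

Note also that the legacy binary APNs interface on port 2195 was retired in March 2021; new integrations should target the HTTP/2 API at api.push.apple.com:443.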
1 reply · 0 boosts · 538 views · Mar ’25
Developer Program enrollment still pending after payment
Hi everyone, I enrolled in the Apple Developer Program on the evening of December 26, 2025, and the membership fee has already been successfully charged to my bank account. However, my account is still showing a “Pending” status with the message “Subscribe your membership.” At this point, some time has passed and I haven’t received a confirmation email or any follow-up requesting additional information.
1 reply · 0 boosts · 328 views · 2d
Unable to set Chinese dialect of AVSpeechSynthesisVoice in iOS 18
The AVSpeechSynthesizer on some iOS 18 devices has a bug: it will always read Chinese content, e.g. AVSpeechUtterance(string: "中文"), in the dialect specified under Settings > Accessibility > Spoken Content > Voices > Chinese > Spoken Language, instead of the dialect specified in AVSpeechUtterance.voice, such as AVSpeechSynthesisVoice(language: "zh-HK") for Cantonese or AVSpeechSynthesisVoice(language: "zh-TW") for Mandarin. Setting the Chinese dialect of AVSpeechSynthesisVoice via "zh-HK" or "zh-TW" worked on iOS 17 and below. My app has a feature that requires reading sentences in Mandarin followed by Cantonese, i.e., both dialects are needed every time, so setting the dialect in Spoken Language in Settings is not a workaround that makes my app function correctly on iOS 18. Further to the above, I've also discovered that if iOS 18 (in my case, 18.5 was tested) is freshly installed (not upgraded from iOS 17 or below, and no backup restored after the fresh installation), the bug does not happen. However, if the device was upgraded from iOS 17 or below, or a backup was restored (in my case, I freshly installed iOS 18.5 on a new iPhone and then restored a backup from another iPhone on iOS 16.2), the bug happens. This puzzles me because I need both Chinese dialects to be read aloud one after the other, but as reported by many users, on most iOS 18 devices (a fresh installation of the latest iOS without upgrading or restoring is uncommon nowadays), my app reads Cantonese twice or Mandarin twice (depending on Spoken Language in Settings). This iOS 18 bug makes my app unable to perform the expected behavior. Would Apple developers look into this and advise whether there is any workaround within app code to overcome this bug, or please fix it in an iOS 18 update. Thank you.
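For anyone trying to reproduce this, a minimal sketch of the pattern the post describes, speaking the same string in both dialects back to back:

import AVFoundation

let synthesizer = AVSpeechSynthesizer()

// Speak the same Chinese text in Cantonese, then Mandarin. On affected
// iOS 18 devices the reported bug makes both utterances use the dialect
// from Settings > Accessibility > Spoken Content instead.
for language in ["zh-HK", "zh-TW"] {
    let utterance = AVSpeechUtterance(string: "中文")
    utterance.voice = AVSpeechSynthesisVoice(language: language)
    synthesizer.speak(utterance)  // utterances are queued and spoken in order
}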
1 reply · 1 boost · 104 views · Jun ’25
Is there any way to force users to update?
Hi, I'm planning to build a macOS app and distribute it through the Mac App Store. My question is whether I should force an update when one is needed. The reason I want this feature is that I don't want users to keep using a previous version of the app. My plan is this: when the app needs an update, send the user to a special page that describes why the update is needed, with a button to download the new version; the download would happen automatically in the background, without visiting the App Store. I've searched several forums and asked GPT, but found no positive reply, so I'm finally posting to ask whether there is truly no way to do this. Thank you!
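For context: a Mac App Store app cannot download and install its own updates in the background; updates have to flow through the App Store. The common pattern is to detect that the installed version is outdated and show a blocking screen that links to the store listing. A sketch using the public iTunes Lookup endpoint (the version comparison is simplified, and the blocking UI is left out):

import Foundation

// Fetches the version currently on the App Store for this bundle ID and
// reports whether the running app is older.
func updateIsAvailable() async throws -> Bool {
    guard let bundleID = Bundle.main.bundleIdentifier,
          let url = URL(string: "https://itunes.apple.com/lookup?bundleId=\(bundleID)") else {
        return false
    }
    struct Lookup: Decodable {
        struct Result: Decodable { let version: String }
        let results: [Result]
    }
    let (data, _) = try await URLSession.shared.data(from: url)
    let storeVersion = try JSONDecoder().decode(Lookup.self, from: data).results.first?.version
    guard let storeVersion,
          let installed = Bundle.main.infoDictionary?["CFBundleShortVersionString"] as? String else {
        return false
    }
    // .numeric makes "1.10" compare as newer than "1.9".
    return installed.compare(storeVersion, options: .numeric) == .orderedAscending
}

If updateIsAvailable() returns true, the app can present a full-screen explanation with a button that opens the App Store page (for example via NSWorkspace.shared.open on the listing URL), but the actual download still happens in the App Store.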
1 reply · 0 boosts · 472 views · Mar ’25
SwiftUI PhotosPicker accessibility issue
iOS 18.3.1, iPhone 16 Pro. I pick photos from the user's photo library using a connected physical keyboard via: .photosPicker(isPresented: $viewModel.isImagePickerPresented, selection: $viewModel.selectedImageItem, matching: .images) When the picker appears, accessibility focus moves to the Dynamic Island instead of the Cancel button. There is no way to navigate the photos picker by keyboard without tapping the view to move focus to it manually. I noticed the same behavior in the Notes app.
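A self-contained repro of the setup described above, with the view model folded into local state for brevity:

import SwiftUI
import PhotosUI

struct PickerFocusRepro: View {
    @State private var isPresented = false
    @State private var selection: PhotosPickerItem?

    var body: some View {
        Button("Choose Photo") { isPresented = true }
            .photosPicker(isPresented: $isPresented,
                          selection: $selection,
                          matching: .images)
        // With Full Keyboard Access enabled, focus lands on the Dynamic
        // Island area instead of the picker's Cancel button.
    }
}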
1 reply · 0 boosts · 526 views · Mar ’25
AirPods Pro 3 HRV Data Access Through HealthKit?
Hey everyone I'm working on a health app that's heavily focused on HRV tracking and analysis, and I'm trying to figure out what's actually possible with AirPods Pro 3 from a developer standpoint. The hardware clearly has a much better heart rate sensor than the previous generation, but I'm hitting some walls when it comes to actually accessing the data I need. So here's the situation I'm dealing with: When I query HealthKit for HRV samples, I'm not seeing anything coming from AirPods Pro 3. The device is obviously capable of tracking heart rate continuously during workouts and listening sessions, and from what I've read about the hardware, it should theoretically be able to capture the inter-beat intervals needed for HRV calculation. But either that data isn't being processed on-device, or it's just not being made available through the standard HealthKit data types that third-party apps can access. What I'm really after is either direct HRV metrics (like SDNN, which Apple Watch already provides through HKQuantityTypeIdentifierHeartRateVariabilitySDNN) or even better, access to the raw R-R interval data. With R-R intervals, I could calculate RMSSD, pNN50, and other time-domain and frequency-domain HRV metrics that are super valuable for tracking recovery, autonomic nervous system balance, and stress levels. This would be especially useful since a lot of users wear AirPods during activities when they're not wearing their Apple Watch. Has anyone managed to find a way to pull this data from AirPods Pro 3? Are there any private frameworks or entitlements I should be looking into? Or is this just fundamentally not exposed to developers at the OS level right now? I've gone through the HealthKit documentation pretty thoroughly and haven't found anything that specifically addresses this, but I'm wondering if I'm missing something or if there are any known workarounds. I'm also curious if anyone has heard anything from Apple about future plans to expose this data. It seems like a missed opportunity given how capable the hardware is and how much value developers could provide with access to this physiological data. Would love to hear if anyone else is working on similar features or has insights into the technical limitations here.
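As a first step, it can help to confirm which sources are actually writing SDNN samples. A minimal query sketch, assuming HealthKit authorization for the HRV type has already been granted:

import HealthKit

let healthStore = HKHealthStore()

// Fetches recent HRV (SDNN) samples and logs which source produced each
// one, to check whether AirPods contribute at all.
func logHRVSources() {
    guard let hrvType = HKQuantityType.quantityType(forIdentifier: .heartRateVariabilitySDNN) else { return }
    let newestFirst = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: false)
    let query = HKSampleQuery(sampleType: hrvType, predicate: nil,
                              limit: 50, sortDescriptors: [newestFirst]) { _, samples, _ in
        for case let sample as HKQuantitySample in samples ?? [] {
            let ms = sample.quantity.doubleValue(for: .secondUnit(with: .milli))
            print("\(ms) ms SDNN from \(sample.sourceRevision.source.name)")
        }
    }
    healthStore.execute(query)
}

For raw beat-to-beat data, HealthKit does define heartbeat series (HKSeriesType.heartbeat(), read with HKHeartbeatSeriesQuery), but in practice those samples have come from Apple Watch; whether AirPods Pro 3 will ever write them is not something the documentation currently answers.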
1 reply · 0 boosts · 640 views · Oct ’25
Feature Idea: Autonomous, Motion-Powered Clock Display on iPhone.
Hey everyone, I've been thinking about a truly innovative way to enhance iPhone battery life and user convenience, drawing inspiration from kinetic energy harvesting. What if we could have a clock display on the main iPhone screen that's powered purely by user motion, and activates only when you look at it, without touching your main battery?

The Core Idea
Imagine this:
- Kinetic Energy Harvesting: Your iPhone would have a tiny, integrated kinetic energy generator that captures the energy from your everyday movements – walking, picking up the phone, putting it in your pocket.
- Independent Power Source: This harvested energy would be stored in a small, dedicated capacitor or micro-battery, completely separate from your iPhone's main battery.
- Accelerometer-Activated Display: Instead of relying on power-hungry facial recognition, the phone's accelerometer (a very low-power sensor) would detect specific "raise to wake" or "tap to look" gestures.
- On-Demand, Ultra-Low-Power Clock: Only when the accelerometer detects one of these gestures would the stored kinetic energy be used to illuminate just the pixels needed to display the time on the main OLED/AMOLED screen. The rest of the screen stays completely black (consuming no power on OLED).
- Automatic Shut-Off: As soon as the gesture ends or the phone is put down, the clock display turns off, conserving the limited harvested energy.

Why This Matters
This isn't just a cool gimmick; it offers significant benefits:
- True Battery Independence: Get the time at a glance, anytime, without touching your main battery or even the power button. This means more main battery life for apps, calls, and everything else.
- Ultimate Convenience: A "magical" interaction – just pick up your phone, and the time instantly appears. No taps, no button presses.
- Sustainable & Innovative: Showcases practical "energy harvesting" in a consumer device, pushing boundaries for self-sufficient tech.
- Extreme Energy Efficiency: By using a low-power accelerometer as the trigger and lighting only a few pixels on demand, the system is designed for minimal power draw, making kinetic power a viable source.

This concept combines existing low-power sensing (accelerometer), efficient display technology (OLED/AMOLED's true blacks), and cutting-edge energy harvesting, creating a genuinely innovative user experience.
1 reply · 1 boost · 117 views · Jun ’25
Making PhotoLibrary UIImagePickerController a11y compliant
I am invoking a UIImagePickerController of type UIImagePickerControllerSourceTypePhotoLibrary from my view controller. I want to shift the keyboard focus to the Cancel button, which is the first interactive element on the gallery picker. When a user has Full Keyboard Access turned on, they should be able to press Tab and interact with the gallery picker modal. How do I achieve this?
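As far as I know there is no public API that drives Full Keyboard Access focus into the system photo library UI, which runs out of process. The closest public tool is a screen-change notification after presentation; a hedged sketch (this moves assistive focus for VoiceOver and may not affect keyboard focus):

import UIKit

// Inside a UIViewController:
let picker = UIImagePickerController()
picker.sourceType = .photoLibrary
present(picker, animated: true) {
    // Announce a screen change and suggest where focus should land.
    UIAccessibility.post(notification: .screenChanged, argument: picker.view)
}

If this doesn't move keyboard focus on your build, filing feedback with Apple is probably the right path, since the picker's focus behavior is owned by the system.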
1 reply · 0 boosts · 153 views · May ’25
Handling Keyboard Hotkeys and Shortcuts across Multiple Languages
We have a requirement to manage the shortcuts and hotkeys in our application, and to have them be intuitive with full multilingual support. Our current understanding is that most universal shortcuts and hotkeys on macOS/iOS are expressed using English/Latin characters, and when a purely foreign-language physical or virtual keyboard is the input device, we are unclear how the user would invoke such a hotkey. Considering that some other-language keyboards have no Latin characters at all, managing shortcuts and hotkeys in these environments becomes a rather difficult task. Taking a very simple example, the shortcut for printing a page is Command/Control + 'P'. This is an issue on non-English keyboards like Arabic, where not only is there no letter P, there is also no equivalent phonetic character, since the language itself does not have one. Also, when we want a hotkey to be customizable by the user, how would the user express which key combination they want for a given action? Based on these conditions, in order to provide the most comprehensive and optimal experience for users in their own language, what does Apple recommend we do here for hotkey/shortcut support in non-Latin languages?
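One common approach on macOS, offered here as a sketch rather than an Apple recommendation, is to store custom hotkeys as a hardware key code plus modifiers instead of a character. The key code identifies the physical key, so the shortcut still works when the active layout (for example Arabic) has no letter 'P'; only the label shown to the user has to be rendered per layout:

import AppKit

// Records a user-chosen hotkey as a layout-independent key code plus
// modifier flags.
final class HotkeyRecorder {
    private var monitor: Any?
    var recorded: (keyCode: UInt16, modifiers: NSEvent.ModifierFlags)?

    func startRecording() {
        monitor = NSEvent.addLocalMonitorForEvents(matching: .keyDown) { [weak self] event in
            self?.recorded = (event.keyCode,
                              event.modifierFlags.intersection(.deviceIndependentFlagsOnly))
            self?.stopRecording()
            return nil  // swallow the keystroke while recording
        }
    }

    func stopRecording() {
        if let monitor { NSEvent.removeMonitor(monitor) }
        monitor = nil
    }
}

To display the recorded combination in the user's current layout, the key code can be translated back to a glyph (for example with UCKeyTranslate), which keeps the stored shortcut stable while the label follows the keyboard.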
1 reply · 0 boosts · 418 views · 2w