I am able to block apps using FamilyControls and the ManagedSettings shield. Unblocking is also simple: just assign nil to store.shield.applications. However, I want to unblock them even when my app is not open.
Use case: Let's say the app allows users to create a session where a particular app is blocked for a specific duration. Once the session starts, the app should remain blocked, and as soon as the session time ends, it should automatically be unblocked.
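From what I've read, I think the missing piece is scheduling the session with DeviceActivityCenter and clearing the shield from a DeviceActivityMonitor extension's intervalDidEnd callback, which should run even when the app itself isn't open. A minimal sketch of what I believe that looks like (the activity and class names are my own placeholders):
import Foundation
import DeviceActivity
import ManagedSettings

// In the main app: schedule a one-off monitoring interval for the blocking session.
// Note: I believe the interval has to be at least 15 minutes long.
func startSession(endingAt end: Date) throws {
    let schedule = DeviceActivitySchedule(
        intervalStart: Calendar.current.dateComponents([.hour, .minute], from: Date()),
        intervalEnd: Calendar.current.dateComponents([.hour, .minute], from: end),
        repeats: false
    )
    try DeviceActivityCenter().startMonitoring(DeviceActivityName("blockSession"), during: schedule)
}

// In the DeviceActivityMonitor extension: callbacks fire even when the main app isn't running.
class SessionMonitor: DeviceActivityMonitor {
    private let store = ManagedSettingsStore()

    override func intervalDidEnd(for activity: DeviceActivityName) {
        super.intervalDidEnd(for: activity)
        // The session is over: clear the shield without the main app being open.
        store.shield.applications = nil
        store.shield.applicationCategories = nil
    }
}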
Please help me with this. Thank you!
Hi Apple engineering team,
I’m trying to integrate the new Live Caller ID Lookup (PIR) on iOS using your pir-service-example code as well as a custom mock server in Vapor, but the extension never advances past the /issue/token-key-for-user-token step. I’ve tried both:
1. Official Example
Cloned https://github.com/apple/pir-service-example
Ran PIRService locally
Confirmed that:
GET /.well-known/private-token-issuer-directory → 200
GET /issue/token-key-for-user-token → 200 (DER bytes, correct SPKI)
No POST /issue ever fires
2. Mock Server (Vapor)
Implemented all five endpoints (/config, /.well-known/private-token-issuer-directory, /issue/token-key-for-user-token, /issue, /queries)
Verified with curl and openssl asn1parse that:
GET /.well-known/private-token-issuer-directory
Content-Type: application/private-token-issuer-directory
{ "issuer-request-uri":"https://…/issue", "token-keys":[…] }
GET /issue/token-key-for-user-token
Content-Type: application/octet-stream
<DER bytes>
Added Cache-Control: public, max-age=3600 on directory and SPKI
Stubbed POST /issue to always return { "token": "" }
Still no POST /issue request from the extension
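For reference, the directory route in my Vapor mock looks roughly like this (a sketch; the issuer URL and the token-keys payload are placeholders trimmed from my real config):
import Vapor

func routes(_ app: Application) throws {
    // Mock of the Privacy Pass issuer directory; Content-Type and Cache-Control set as described above.
    app.get(".well-known", "private-token-issuer-directory") { req -> Response in
        let body = """
        { "issuer-request-uri": "https://example.com/issue", "token-keys": [] }
        """
        var headers = HTTPHeaders()
        headers.replaceOrAdd(name: "Content-Type", value: "application/private-token-issuer-directory")
        headers.replaceOrAdd(name: "Cache-Control", value: "public, max-age=3600")
        return Response(status: .ok, headers: headers, body: .init(string: body))
    }
}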
Reproduction Steps
Install and enable a Live Lookup extension pointing to my server.
Trigger an incoming call on device.
Watch server logs—only see the two GETs, never /issue or /queries.
Expected Behavior
After fetching the SPKI DER, the framework should issue a POST /issue call (Privacy Pass flow) and then POST /queries.
Observed Behavior
Stuck in an infinite loop of:
GET /.well-known/private-token-issuer-directory
GET /issue/token-key-for-user-token
(repeat…)
No progression to the /issue or /queries endpoints.
What I’ve Tried
Verified JSON kebab-case and headers exactly match examples
Confirmed SPKI DER is valid via openssl asn1parse
Added Cache-Control headers
Tested on a real device, with a localhost URL and an ngrok public URL
Mocked a valid-looking token response
Could you advise what additional requirement or format detail I'm missing that prevents the extension from advancing past /issue/token-key-for-user-token?
These are the main files:
LiveLookupExtension.swift
routes.swift
service-config.json
Thanks in advance!
I have been using the hourly weather forecast API. For some reason the call sometimes fails with 400 Bad Request, but on retrying just a minute later it successfully returns data. The start and end times are 2 days apart, so I don't think it's an issue with the time frame.
The failed calls also don't return any reason, so I'm not sure what the exact failure is.
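For now I just retry the request after a short delay, which works around it; a rough sketch of that wrapper (the URL construction and token handling here are placeholders, not my exact code):
import Foundation

// Retry wrapper used as a workaround for the intermittent 400s.
func fetchHourlyForecast(_ url: URL, bearerToken: String, maxRetries: Int = 2) async throws -> Data {
    var request = URLRequest(url: url)
    request.setValue("Bearer \(bearerToken)", forHTTPHeaderField: "Authorization")

    var lastError: Error = URLError(.unknown)
    for attempt in 0...maxRetries {
        do {
            let (data, response) = try await URLSession.shared.data(for: request)
            if let http = response as? HTTPURLResponse, http.statusCode == 200 {
                return data
            }
            lastError = URLError(.badServerResponse)
        } catch {
            lastError = error
        }
        if attempt < maxRetries {
            // Back off before retrying; in my case the same call succeeds about a minute later.
            try await Task.sleep(nanoseconds: UInt64(attempt + 1) * 5_000_000_000)
        }
    }
    throw lastError
}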
Has anyone encountered this issue or know why this might be happening?
Thanks!
In the past I was always able to install every major macOS version on an external drive so that I can test my apps. But now I'm unable to install macOS Tahoe 26 on an external drive. Actually, as far as I'm aware, there are not even official links to macOS 26 installers, but only instructions on how to update to macOS 26 from an existing macOS installation. So I thought I'd install macOS 15 on a separate drive and then update to macOS 26, but whenever I run the macOS 15 installer, tell it to install on the external drive, and reboot after the setup process completes, my MacBook just boots into my main macOS partition as if nothing happened.
3 months ago I somehow managed to install macOS Tahoe beta 1 on an external drive; I don't remember how (but I don't think it was anything crazy). Booting into that beta 1 partition and trying to update doesn't work either, as my MacBook again boots into my main macOS partition. I already asked for help with the update problem one month ago here, but nobody replied.
Could someone at Apple please provide instructions on how one is supposed to install macOS 26 on an external drive (if possible before it becomes available to the public)? Are we supposed to buy a separate Mac for every macOS version that we want to test our apps on?
WeatherKit has again stopped providing next-hour rain data for the United Kingdom and Ireland.
I want to launch the mail application from the application I'm developing, but the Mail app made by Apple always launches. It doesn't work even if I change the default email application to Outlook.
When the Apple Mail app is deleted, Outlook launches. I also want to automatically attach a generated PDF file to the email, but that doesn't work either.
Please tell me how to code this.
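One approach I'm considering as a fallback is sharing the PDF through a share sheet, which at least lets the user hand it to Outlook, since mailto: links cannot carry attachments and MFMailComposeViewController is tied to Apple Mail. A sketch (pdfURL and viewController are placeholders for my own objects):
import UIKit

// Present a share sheet for the generated PDF so the user can pick Outlook (or any mail app).
func sharePDF(_ pdfURL: URL, from viewController: UIViewController) {
    let activityVC = UIActivityViewController(activityItems: [pdfURL], applicationActivities: nil)
    viewController.present(activityVC, animated: true)
}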
How to uninstall/delete Voice Control on macOS so that I can test my app for the case when the initial use of Voice Control causes it to be downloaded from Apple? Is there a folder in the macOS System or Library to delete to force a re-download of Voice Control?
My macOS app uses the older NSSpeechRecognizer to handle speech commands, but using NSSpeechRecognizer requires authorization via [SFSpeechRecognizer requestAuthorization...]. I do this, and on a macOS system it can trigger a download of Voice Control, the macOS feature. An alert appears with:
"A 390 MB download is required to use speech recognition features in MyApp. You may need to quit and open MyApp again after download completes."
I am still struggling to reconcile the screen time I monitor with what I show in a DeviceActivityReport. It's always off by a couple of percentage points, which results in a difference of a couple of minutes between the total screen time shown in my DeviceActivityReport and what DeviceActivityMonitor reports with a threshold set for all apps/websites/categories.
In the report, I am looping through all segments (there is only 1 segment, since I use the .daily segment interval for a given day), then looping through all categories and all apps within each category, and summing the totalActivityDuration of each app, as shown in the sketch below. Based on the available documentation, that should correlate with the DeviceActivityMonitor threshold, but it doesn't. Are there any differences in how these two places count screen time? Are there any apps or core iOS services that are excluded from DeviceActivityMonitor? Would appreciate any help at all, I'm losing my mind here.
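Here is roughly the summing code from my report scene, simplified, in case the mistake is in there:
import Foundation
import DeviceActivity

// Simplified from my report scene: sum totalActivityDuration for every app in every
// category of every segment handed to the extension.
func totalDuration(from results: DeviceActivityResults<DeviceActivityData>) async -> TimeInterval {
    var total: TimeInterval = 0
    for await data in results {
        for await segment in data.activitySegments {      // a single .daily segment per day
            for await category in segment.categories {
                for await app in category.applications {
                    total += app.totalActivityDuration
                }
            }
        }
    }
    return total
}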
My current suspicion is that Apple Developer documentation is counted twice, i.e. this website https://developer.apple.com/documentation/deviceactivity/deviceactivitymonitor shows up in usage as an app with a bundleId of apple.developer.wwdc-release, and the time spent there is counted twice: against this bundleId AND against Safari. I don't know why it's not counted as a web domain.
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for watchOS (part 1).
1. I'm really excited about the new design system on all platforms. Liquid Glass is super cool. What do developers need to keep in mind when building for watchOS 26?
To adopt the new design system, start by updating your app for watchOS 10. If you have done so, your app will be mostly ready for watchOS 26. For more information, see Design and build apps for watchOS 10.
You can then look into Liquid Glass specific APIs to fine tune your app. This topic is covered in Adopting Liquid Glass.
If you have SwiftUI views using any custom style, make sure they are still legible and fit with the new design system.
2. Something that really stood out to me were updates to the Smart Stack, with the system prioritizing Widgets when they're most relevant. Tell me more about these new opportunities for apps.
Workout apps that record workouts using HealthKit may be automatically suggested on the watch face and appear in the Smart Stack without adding a widget.
Relevant widgets are a great way to present information related to a date, location, point-of-interest type, sleep schedule, or fitness condition in the Smart Stack when it is relevant. Relevant widgets don't need to display an empty state view when they are not relevant; they are only shown in the Smart Stack when relevant.
The watchOS 26 Design ToolKit in the Apple Design Resources includes a set of templates that you can use to lay out your widgets.
3. Is the Wrist Flick gesture available to developers in the same way as Double Tap is?
The system uses Wrist Flick to dismiss notifications and incoming calls, silence timers and alarms, or return to the watch face. There is no separate API for the Wrist Flick gesture.
Apps that use XCUIAutomation to make sure their user interface behaves as intended can use XCUIDeviceHandGesture.flick to automate tests that verify the app responds appropriately to the Wrist Flick gesture. Similarly, XCUIDeviceHandGesture.doubleTap can be used to automate testing of the app with the Double Tap gesture. See XCUIDevice.perform(handGesture:).
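For illustration, driving those gestures from a UI test might look roughly like this (a sketch based on the API names above; check the XCUIAutomation documentation for the exact symbols):
import XCTest

final class GestureUITests: XCTestCase {
    func testHandGestures() throws {
        let app = XCUIApplication()
        app.launch()

        // Simulate the Wrist Flick gesture in the test environment.
        XCUIDevice.shared.perform(handGesture: .flick)

        // Double Tap can be driven the same way.
        XCUIDevice.shared.perform(handGesture: .doubleTap)
    }
}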
4. Can HRV measurements be triggered on demand via API in watchOS? Are there guidelines or processes for enabling energy-intensive biometric sampling on development devices for IRB-approved research?
You don't have direct control over the sampling rate in watchOS. You can use HealthKit (HKQuantityTypeIdentifierHeartRateVariabilitySDNN, to be specific) to query the HRV data once the system has sampled and persisted it to HealthKit. If that doesn't help, we suggest that you file a feedback report with your concrete use case for us to investigate.
Specific to IRB-approved research using Apple Watch or its companion iPhone, you might want to look at this FAQ and SensorKit to see if they can be of any help.
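For illustration, querying the persisted SDNN samples might look roughly like this (a sketch; HealthKit read authorization for the type is assumed to have been granted already):
import HealthKit

// Read the most recent HRV (SDNN) samples that the system has persisted to HealthKit.
let hrvType = HKQuantityType(.heartRateVariabilitySDNN)
let sort = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: false)
let query = HKSampleQuery(sampleType: hrvType, predicate: nil, limit: 10, sortDescriptors: [sort]) { _, samples, _ in
    let milliseconds = (samples as? [HKQuantitySample])?.map {
        $0.quantity.doubleValue(for: .secondUnit(with: .milli))
    }
    print(milliseconds ?? [])
}
HKHealthStore().execute(query)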
5. What is the best advice for someone who is new to making a watchOS app that’s been on iOS and iPadOS?
You can start with exploring the system experience features on watchOS, such as notifications, controls, and widgets, and getting familiar with the system spaces, like Smart Stack, watch face, and control center.
Knowing the watchOS app design principles and practices is important as well. Design and build apps for watchOS 10 is a great resource for this topic.
SwiftUI is an amazing cross-platform framework, and you will use it to create your watchOS app. If you're already using it, great!
Keep in mind some watch-only constraints. Compared to an iPhone or iPad, Apple Watch has a limited battery and a smaller screen, which significantly impacts how people use your app and how your app works.
6. Was there any extension this year to the 7 day limit on querying Apple Health data on the watch?
There is no change on the limit this year. You can get this official limit at runtime using earliestPermittedSampleDate.
There are some exceptions, so don't be surprised if you see that some data types are retained longer.
The companion iPhone holds the full set of the health data. If you need to access the health data that has been purged from the Apple Watch, consider doing it with your iOS app, and then passing the result to your watchOS app.
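For illustration, a minimal sketch of reading that limit at runtime and clamping a query range to it:
import HealthKit

// Read the on-watch retention limit and use it as the lower bound of a query predicate.
let healthStore = HKHealthStore()
let earliest = healthStore.earliestPermittedSampleDate()
let predicate = HKQuery.predicateForSamples(withStart: earliest, end: Date(), options: [])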
My use case is a Repeat Timer app.
The user can configure a repeating timer, say:
8 minutes with 3 sets
So I would like to configure either 3 alarms (8 mins, 16 mins, 24 mins) or a repeating alarm (every 8 mins, 3 times).
BUT...
I would like the first and second alarms to break through but only for 5 seconds for example, and then stop (so the user doesn't need to tap the screen to silence the alarm).
I don't think this is possible after reading the API docs, but am asking the question anyway.
Thanks for any advice or guidance here, and happy WWDC!
Hi,
I would like to reset the system window picker privacy alert used by ScreenCaptureKit. I can reset the Screen Capture permission with tccutil reset ScreenCapture, but that does not reset the system window picker alert. I tried deleting the application's directory from its container, and that does not help either: the system window picker alert uses the old approval I gave and does not prompt again. How can I start with fresh ScreenCaptureKit settings for an app in testing?
Thanks
What I want to achieve now is that when the app is not running, upon receiving a notification, it displays an interface similar to CallKit with accept and decline buttons.
Here is part of my code:
import UIKit
import AVFAudio
import LiveCommunicationKit

@available(iOS 17.4, *)
class LiveCommunicationManager: NSObject, ConversationManagerDelegate {
    static let shared = LiveCommunicationManager()

    var isInvalidate: Bool = false
    var configuration: ConversationManager!

    override init() {
        let config = ConversationManager.Configuration(
            ringtoneName: "notes_of_the_optimistic",
            iconTemplateImageData: UIImage(named: "AppIcon")?.pngData(), // PNG data for the icon
            maximumConversationGroups: 1, // maximum number of conversation groups
            maximumConversationsPerConversationGroup: 1, // maximum conversations per group
            includesConversationInRecents: false, // whether to show in the call history
            supportsVideo: false, // whether video is supported
            supportedHandleTypes: [.generic, .phoneNumber, .emailAddress] // supported handle types
        )
        configuration = ConversationManager(configuration: config)
        super.init()
    }

    func reportIncomingCall(uuid: UUID, callerName: String) {
        configuration.delegate = self
        let local = Handle(type: .generic, value: callerName, displayName: callerName)
        let update = Conversation.Update(localMember: local, members: [local], activeRemoteMembers: [local])
        Task {
            do {
                try await configuration.reportNewIncomingConversation(uuid: uuid, update: update)
                print("Successfully reported the new incoming call")
            } catch {
                print("Failed to report the new incoming call: \(error.localizedDescription)")
            }
        }
    }

    func conversationManager(_ manager: ConversationManager, conversationChanged conversation: Conversation) {
        print("Conversation state changed")
    }

    func conversationManagerDidBegin(_ manager: ConversationManager) {
        print("Conversation began")
        manager.delegate = self
    }

    func conversationManagerDidReset(_ manager: ConversationManager) {
        print("Conversation is about to be reset")
    }

    func conversationManager(_ manager: ConversationManager, perform action: ConversationAction) {
        print("Conversation was answered")
        configuration.invalidate()
    }

    func conversationManager(_ manager: ConversationManager, timedOutPerforming action: ConversationAction) {
        print("Conversation action timed out")
    }

    func conversationManager(_ manager: ConversationManager, didActivate audioSession: AVAudioSession) {
        print("Audio session activated")
    }

    func conversationManager(_ manager: ConversationManager, didDeactivate audioSession: AVAudioSession) {
        print("Audio session deactivated")
    }
}
I set up the following in the AppDelegate:
func application(_ application: UIApplication,
                 didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                 fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
    // Handle the offline push notification here
    completionHandler(.noData) // report the background fetch as finished
    if let aps = userInfo["aps"] as? [String: Any],
       let alert = aps["alert"] as? [String: Any] {
        // Handling logic for the silent push
        if #available(iOS 17.4, *) {
            let manager = LiveCommunicationManager.shared
            if manager.isInvalidate { return }
            if let msgType = userInfo["msgType"] as? Int {
                if msgType == 5 {
                    manager.configuration.invalidate()
                } else {
                    let callerName = alert["title"] as? String ?? "Fanvil"
                    manager.reportIncomingCall(uuid: UUID(), callerName: callerName)
                }
            }
        }
    }
}
Xcode has been configured with the necessary capabilities, such as Background Fetch, Voice over IP, Background Processing, and Push Notification.
The issue now is that sometimes the code works as expected, allowing the app to wake up when not running and displaying the system interface with accept and decline buttons. However, after a few successful attempts, the app stops waking up, and no notification appears. But when I manually open the app, the didReceiveRemoteNotification method gets triggered.
I’d like to know why this stops working after a few times.
I've been stuck for days trying to figure out how to extract the full text of a Siri prompt that launches my app. We need to be able to get the text of the full command, such as "Hey Siri, buy dogfood...", so I can get "dogfood" or anything else following 'buy'. The examples I am finding are either (a) out of date or (b) incomplete. Right now we're using App Intents with Shortcuts, but we have to use dedicated shortcuts for each specific purchase, which is obviously very limiting.
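For context, this is roughly what one of our dedicated shortcuts looks like today (a sketch with placeholder names; App Shortcuts phrases only accept parameters backed by an AppEnum or AppEntity, which is why we can't capture arbitrary text after "buy"):
import AppIntents

// The item has to come from a fixed enum for it to appear in the invocation phrase.
enum PurchaseItem: String, AppEnum {
    case dogfood, catfood

    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Item")
    static var caseDisplayRepresentations: [PurchaseItem: DisplayRepresentation] = [
        .dogfood: "dogfood",
        .catfood: "catfood"
    ]
}

struct BuyItemIntent: AppIntent {
    static var title: LocalizedStringResource = "Buy Item"

    @Parameter(title: "Item")
    var item: PurchaseItem

    func perform() async throws -> some IntentResult & ProvidesDialog {
        // Hand the selected item to the purchase flow here.
        .result(dialog: "Buying \(item.rawValue)…")
    }
}

struct BuyShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: BuyItemIntent(),
            phrases: ["Buy \(\.$item) with \(.applicationName)"],
            shortTitle: "Buy",
            systemImageName: "cart"
        )
    }
}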
Hi, I have a work task that I need to do and am struggling with...
We would like to send our clients an email containing a link that redirects to the App Store so they can download our app.
After the app is downloaded, when the user opens it, a promo code should automatically be filled into the promo code text field in our registration flow.
Has anyone had a similar task? Is it even possible? I did some research on this but didn't find anyone in the situation I have now...
I know that deep links are widely used to pass data like this, but I'm not sure whether this works when the app is not downloaded yet. Maybe the solution would be having two links, but that would not be very user friendly...
Do you have any advice on how to do something like this, or some alternatives, please?
Thank you so much for your time and response.
Hi there
The behaviour of using Locale(identifier: "ar") with NumberFormatter.locale appears to have changed between iOS 17 and iOS 18.
Is this expected?
Steps to reproduce
import UIKit

func numberFormatter(withLocaleString localeString: String) -> NumberFormatter {
    let locale = Locale(identifier: localeString)
    let numberFormatter = NumberFormatter()
    numberFormatter.locale = locale
    return numberFormatter
}

let numbers = 0...9
let localeDigits = numbers
let ar_digits = localeDigits.compactMap {
    numberFormatter(withLocaleString: "ar").string(for: $0)?.first
}
print(ar_digits)
Results
The results show the arab numbering system on iOS 17 and the latn numbering system on iOS 18.
Output on iOS 17:
["٠", "١", "٢", "٣", "٤", "٥", "٦", "٧", "٨", "٩"]
Output on iOS 18:
["0", "1", "2", "3", "4", "5", "6", "7", "8", "9"]
Hi,
I would like to ask: can I set up an alarm that alerts even when the phone is powered off?
We would like to design a timer with an emergency alert, so I need the alert to work even when the phone's power is off.
Thanks.
Hello Apple development team, I have developed an app for screen time management, which mainly uses the Screen Time API. Users can set certain apps to be disabled during a certain period of time.
After the App is released, users often report that the settings do not take effect as expected. I have seen many developers on the forum reporting that the DeviceActivityMonitor extension sometimes does not trigger callbacks. Based on this background, I have the following questions:
Is it a known problem that the DeviceActivityMonitor extension sometimes does not trigger callbacks? If so, are there any means to avoid or reduce the probability of occurrence?
In addition to being killed by the system when it exceeds the memory limit (I only call some Screen Time APIs and access UserDefaults in the extension, which should not exceed the limit), under what other circumstances will the DeviceActivityMonitor extension be killed by the system? Will it automatically recover after being killed? Will any callbacks be invoked when it is killed?
Does ManagedSettingsStore have a life cycle? How do you avoid conflicts when configuring multiple stores, and what is the underlying mechanism? (A sketch of how I currently create named stores is at the end of this post.)
This is a random problem. I have never encountered it during development and debugging, but users often report it.
thanks
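For reference, this is roughly how I create the stores today (a sketch; the store names are placeholders from my own code):
import ManagedSettings

// One named store per blocking session, so that clearing one session's settings
// does not clear settings applied by another session.
let focusStore = ManagedSettingsStore(named: ManagedSettingsStore.Name("focusSession"))
let bedtimeStore = ManagedSettingsStore(named: ManagedSettingsStore.Name("bedtimeSession"))

focusStore.shield.applicationCategories = .all()
// Later, ending the focus session only touches its own store:
focusStore.clearAllSettings()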
Hello,
I've been working recently on an access-card type application where, due to requirements, HCE must be used.
I started with examples from the official documentation and was able to achieve correct communication.
HCE is working on newer versions of iOS; there's only one problem: acquiring a new intent assertion. Due to business requirements, our users may need to use our application multiple times within short periods of time (less than 10 seconds apart).
Is there any possibility to shorten the required 15 seconds before acquiring a new intent assertion?
I appreciate any suggestions.
I have an app with Message Filtering Extension enabled and I have been experiencing unusual behaviour.
When I send messages from a local number to a local number, the filtering is done correctly, but when I send messages from certain international numbers to my local number, the messages are not filtered. I couldn't find any errors in Console.
I see that the normalisation is correct. Are there any specifications for SMS from certain countries, or a reason why message filtering is not activated when an SMS is received?
Hello everyone,
I’m working on an iOS app that uses the new DeviceActivity framework to monitor and report user screen‐time in an extension (DeviceActivityReportExtension). I need to persist my processed screen‐time data into a standalone SQLite database inside the extension, but I’m running into issues opening and writing to the database file.
Here’s what I’ve tried so far:
import UIKit
import DeviceActivity
import SQLite3
// SQLITE_TRANSIENT tells SQLite to copy the bound text, since the Swift string is temporary.
private let SQLITE_TRANSIENT = unsafeBitCast(-1, to: sqlite3_destructor_type.self)

class MyDeviceActivityReportExtension: DeviceActivityReportExtension {
    private var db: OpaquePointer?

    override func didReceive(_ report: DeviceActivityReport) async {
        // 1. Construct the path in the App Group container:
        let containerURL = FileManager.default.containerURL(forSecurityApplicationGroupIdentifier: "group.com.mycompany.myapp")
        let dbURL = containerURL?.appendingPathComponent("ScreenTimeReports.db")

        // 2. Open the database:
        if sqlite3_open(dbURL?.path, &db) != SQLITE_OK {
            print("❌ Unable to open database at \(dbURL?.path ?? "unknown path")")
            return
        }
        defer { sqlite3_close(db) }

        // 3. Create the table if needed:
        let createSQL = """
        CREATE TABLE IF NOT EXISTS reports (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            date TEXT,
            totalScreenTime DOUBLE
        );
        """
        if sqlite3_exec(db, createSQL, nil, nil, nil) != SQLITE_OK {
            print("❌ Could not create table: \(String(cString: sqlite3_errmsg(db)))")
            return
        }

        // 4. Insert the data:
        let insertSQL = "INSERT INTO reports (date, totalScreenTime) VALUES (?, ?);"
        var stmt: OpaquePointer?
        if sqlite3_prepare_v2(db, insertSQL, -1, &stmt, nil) == SQLITE_OK {
            sqlite3_bind_text(stmt, 1, report.date.description, -1, SQLITE_TRANSIENT)
            sqlite3_bind_double(stmt, 2, report.totalActivityDuration)
            if sqlite3_step(stmt) != SQLITE_DONE {
                print("❌ Insert failed: \(String(cString: sqlite3_errmsg(db)))")
            }
        }
        sqlite3_finalize(stmt)
    }
}
However:
Path issues: The extension’s sandbox is separate from the app’s. I’m not sure if I can use the same App Group container, or if there’s a better location for an on‐extension database.
Entitlements: I’ve added the App Group (group.com.mycompany.myapp) to both the main app and the extension, but the file never appears, and I still get “unable to open database” errors.
My questions are:
How do I correctly construct a file URL for an SQLite file in a DeviceActivityReportExtension?
Is SQLite the recommended approach here, or is there a more “Apple-approved” pattern for writing data from a DeviceActivity extension?
Any sample code snippets, pointers to relevant Apple documentation, or alternative approaches would be greatly appreciated!
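For completeness, the simpler pattern I've been considering instead of SQLite is shared UserDefaults in the same App Group, assuming the report extension is allowed to write there at all (which is really part of my question):
import Foundation

// Hypothetical alternative: persist the processed totals via shared UserDefaults.
// The suite name matches the App Group identifier used in the code above.
let sharedDefaults = UserDefaults(suiteName: "group.com.mycompany.myapp")
sharedDefaults?.set(3600.0, forKey: "totalScreenTime-latest")
let stored = sharedDefaults?.double(forKey: "totalScreenTime-latest")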