I rarely use the Shortcuts app, so it took me a while to notice that my app's app intents all show incorrectly on macOS 15. On macOS 14 and 13, they used to show correctly, but now it seems that all localized strings show the key rather than the localized value.
@available(iOS 16.0, macOS 13.0, *)
struct MyAppIntent: AppIntent {
    static let title = LocalizedStringResource("key1", comment: "")
    static let description = IntentDescription(LocalizedStringResource("key2", comment: ""))
    ...
}
In the Localizable.xcstrings file I have defined all the strings; for instance, I have associated key1 with the value Title. The Shortcuts app used to display Title, but it now displays key1.
Is this a known issue or did something change in macOS 15 that would require me to update something?
I have a very basic App Intent extension in my macOS app that does nothing more than accept two parameters, but running it in Shortcuts always produces the error "The action “Compare” could not run because an internal error occurred.".
What am I doing wrong?
struct CompareIntent: AppIntent {
    static let title = LocalizedStringResource("intent.compare.title")
    static let description = IntentDescription("intent.compare.description")
    static let openAppWhenRun = true

    @Parameter(title: "intent.compare.parameter.original")
    var original: String

    @Parameter(title: "intent.compare.parameter.modified")
    var modified: String

    func perform() async throws -> some IntentResult {
        return .result()
    }
}
The app uses WebKit to display web pages and adds phone numbers to the Call Directory at specific times.
In this app, when the app is launched or when the Add Contact button on the web page is pressed,
the Call Directory extension is reloaded from the host app (the WebKit view controller), the phone numbers are retrieved from the server, and the entries are updated.
I would like to add a process that removes the already added phone number entries when a specific value is retrieved from the application server.
The specific process we wish to implement is as follows.
Step 1: Use URLSession to retrieve values from the application server. (ViewController)
Step 2: If the value is a specific one, call a function that deletes the Call Directory extension entries. (ViewController)
Step 3: Delete all entries for the registered phone numbers.
However, I am aware that I have to use reloadExtension() to invoke the Call Directory extension from the ViewController in Step 2.
If I do so, CallDirectoryHandler.beginRequest() will run, and I am wondering whether it is possible to execute only the function that deletes the entries.
Is there a way to run only the function that deletes the Call Directory extension entries from the host app (ViewController)?
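For context, here is a minimal sketch of one possible pattern, not a confirmed solution: because the extension code only runs through beginRequest(), the host app records the requested operation in shared storage before calling reloadExtension(), and the handler branches on that value. The App Group suite name and the key below are hypothetical.
import CallKit
import Foundation

class CallDirectoryHandler: CXCallDirectoryProvider {

    override func beginRequest(with context: CXCallDirectoryExtensionContext) {
        // Hypothetical App Group shared between the host app and the extension.
        let defaults = UserDefaults(suiteName: "group.com.example.shared")
        let shouldDeleteAll = defaults?.bool(forKey: "shouldDeleteAllEntries") ?? false

        if shouldDeleteAll, context.isIncremental {
            // Remove every previously added identification entry.
            context.removeAllIdentificationEntries()
            defaults?.set(false, forKey: "shouldDeleteAllEntries")
        } else {
            // ... add or update entries as usual ...
        }

        context.completeRequest()
    }
}
The ViewController side would set shouldDeleteAllEntries to true and then call CXCallDirectoryManager.sharedInstance.reloadExtension(withIdentifier:completionHandler:) as usual.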
On my Mac, I have an XPC server running as a daemon. It also checks the clients against code-signing requirements.
I have multiple clients (2 or more).
Each of these clients periodically (say every 5 seconds) polls the XPC server to ask for particular data.
I want to understand how the performance of my Mac will be affected when multiple XPC clients keep polling an XPC server.
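To make the setup concrete, here is a minimal sketch of the polling pattern described above; the protocol, Mach service name, and method are hypothetical placeholders, not the actual implementation.
import Foundation

// Hypothetical XPC protocol vended by the daemon.
@objc protocol DataServiceProtocol {
    func fetchLatestData(reply: @escaping (Data?) -> Void)
}

final class PollingClient {
    private let connection: NSXPCConnection
    private var timer: Timer?

    init() {
        // Connect to the daemon's Mach service (the name is a placeholder).
        connection = NSXPCConnection(machServiceName: "com.example.datadaemon",
                                     options: .privileged)
        connection.remoteObjectInterface = NSXPCInterface(with: DataServiceProtocol.self)
        connection.resume()
    }

    func startPolling() {
        // Ask the daemon for data every 5 seconds, as described above.
        timer = Timer.scheduledTimer(withTimeInterval: 5, repeats: true) { [weak self] _ in
            let proxy = self?.connection.remoteObjectProxy as? DataServiceProtocol
            proxy?.fetchLatestData { data in
                // Handle the returned data.
                _ = data
            }
        }
    }
}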
Some time ago I read somewhere that one can get a file icon on iOS like this:
UIDocumentInteractionController(url: url).icons.last!
but this always returns the following image for every file:
Today I tried the following, which always returns nil:
(try? url.resourceValues(forKeys: [.effectiveIconKey]))?.allValues[.effectiveIconKey] as? UIImage
Is there any way to get a file icon on iOS?
You can try the above methods in this sample app:
import SwiftUI
import UIKit

struct ContentView: View {
    @State private var isPresentingFilePicker = false
    @State private var url: URL?

    var body: some View {
        VStack {
            Button("Open") {
                isPresentingFilePicker = true
            }
            if let url = url {
                Image(uiImage: UIDocumentInteractionController(url: url).icons.last!)
                if let image = (try? url.resourceValues(forKeys: [.effectiveIconKey]))?.allValues[.effectiveIconKey] as? UIImage {
                    Image(uiImage: image)
                } else {
                    Text("none")
                }
            }
        }
        .padding()
        .fileImporter(isPresented: $isPresentingFilePicker, allowedContentTypes: [.data]) { result in
            do {
                let url = try result.get()
                if url.startAccessingSecurityScopedResource() {
                    self.url = url
                }
            } catch {
                preconditionFailure(error.localizedDescription)
            }
        }
    }
}
We have three problems when using push notifications with Live Activities.
1. What is the specific callback strategy for the activityUpdates property in ActivityKit?
We found that in actual user scenarios we sometimes do not receive callbacks. From community experience, there seem to be resource-optimization strategies under which callbacks are not delivered, but the explanation is vague. Is there any clear guidance on why callbacks are or are not delivered?
2. What is the specific wake-up strategy when a backgrounded app receives a Live Activity remote start push?
From community experience, the system may wake the app for 0-30 seconds due to resource-optimization strategies, or may not wake it at all. Is there an official description of the wake-up strategy, or do we have to follow this guidance:
"Wake-ups of apps using content-available pushes are heavily throttled. You can expect 1-2 wake-ups per hour as a best-case scenario in the hands of your users, so this cannot be assumed to be a reliable on-demand wake-up mechanism for an app."
3. How can we determine whether the user has selected Allow or Always Allow for the Live Activity permission?
When we use Live Activity remote (push-to-start) pushes, there are two system prompts in iOS:
The first prompt: allow or disallow Live Activities.
The second prompt: always allow or disallow.
Is there an interface that can directly determine which permission the user has chosen (Allow or Always Allow)? (We can already detect the disallowed state.)
At present, we haven't found any API in the official documentation that can distinguish Allow from Always Allow. The difference affects the generation of the update token, and without an update token we cannot update our Live Activity instance.
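For reference, here is a minimal sketch of the APIs involved in questions 1 and 3; MyAttributes is a hypothetical ActivityAttributes type, and note that ActivityAuthorizationInfo only exposes whether Live Activities and frequent pushes are enabled, not an Allow versus Always Allow distinction.
import ActivityKit

// Hypothetical attributes type for the Live Activity.
struct MyAttributes: ActivityAttributes {
    struct ContentState: Codable, Hashable {
        var progress: Double
    }
}

@available(iOS 16.2, *)
func observeLiveActivities() async {
    // Authorization information available to the app.
    let info = ActivityAuthorizationInfo()
    print("Activities enabled:", info.areActivitiesEnabled,
          "Frequent pushes enabled:", info.frequentPushesEnabled)

    // Question 1 concerns callbacks delivered through this sequence,
    // including activities started remotely via push.
    for await activity in Activity<MyAttributes>.activityUpdates {
        Task {
            // The update token needed to send subsequent update pushes.
            for await tokenData in activity.pushTokenUpdates {
                let token = tokenData.map { String(format: "%02x", $0) }.joined()
                print("Update token for \(activity.id):", token)
            }
        }
    }
}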
When we open an App Clip link on an iOS 17 device, the first tap does not respond; the App Clip card does not appear until the second tap. Steps to reproduce:
This happens the first time the iPhone opens the link, or after tapping the ‘Clear Experience Cache’ button in Settings --> Developer --> App Clips Testing.
In the Notes app on the iPhone, tap the link https://appclip.apple.com/id?p=com.example.naturelab.backyardbirds.Clip
There is no response to the first tap, and the card does not appear until the second tap. Is this an iPhone problem? Looking forward to your reply.
I want to display device activity reports for particular selected apps, in order to get daily app usage time. What is happening now: 10 apps are selected from the FamilyActivityPicker, but only some of them are displayed in the list. I need all 10 (or more) apps that I choose from the FamilyActivityPicker. The code below is used for fetching the reports.
var body: some View {
    VStack {
        DeviceActivityReport(context, filter: filter)
    }
}
The code below is used for the filter:
@State public var filter = DeviceActivityFilter()

init(selectedApps: Set<ApplicationToken>, selectedCategories: Set<ActivityCategoryToken>, selectedWebDomains: Set<WebDomainToken>) {
    self.selectedApps = selectedApps
    self.selectedCategories = selectedCategories
    self.selectedWebDomains = selectedWebDomains
    self.filter = DeviceActivityFilter(
        segment: .daily(
            during: Calendar.current.dateInterval(
                of: .weekOfYear, for: .now
            )!
        ),
        users: .all,
        devices: .init([.iPhone]),
        applications: selectedApps,
        categories: selectedCategories,
        webDomains: selectedWebDomains
    )
}
You can see we selected 3 apps from the FamilyActivityPicker, but we are getting only 2 apps back from the DeviceActivityReport extension.
The following code is from the device activity report extension:
let context: DeviceActivityReport.Context = .totalActivity

// Define the custom configuration and the resulting view for this report.
let content: (ActivityReport) -> TotalActivityView

func makeConfiguration(representing data: DeviceActivityResults<DeviceActivityData>) async -> ActivityReport {
    // Reformat the data into a configuration that can be used to create
    // the report's view.
    var res = ""
    var list: [AppDeviceActivity] = []
    let totalActivityDuration = await data.flatMap { $0.activitySegments }.reduce(0, {
        $0 + $1.totalActivityDuration
    })
    for await d in data {
        res += d.user.appleID!.debugDescription
        res += d.lastUpdatedDate.description
        for await a in d.activitySegments {
            res += a.totalActivityDuration.formatted()
            for await c in a.categories {
                for await ap in c.applications {
                    if let apptoken = ap.application.token {
                        let appName = (ap.application.localizedDisplayName ?? "nil")
                        let bundle = (ap.application.bundleIdentifier ?? "nil")
                        let duration = ap.totalActivityDuration
                        let numberOfPickups = ap.numberOfPickups
                        let app = AppDeviceActivity(appToken: apptoken, id: bundle, displayName: appName, duration: duration, numberOfPickups: numberOfPickups)
                        list.append(app)
                    }
                }
            }
        }
    }
    return ActivityReport(totalDuration: totalActivityDuration, apps: list)
}
Hello, I have been using the App-prefs:General&path=SOFTWARE_UPDATE_LINK URL in my application to navigate to system settings, and it worked as expected. However, after updating to iOS 18, it no longer works, and I haven't been able to find a replacement.
Is there any alternative solution or a different URL that works?
I also tried prefs:root=General&path=SOFTWARE_UPDATE_LINK, but it didn’t work either.
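For reference, this is a minimal sketch of the kind of call described above; keep in mind that App-prefs: is a private URL scheme, so whether it works at all is not guaranteed across iOS releases.
import UIKit

func openSoftwareUpdateSettings() {
    // Private scheme observed to work on earlier iOS versions; not a public API.
    guard let url = URL(string: "App-prefs:General&path=SOFTWARE_UPDATE_LINK") else { return }
    UIApplication.shared.open(url)
}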
There are different kinds of screen-sharing applications, all using different APIs.
For example, the APIs used by AnyDesk or TeamViewer don't require light signals.
I wonder if this is more appropriate for a corporate application, i.e. one deployed via MDM.
Could a screen-sharing application be created and validated by Apple so that it displays no light signals and can access the user's screen whenever the person wants, after an initial acceptance?
In other words, the user accepts to share their screen once, but isn't asked to accept again the next time.
Or is this impossible on iOS?
I'd be honored to get some answers.
I have two standalone apps written for watchOS: one to authenticate and one for connectivity to real-world devices. The connectivity app uses the authentication app before every action. I'm testing this with two Xcode projects I created, and after trying different things I ended up with this error.
authapp://authenticate?callback=linkingapp://callback
-[SPApplicationDelegate extensionConnection:openSystemURL:]:2418: URL with scheme "authapp" not supported
How do I get the URL scheme working? I tested this in the simulator and on a real device. The Info.plist and AppDelegate files are in place in both apps.
Is there a way to create a Date constant from year, month and day? The only constructors that show up are .now and those based on some timeInterval. I'm trying to initialize some test data with known dates.
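One way to do this, as a minimal sketch, is to build a DateComponents value and ask a Calendar to resolve it into a Date:
import Foundation

func makeDate(year: Int, month: Int, day: Int) -> Date? {
    var components = DateComponents()
    components.year = year
    components.month = month
    components.day = day
    // Returns nil if the components don't describe a valid date in this calendar.
    return Calendar.current.date(from: components)
}

// Example: a known date for test data.
let testDate = makeDate(year: 2024, month: 6, day: 1)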
Hello everyone,
I’m running an Objective-C–based hybrid (native + web) shopping application and encountering a recurring issue on devices running iOS 18.1.1:
When the app launches, it only shows a white screen with the native frame visible—no web content loads.
Restarting or reinstalling the app doesn’t help. In one instance, toggling Airplane Mode on and off brought the app back to normal, but this workaround isn’t consistent.
There are no crash logs, so we can’t determine if it’s caused by a network error, cache conflict, or an OS-level bug.
So far, we have only seen this issue on iOS 18.1.1 devices. Because our app is a shopping platform, this significantly impacts users.
Questions:
Could this be related to a known bug or limitation in iOS 18.1.1?
Are there recommended diagnostic steps or workarounds for a hybrid app that shows a white screen without generating crash logs?
Which additional details (e.g., system logs, network traces, device configurations) might help isolate the cause?
Any insights or suggestions would be greatly appreciated. Thank you in advance!
When using CallKit in my Flutter app, audio (both mic and speaker) stops working. When not using CallKit to answer calls, the app works fine. I am using the flutter_callkit_incoming plugin for CallKit and flutter_webrtc for the telephony. flutter_callkit_incoming ships some boilerplate code, including sections to uncomment when using WebRTC, and I have seen multiple suggested fixes saying to make sure the shared audio session is configured before the CallKit call is reported. Neither of these approaches has worked.
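For context, here is a minimal sketch (in Swift, on the native side) of the audio-session handling that is commonly suggested when combining CallKit with WebRTC; the configuration values are illustrative, and this is not a verified fix for the issue above.
import AVFoundation
import CallKit

final class CallProviderDelegate: NSObject, CXProviderDelegate {

    func providerDidReset(_ provider: CXProvider) {}

    func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {
        // Configure the shared audio session before fulfilling the answer action,
        // so CallKit activates a session that is already set up for voice chat.
        let session = AVAudioSession.sharedInstance()
        try? session.setCategory(.playAndRecord, mode: .voiceChat, options: [.allowBluetooth])
        action.fulfill()
    }

    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        // CallKit has activated the audio session; start (or resume) WebRTC audio here.
    }

    func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
        // Stop WebRTC audio when CallKit deactivates the session.
    }
}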
I'm trying to build an app with a DeviceActivityMonitor extension that executes some code after 15 minutes. I can confirm that the extension is set up correctly and that intervalDidStart is executed, but for some reason the intervalDidEnd method never gets called. What I'm doing in both is just registering a local notification.
import DeviceActivity
import ManagedSettings
import UserNotifications

class DeviceActivityMonitorExtension: DeviceActivityMonitor {
    let store = ManagedSettingsStore()

    override func intervalDidStart(for activity: DeviceActivityName) {
        createPushNotification(
            title: "Session activated!",
            body: ""
        )
        super.intervalDidStart(for: activity)
    }

    override func intervalDidEnd(for activity: DeviceActivityName) {
        createPushNotification(
            title: "Session ended",
            body: ""
        )
        super.intervalDidEnd(for: activity)
    }

    private func createPushNotification(title: String, body: String) {
        let content = UNMutableNotificationContent()
        content.title = title
        content.body = body

        // Build date components for one second from now.
        var dateComponents = Calendar.current.dateComponents([.era, .year, .month, .day, .hour, .minute, .second], from: Date().addingTimeInterval(1.0))
        dateComponents.calendar = Calendar.current
        dateComponents.timeZone = TimeZone.current

        // Create the trigger as a one-time event.
        let trigger = UNCalendarNotificationTrigger(dateMatching: dateComponents, repeats: false)

        let uuidString = UUID().uuidString
        let request = UNNotificationRequest(identifier: uuidString, content: content, trigger: trigger)

        // Schedule the request with the system.
        let notificationCenter = UNUserNotificationCenter.current()
        notificationCenter.add(request)
    }
}
And this is the method that starts the monitoring session:
@objc public static func startSession() -> String? {
    // Calculate start and end times
    let center = DeviceActivityCenter()
    let minutes = 15
    let startDate = Date().addingTimeInterval(1)
    guard let endDate = Calendar.current.date(byAdding: .minute, value: minutes, to: startDate) else {
        return "Failed to create end date?"
    }

    // Create date components and explicitly set the calendar and timeZone
    let startComponents = Calendar.current.dateComponents([.era, .year, .month, .day, .hour, .minute, .second], from: startDate)
    let endComponents = Calendar.current.dateComponents([.era, .year, .month, .day, .hour, .minute, .second], from: endDate)

    // Create schedule
    let schedule = DeviceActivitySchedule(
        intervalStart: startComponents,
        intervalEnd: endComponents,
        repeats: false
    )

    print("Now", Date())
    print("Start", startDate, startComponents)
    print("End", endDate, endComponents)
    print(schedule.nextInterval)

    do {
        // Use a consistent activity name for our simple implementation
        let activity = DeviceActivityName("SimpleSession")
        try center.startMonitoring(activity, during: schedule)
        return nil
    } catch {
        return "Failed to start monitoring: \(error)"
    }
}
I can confirm my dates & date components make sense with the 4 print statements. Here is the output:
Now 2025-02-12 04:21:32 +0000
Start 2025-02-12 04:21:33 +0000 era: 1 year: 2025 month: 2 day: 11 hour: 20 minute: 21 second: 33 isLeapMonth: false
End 2025-02-12 04:36:33 +0000 era: 1 year: 2025 month: 2 day: 11 hour: 20 minute: 36 second: 33 isLeapMonth: false
Optional(2025-02-12 04:21:33 +0000 to 2025-02-12 04:36:33 +0000)
I get the "Session activated!" notification but never get the "Session ended" notification. Half an hour later, I tried debugging the DeviceActivityCenter by printing out the activities property and can see that the activity is still there. When I try to print the nextInterval property on the schedule object I get from calling center.schedule(for:), it returns nil.
I'm running this on an iPhone 8 testing device with developer mode enabled. It has iOS 16.7.10. I'm totally lost as to how to get this to work.
The errors below pop up when I validate or distribute the mobile app; they started appearing after upgrading Xcode. I have Xcode 16.2, and the minimum deployment target is 14.0. Please help me resolve these errors. The build was successful, and I can launch my application in the simulator.
Validation failed
Invalid Bundle. The bundle .app/Frameworks/hermes.framework does not support the minimum OS Version specified in the Info.plist. (ID: ee79350d-249e-4101-89fe-e41b2620f4d6)
Validation failed
Missing Info.plist value. A value for the key 'MinimumOSVersion' in bundle .app/Frameworks/hermes.framework is required. (ID: b0f58cd5-2c72-437b-98c9-1b2b4122c203)
Validation failed
Invalid MinimumOSVersion. Apps that only support 64-bit devices must specify a deployment target of 8.0 or later. MinimumOSVersion in ‘.app/Frameworks/hermes.framework' is ''. (ID: c2a6247f-21d6-438f-b52f-572425b7aa65)
After the app is put in the background for some time and brought back to the foreground, it crashes each time with different thread stack entries, but all of them state the same exception reason.
Hi,
I'm trying to set up a PIR service for Live Caller ID Lookup (in Python, but based on the Swift example: https://github.com/apple/live-caller-id-lookup-example). The Swift example provides utilities for database setup and encryption, but I can't find any specification of which key is used for database encryption and how the iOS system learns about this key in order to construct the PIR requests.
So my question is: how does the PIR service communicate the secret key to the iOS system, or vice versa? (This is specific to the test environment, before onboarding.)
I have an app in which we are using NEHotspotHelper to detect Wi-Fi changes. When I kill the app and a network change is detected, we see logs from the NEHotspotHelper* module, and after that we see logs indicating the app was launched in the background.
[Application] application(_:didFinishLaunchingWithOptions:), UIApplicationState: 2
But when I checked the app switcher, I couldn't find the app. Then I tapped the icon to launch the app, and it launched without the splash screen, as if it were coming from the background.
I need your support to understand whether this is expected behaviour.
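For reference, this is a minimal sketch of the NEHotspotHelper registration described above (it requires the Hotspot Helper entitlement); the display name, queue label, and command handling are illustrative only.
import NetworkExtension

func registerHotspotHelper() {
    let options: [String: NSObject] = [
        kNEHotspotHelperOptionDisplayName: "My App" as NSString
    ]
    let queue = DispatchQueue(label: "com.example.hotspothelper")

    // The handler runs when the system detects Wi-Fi changes; this is the point
    // at which the app can be launched in the background, as described above.
    let registered = NEHotspotHelper.register(options: options, queue: queue) { command in
        if command.commandType == .evaluate || command.commandType == .filterScanList {
            let response = command.createResponse(.success)
            response.deliver()
        }
    }
    print("Hotspot helper registered:", registered)
}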
My app features two kinds of widgets, let's call them kind A and kind B.
I have both A and B widgets on my Home Screen. When I tap the button on widget A (associated with an App Intent), I expect widget B to also reload.
However, when I call WidgetCenter.shared.reloadAllTimelines() inside the perform() method of the App Intent, the timeline of widget B does not reload immediately. This issue only occurs on a physical device and is not consistently reproducible; in the simulator, widget B reloads as expected.
FB13152293
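For reference, here is a minimal sketch of the pattern described; RefreshIntent is a hypothetical name for the intent attached to the button on widget A.
import AppIntents
import WidgetKit

struct RefreshIntent: AppIntent {
    static let title: LocalizedStringResource = "Refresh"

    func perform() async throws -> some IntentResult {
        // Ask WidgetKit to reload every widget kind, including widget B.
        WidgetCenter.shared.reloadAllTimelines()
        return .result()
    }
}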