Automation & Scripting


Learn about scripting languages and automation frameworks available on the platform to automate repetitive tasks.

Automation & Scripting Documentation

Posts under Automation & Scripting subtopic

Post · Replies · Boosts · Views · Activity

How to use a 3D Touch shortcut when the application is completely closed
I'm asking for help because I'm having a problem with a 3D Touch shortcut in my UIKit iOS application, handled in the AppDelegate file: I can't redirect the route when the user has completely killed the application. It works when the app is in the background. I'd like the shortcut to redirect to the searchPage when the application is fully closed and the user taps Search with 3D Touch.

final class AppDelegate: UIResponder, UIApplicationDelegate {
    lazy var window: UIWindow? = {
        return UIWindow(frame: UIScreen.main.bounds)
    }()

    private let appDependencyContainer = Container()
    private let disposeBag = DisposeBag()
    var pendingDeeplink: String?

    private lazy var onboardingNavigationController: UINavigationController = {
        let navigationController = UINavigationController(nibName: nil, bundle: nil)
        navigationController.setNavigationBarHidden(true, animated: false)
        return navigationController
    }()

    private func handleShortcutItem(_ shortcutItem: UIApplicationShortcutItem) {
        guard let windowScene = UIApplication.shared.connectedScenes.first as? UIWindowScene,
              let window = windowScene.windows.first(where: { $0.isKeyWindow }),
              let rootVC = window.rootViewController else {
            DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) { [weak self] in
                self?.handleShortcutItem(shortcutItem)
            }
            return
        }

        if let presentedVC = rootVC.presentedViewController {
            presentedVC.dismiss(animated: !UIAccessibility.isReduceMotionEnabled) { [weak self] in
                self?.executeShortcutNavigation(shortcutItem)
            }
        } else {
            executeShortcutNavigation(shortcutItem)
        }
    }

    private func executeShortcutNavigation(_ shortcutItem: UIApplicationShortcutItem) {
        DispatchQueue.main.asyncAfter(deadline: .now() + 0.1) { [weak self] in
            guard let self = self else { return }
            switch shortcutItem.type {
            case ShortcutType.searchAction.rawValue:
                self.mainRouter.drive(to: .searchPage(.show), origin: AppRoutingOrigin())
            case ShortcutType.playAction.rawValue:
                self.mainRouter.drive(to: .live(channel: Channel(), appTabOrigin: AppTabOrigin.navigation.rawValue), origin: AppRoutingOrigin())
            case ShortcutType.myListHistoryAction.rawValue:
                self.mainRouter.drive(to: .myList(.history), origin: AppRoutingOrigin())
            default:
                break
            }
        }
    }
}

What I've tried:
- Adding delays with DispatchQueue.main.asyncAfter
- Checking for window availability and rootViewController
- Dismissing presented view controllers before navigation

Environment:
- iOS 15+
- Swift 6
- Custom router system (mainRouter)
- App supports both SwiftUI and UIKit

Questions:
- What's the best practice for handling shortcuts on cold launch vs warm launch?
- How can I ensure the router is properly initialized before navigation? (A sketch of the cold-launch pattern follows this entry.)
0
0
139
Jul ’25
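On the cold-launch vs. warm-launch question above: a minimal sketch of the usual split, assuming the classic AppDelegate lifecycle with no scene delegate. The idea is to read the shortcut from launchOptions on a cold start, park it until setup is done, and return false so UIKit does not also call performActionFor. executeShortcutNavigation and flushPendingShortcutItem are placeholders standing in for the poster's router calls and setup hook.

import UIKit

final class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?
    private var pendingShortcutItem: UIApplicationShortcutItem?

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Cold launch: the shortcut arrives here, before any window or router exists.
        if let item = launchOptions?[.shortcutItem] as? UIApplicationShortcutItem {
            pendingShortcutItem = item
            // Returning false tells UIKit not to also call performActionFor for this launch.
            return false
        }
        return true
    }

    func application(_ application: UIApplication,
                     performActionFor shortcutItem: UIApplicationShortcutItem,
                     completionHandler: @escaping (Bool) -> Void) {
        // Warm launch: the app is already set up, so navigate immediately.
        executeShortcutNavigation(shortcutItem)
        completionHandler(true)
    }

    // Call this from wherever the root view controller and router finish setting up.
    func flushPendingShortcutItem() {
        guard let item = pendingShortcutItem else { return }
        pendingShortcutItem = nil
        executeShortcutNavigation(item)
    }

    private func executeShortcutNavigation(_ item: UIApplicationShortcutItem) {
        // Placeholder for the poster's mainRouter.drive(to:origin:) calls.
    }
}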
AppleScript
Here's my AppleScript:

tell application "Finder"
    activate
    open application file "Messages.app" of folder "Applications" of folder "System" of startup disk
end tell

I just need the step that sends the message; the script should automatically send the message that has already been created. This step opens the completed iMessage, ready to send. I want to send it without any keyboard usage. All that is needed is the sending step. (See the sketch after this entry.)
0
0
57
Nov ’25
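One possibility for the post above, offered as a GUI-scripting assumption rather than a documented Messages command: with the composed message frontmost in Messages, simulating Return via System Events presses Send. This requires Accessibility permission for whatever app runs the script (Script Editor, Terminal, etc.), and the delay may need tuning.

tell application "Messages" to activate
delay 1
tell application "System Events"
    tell process "Messages"
        keystroke return -- simulates pressing Send on the already-composed message
    end tell
end tell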
UI Testing and 'Allow Paste'
I am developing an app that lets the user ask it to process the clipboard contents and do something with them. While developing an XCUITest, I find the app stops while it waits for the user to give permission, which breaks the automation. I tried:

let springboard = XCUIApplication(bundleIdentifier: "com.apple.springboard")
let allowButton = springboard.buttons["Allow Paste"]

but that does not work. Is there a way to tell the framework to automatically grant the test permission to access the pasteboard (or to let me write tests that grant this)? (One hedged workaround is sketched after this entry.)
0
0
101
Nov ’25
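A hedged sketch of the usual interruption-monitor workaround for the post above. The exact button label and whether the paste alert is reachable from the test process can vary by OS version and locale, so treat this as a starting point rather than a guaranteed fix.

import XCTest

final class ClipboardPermissionTests: XCTestCase {
    func testProcessClipboard() throws {
        let app = XCUIApplication()
        app.launch()

        // Register a handler that taps "Allow Paste" if the system alert appears.
        let token = addUIInterruptionMonitor(withDescription: "Pasteboard permission") { alert in
            let allow = alert.buttons["Allow Paste"]
            if allow.exists {
                allow.tap()
                return true
            }
            return false
        }

        // ...drive the UI to the point where the app reads the clipboard...

        // An otherwise harmless interaction gives XCTest a chance to run the monitor.
        app.tap()

        removeUIInterruptionMonitor(token)
    }
}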
Siri media search unable to provide keyword
Hi, I am developing a music app. We have been using the Siri media search functionality for a while. We recently hit a case where Siri does not provide a keyword for a search. When the user says "Play kid songs" (in Turkish, "çocuk şarkıları çal"), I see in the debugger that mediaSearch.mediaName is nil. When the user says "Play kids" (in Turkish, "çocuklar çal"), a keyword is provided and we can search for and play a related song. Normally I would suspect Siri is somehow censoring the word "kid", but when I try the same voice search in Spotify, I get children's-song search results. I've read the documentation and searched the web but couldn't find any similar report. What could be the cause, and is there an extra setting or capability that lets Spotify get a keyword out of this voice search but not us? (A defensive fallback is sketched after this entry.)
0
0
71
Nov ’25
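There may be no setting that forces Siri to supply a mediaName, but for the post above a defensive fallback across the other INMediaSearch fields can at least keep the request usable. A sketch only; searchAndPlay is a hypothetical stand-in for the app's own search layer.

import Intents

final class PlayMediaIntentHandler: NSObject, INPlayMediaIntentHandling {
    func handle(intent: INPlayMediaIntent,
                completion: @escaping (INPlayMediaIntentResponse) -> Void) {
        let search = intent.mediaSearch
        // Fall back through the other fields Siri may have filled in.
        let keyword = search?.mediaName
            ?? search?.albumName
            ?? search?.artistName
            ?? search?.genreNames?.first

        guard let keyword, !keyword.isEmpty else {
            completion(INPlayMediaIntentResponse(code: .failure, userActivity: nil))
            return
        }

        searchAndPlay(keyword)
        // Use .handleInApp instead if playback should happen in the main app process.
        completion(INPlayMediaIntentResponse(code: .success, userActivity: nil))
    }

    private func searchAndPlay(_ keyword: String) {
        // Placeholder for the app's own catalog search and playback.
    }
}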
osascript stopped working for sending notifications
I have a script that stopped working since the Tahoe migration. The script needs to send a notification on my Mac. The issue is this line:

osascript -e 'display notification "The task has been completed" with title "✅ My script"'

It doesn't fail (it returns 0) and prints nothing, but no notification appears. I checked all my notification settings and everything seems right; I verified the configuration for iTerm and Script Editor, and the global notification settings. (An isolation test is sketched after this entry.)
0
0
89
Nov ’25
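A hedged isolation test for the post above, assuming the problem is per-app notification attribution rather than the command itself: notifications posted via osascript are credited to the process that invoked it (iTerm, Terminal, Script Editor, ...), so compiling the same line into a standalone applet gives it a separate notification identity to compare against. The /tmp path is just an example.

# Build a tiny applet that posts the same notification.
osacompile -e 'display notification "The task has been completed" with title "My script"' -o /tmp/notify-test.app
open /tmp/notify-test.app
# macOS should prompt to allow notifications for the new applet; if the banner
# shows here but not from the terminal, check the terminal app's entry under
# System Settings > Notifications.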
SetFocusFilterIntent app cannot be copied to another Mac
I recently added a SetFocusFilterIntent extension target to my app, a system utility that lives in the menu bar (Application is agent = YES). I followed the approach in the WWDC22 video introducing Focus intents and created an App Group so the extension can communicate with my main app. Since doing this, I sometimes see this log line when I run the app:

Couldn't read values in CFPrefsPlistSource<0x97cd34700> (Domain: group.xxx.xxx.MyApp, User: kCFPreferencesAnyUser, ByHost: Yes, Container: (null), Contents Need Refresh: Yes): Using kCFPreferencesAnyUser with a container is only allowed for System Containers, detaching from cfprefsd

Despite this, the Focus mode integration works correctly on my development Mac. However, I used to archive the app and copy it to my MacBook, and when I do that now the other Mac cannot open the app and shows an error. If I revert this change, I can bring the app to my other Mac as usual with the procedure: Product -> Archive, then from the Organizer: Distribute App -> Copy App, and finally copying the generated app to the Applications folder of the other MacBook. With the extension in place it no longer opens. During the archive phase I now even get this warning:

MyAppFocus.appex is an ExtensionKit extension and must be embedded in the parent app bundle's Extensions directory, but is embedded in the parent app bundle's ../../../BuildProductsPath/Release/MyApp.app/Contents/Extensions directory.

How can I solve this issue? If I roll back the commit with the SetFocusFilterIntent feature, the app can be copied and moved to the other Mac as before. Is this related to the extension, or to the fact that I had to use the new com.apple.security.application-groups entitlement? (A hedged note on the app group identifier follows this entry.)
0
1
66
2w
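On the app group warning in the post above: that cfprefsd line is commonly reported when a macOS app group uses the iOS-style "group." prefix. One frequently suggested arrangement, offered here as an assumption rather than a confirmed fix for the copy problem, is to prefix the macOS group identifier with the Team ID in both the app's and the extension's entitlements. TEAMID and the suffix below are placeholders.

<key>com.apple.security.application-groups</key>
<array>
    <!-- TEAMID.com.example.MyApp.shared is a placeholder identifier -->
    <string>TEAMID.com.example.MyApp.shared</string>
</array>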
Custom AppEntity nested dictionary
I am creating an AppIntent to be used with Shortcuts, and I would like to return a flexible dictionary of values with nested structures. As far as I understand, a custom AppEntity only uses displayRepresentation to store a title and subtitle, which are LocalizedStringResource types. Although I can convert my dictionary into a string, I have found no way in Shortcuts to retrieve the original structure and inspect its individual elements in subsequent actions. Is there a way to do this? (One possible approach is sketched after this entry.) Thank you in advance. Nick Karanatsios
0
0
105
2w
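One approach to the question above, sketched with illustrative names throughout: instead of flattening the dictionary into a string, model each value as an @Property on the entity so later Shortcuts actions can read the fields individually. For genuinely nested data, a property can itself be another entity type or an array of entities.

import AppIntents

struct MeasurementEntity: AppEntity, Identifiable {
    static var typeDisplayRepresentation = TypeDisplayRepresentation(name: "Measurement")
    static var defaultQuery = MeasurementQuery()

    var id: String

    @Property(title: "Name")
    var name: String

    @Property(title: "Value")
    var value: Double

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(name)", subtitle: "\(value)")
    }

    init(id: String, name: String, value: Double) {
        self.id = id
        self.name = name
        self.value = value
    }
}

struct MeasurementQuery: EntityQuery {
    func entities(for identifiers: [String]) async throws -> [MeasurementEntity] {
        // Look the requested entities up in the app's own store; empty in this sketch.
        []
    }
}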
App auto-update causes app features to malfunction
After a user turns on automatic App updates and the App updates itself, some features malfunction.
1. Some users still trigger old-version behavior. For example, the old version had a selection control that was removed from the UI in the new version, yet after an automatic update some users can still trigger that old selection control's functionality.
2. Data gets garbled. For example, the number 6 entered in the App arrives at the server as 5.4.
These are the problems we have found so far. They all go away if the App is deleted and re-downloaded. How can we avoid this kind of problem? The App contains Objective-C; could this be related to enabling BITCODE?
0
0
86
2w
Deliver/bundle entire Shortcut automations with an app
Is there any way to create complete Shortcuts automations and bundle them with my app? Specifically, I would like the user to be able to take a photo and open it with my app, or take a screenshot and open it with my app. Of course I could offer a Share extension, but going through the Share menu and selecting my app there is time consuming for the user. I would like the user to be able to configure the Action Button so that it takes a new picture and opens it with my app right away. I can, of course, offer the respective App Shortcuts and let the user combine them into a pipeline with the Take Screenshot or Take Photo system actions, but only power users would do this. Hence, I would like to bundle this complete pipeline with my app, so the user only has to assign the Action Button to this pipeline to use the feature. How do I go about this? I was thinking of exporting the shortcut into a file, bundling it with the app as a resource, and offering it via a Share action for the user to install, or sharing it on iCloud and adding the iCloud link to the UI of my app. What is the recommended approach? (A sketch of the App Shortcut half follows this entry.)
0
0
53
1w
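For the post above: a complete multi-action pipeline generally still has to be installed by the user (for example via an iCloud link surfaced in the app's UI), but the app-side half can ship automatically as an App Shortcut, which appears in the Shortcuts app and the Action Button picker with no setup. A minimal sketch, where OpenLatestCaptureIntent is a hypothetical intent the app would define (the phrase/shortTitle/systemImageName initializer assumes iOS 17 or later):

import AppIntents

struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: OpenLatestCaptureIntent(),
            phrases: ["Open my latest capture in \(.applicationName)"],
            shortTitle: "Open Latest Capture",
            systemImageName: "camera"
        )
    }
}

// Hypothetical intent the shortcut wraps; the real one would load the photo.
struct OpenLatestCaptureIntent: AppIntent {
    static var title: LocalizedStringResource = "Open Latest Capture"
    static var openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult {
        .result()
    }
}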
AppIntents EntityPropertyQuery, how does "Filter Entity where" work?
When you correctly implement EntityPropertyQuery on an AppEntity, Shortcuts exposes a "Find Entity" action that calls into entities(matching:mode:sortedBy:limit:). This is demoed in the "Dive into App Intents" session and works as expected. However, with this action you can change the "All Entity" input to a list variable, which changes the action text from "Find All Entity" to "Filter Entity where", still giving you the same filter, sort and limit options. This appears to work as expected too. But what's unexpected is that this filter action does not appear to call any method in my AppEntity code. It doesn't call entities(matching:mode:sortedBy:limit:). One would think there would need to be a filter(entities:matching:mode:sortedBy:limit:) to implement this functionality, but Shortcuts just seems to do it all on its own. I'm mostly wondering: how is this even working? Here's some example code:

import AppIntents

let books = [
    BookEntity(id: 0, title: "A Family Affair"),
    BookEntity(id: 1, title: "Atlas of the Heart"),
    BookEntity(id: 2, title: "Atomic Habits"),
    BookEntity(id: 3, title: "Memphis"),
    BookEntity(id: 4, title: "Run Rose Run"),
    BookEntity(id: 5, title: "The Maid"),
    BookEntity(id: 6, title: "The Match"),
    BookEntity(id: 7, title: "Where the Crawdads Sing"),
]

struct BookEntity: AppEntity, Identifiable {
    static var typeDisplayRepresentation: TypeDisplayRepresentation = "Book"

    var displayRepresentation: DisplayRepresentation {
        DisplayRepresentation(title: "\(title)")
    }

    static var defaultQuery = BookQuery()

    var id: Int

    @Property(title: "Title")
    var title: String

    init(id: Int, title: String) {
        self.id = id
        self.title = title
    }
}

struct BookQuery: EntityQuery {
    func entities(for identifiers: [Int]) async throws -> [BookEntity] {
        return identifiers.map { id in books[id] }
    }
}

extension BookQuery: EntityPropertyQuery {
    static var properties = QueryProperties {
        Property(\BookEntity.$title) {
            EqualToComparator { str in { book in book.title == str } }
            ContainsComparator { str in { book in book.title.contains(str) } }
        }
    }

    static var sortingOptions = SortingOptions {
        SortableBy(\BookEntity.$title)
    }

    func entities(
        matching comparators: [(BookEntity) -> Bool],
        mode: ComparatorMode,
        sortedBy: [Sort<BookEntity>],
        limit: Int?
    ) async throws -> [BookEntity] {
        books.filter { book in
            comparators.allSatisfy { comparator in
                comparator(book)
            }
        }
    }
}

The example Shortcut first invokes entities(matching:mode:sortedBy:limit:) with comparators=[], sortedBy=[], limit=nil to fetch all Book entities. Next, the filter step correctly applies the "title contains" filter but never calls entities(matching:mode:sortedBy:limit:) or even the body of the ContainsComparator. Yet the output is correctly filtered.
1
0
1.6k
Jul ’25
Shortcut to Send Address to Tesla Navigation
Hi, new to this forum. I recently discovered how to share a location in the Maps app with my Tesla to automatically start navigating. How cool is that! Being the nerd that I am, I wrote a shortcut to select a contact and share its address with my Tesla. That way, I don't leave the Maps app in memory using up my battery, and I don't have to go to all the trouble of swiping Maps out of memory. JK. Anyway, when I share the shortcut-selected address with the Tesla, it says "Error: this content could not be shared". To me this means the address as shared by the shortcut is not in the same format as when you share it directly from Maps. So the question is: how can I send a properly formatted location from my shortcut? Thanks...
1
1
1.7k
Nov ’25
Help with app automation permissions
Hi, I am trying to make an app that uses Spotify's web API to play songs. For the web API to work, Spotify needs to be running and my Mac has to be recognized as an active device. For my Mac to be recognized as an active device, I have to play a song for a very short amount of time (under a second). I want my app to do that automatically on launch. I already wrote the AppleScript in Automator, and it worked: it successfully launched Spotify, played a song for 0.5 seconds, then hid it. After writing the code, I tried to implement it in my app to run on startup, but I ran into a problem. The app only starts Spotify on my Mac and then gives me an error saying it isn't running. What do I do? Is this an issue with the app's permissions, or something else? I have given the app the "Apple Events" entitlement. This is the error I am getting (note that the app opens Spotify, after which it reports this):

Error: {
    NSAppleScriptErrorAppName = Spotify;
    NSAppleScriptErrorBriefMessage = "Application isn\U2019t running.";
    NSAppleScriptErrorMessage = "Spotify got an error: Application isn\U2019t running.";
    NSAppleScriptErrorNumber = "-600";
    NSAppleScriptErrorRange = "NSRange: {31, 8}";
}

This is the function I am using to drive Spotify:

func runAppleScript() {
    let appleScript = """
    tell application "Spotify"
        activate
        if player state is not playing then
            play track "spotify:track:5XSKC4d0y0DfcGbvDOiL93"
            delay 1
            pause
        end if
    end tell
    tell application "System Events"
        tell process "Spotify"
            set frontmost to true
            delay 1
            keystroke "h" using {command down}
        end tell
    end tell
    """

    var error: NSDictionary?
    if let scriptObject = NSAppleScript(source: appleScript) {
        scriptObject.executeAndReturnError(&error)
    }
    if let error = error {
        print("Error: \(error)")
    }
}

Any help is appreciated. Thank you in advance. (A hedged permissions checklist follows this entry.)
1
0
445
Jan ’25
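For the post above, a hedged checklist rather than a certain fix: when a sandboxed app runs NSAppleScript against another application, both of the following are normally needed before macOS will show the Automation consent prompt, and the System Events keystroke portion additionally requires Accessibility permission for the app itself. The usage string is illustrative.

In Info.plist:

<key>NSAppleEventsUsageDescription</key>
<string>Briefly plays a Spotify track so this Mac registers as an active device.</string>

In the .entitlements file:

<key>com.apple.security.automation.apple-events</key>
<true/>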
Check to make sure iTunes is playing - AppleEvent timed out. (-1712)
Hi folks, I've got some music that I want playing in iTunes all the time on an older system, but it sometimes stops. I tried making an AppleScript to check and play the music/playlist again if it stops, but I keep getting a timeout error. This is the AppleScript:

repeat
    tell application "iTunes"
        if player state is paused then
            tell application "iTunes" to play
        end if
        delay 30
    end tell
end repeat

I get this error:

AppleEvent timed out. iTunes got an error: AppleEvent timed out. (-1712)

I can't figure out why I'm getting a timeout error... anyone have any ideas? (One possible restructuring is sketched after this entry.)
1
0
394
Feb ’25
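One commonly suggested restructuring for the post above, a sketch rather than a guaranteed cure for -1712: keep the delay outside the iTunes tell block so no Apple event conversation stays open while waiting, and wrap the check in an explicit timeout so a momentarily busy iTunes doesn't trip the default limit.

repeat
    with timeout of 10 seconds
        tell application "iTunes"
            if player state is paused then play
        end tell
    end timeout
    delay 30 -- wait outside the tell block before checking again
end repeat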
"Not authorized to send Apple events to Terminal
We are trying to open an application, xyz.app. It worked fine up to macOS 15.1.1, but we are facing issues with 15.2 and 15.3. The application works fine when we navigate to xyz.app/Contents/MacOS/ and run the applet in that directory, but the error "Not authorized to send Apple events to Terminal" occurs when we try to open the app directly. We have tried all the available solutions, such as giving Full Disk Access to Terminal and the application, and adding our application under Automation in the Privacy & Security settings. Any help would be appreciated. Thanks! (One hedged troubleshooting step follows this entry.)
1
0
492
Feb ’25
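One hedged troubleshooting step for the post above: clear the stored Automation decisions so macOS prompts again the next time the applet sends an Apple event to Terminal. The bundle identifier below is a placeholder for xyz.app's real one.

# Reset Automation (Apple events) consent for just the one app...
tccutil reset AppleEvents com.example.xyz
# ...or for every app (all Automation prompts will reappear on next use):
tccutil reset AppleEvents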