Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic

Post | Replies | Boosts | Views | Activity

Take correctly sized screenshots with ScreenCaptureKit
I've been using CGWindowListCreateImage, which automatically creates an image with the size of the captured window. But SCScreenshotManager.captureImage(contentFilter:configuration:) always creates images with the width and height specified in the provided SCStreamConfiguration. I could set the size explicitly by reading SCWindow.frame or SCContentFilter.contentRect and multiplying the width and height by SCContentFilter.pointPixelScale, but that won't work if I want to keep the window shadow with SCStreamConfiguration.ignoreShadowsSingleWindow = false. What's the best way to take full-resolution screenshots of the correct size?

    import Cocoa
    import ScreenCaptureKit

    class ViewController: NSViewController {
        @IBOutlet weak var imageView: NSImageView!

        override func viewDidAppear() {
            imageView.imageScaling = .scaleProportionallyUpOrDown
            view.wantsLayer = true
            view.layer!.backgroundColor = .init(red: 1, green: 0, blue: 0, alpha: 1)

            Task {
                let windows = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true).windows
                let window = windows[0]
                let filter = SCContentFilter(desktopIndependentWindow: window)

                let configuration = SCStreamConfiguration()
                configuration.ignoreShadowsSingleWindow = false
                configuration.showsCursor = false
                configuration.width = Int(Float(filter.contentRect.width) * filter.pointPixelScale)
                configuration.height = Int(Float(filter.contentRect.height) * filter.pointPixelScale)
                print(filter.contentRect)

                let windowImage = try await SCScreenshotManager.captureImage(contentFilter: filter, configuration: configuration)
                imageView.image = NSImage(cgImage: windowImage,
                                          size: CGSize(width: windowImage.width, height: windowImage.height))
            }
        }
    }
5
0
930
Oct ’25
Terrible performance when using MusicKit's Artwork with UIKit
I'm building a UIKit app that reads the user's Apple Music library and displays it. MusicKit has the Artwork structure, which I need to use to display artwork images in the app. Since I'm not using SwiftUI I cannot use the ArtworkImage view, which is the recommended way of displaying those images, but the Artwork structure has a method that returns a URL for the image, which can be used to load it. My setup is really simple:

    extension MusicKit.Song {
        func imageURL(for cgSize: CGSize) -> URL? {
            return artwork?.url(
                width: Int(cgSize.width),
                height: Int(cgSize.height)
            )
        }

        func localImage(for cgSize: CGSize) -> UIImage? {
            guard let url = imageURL(for: cgSize),
                  url.scheme == "musicKit",
                  let data = try? Data(contentsOf: url) else {
                return nil
            }
            return .init(data: data)
        }
    }

Now, every time I access the .artwork property (so a lot of times) the main thread gets blocked and the console gets bombarded with messages like these:

    2023-07-26 11:49:47.317195+0200 Plum[998:297199] [Artwork] Failed to create color analysis for artwork: <MPMediaLibraryArtwork: 0x289591590> with error; Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.mediaartworkd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction." UserInfo={NSDebugDescription=The connection to service named com.apple.mediaartworkd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction.}
    2023-07-26 11:49:47.317262+0200 Plum[998:297199] [Artwork] Failed to create color analysis for artwork: file:///var/mobile/Media/iTunes_Control/iTunes/Artwork/Originals/4b/48d7b8d349d2de858413ae4561b6ba1b294dc7
    2023-07-26 11:49:47.323099+0200 Plum[998:297013] [Plum] IIOImageWriteSession:121: cannot create: '/var/mobile/Media/iTunes_Control/iTunes/Artwork/Caches/320x320/4b/48d7b8d349d2de858413ae4561b6ba1b294dc7.sb-f9c7943d-6ciLNp'error = 1 (Operation not permitted)

My guess is that the most performance-heavy task here is performing the color analysis for each artwork, but in that case the backgroundColor property should not be a stored property. I am not planning to use it anywhere, and if it is that expensive it should be a computed async property so it doesn't block the caller. I know I can move the call to a background thread, and that fixes the blocked main thread, but the loading times for each artwork are still terribly slow, which hurts the UX. SwiftUI's ArtworkImage loads the artworks much quicker and without the errors, so there must be a better way to do it.
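A common mitigation, not an official fix, is to resolve the artwork URL once and load the image data off the main thread with a cache in front. The sketch below is hypothetical: the ArtworkLoader name is invented, and it assumes Data(contentsOf:) keeps working for the musicKit:// URLs returned for library items.

    import UIKit
    import MusicKit

    // Hypothetical helper: loads MusicKit artwork off the main thread and caches the result.
    final class ArtworkLoader {
        static let shared = ArtworkLoader()
        private let cache = NSCache<NSURL, UIImage>()

        func image(for artwork: Artwork?, size: CGSize) async -> UIImage? {
            guard let url = artwork?.url(width: Int(size.width), height: Int(size.height)) else {
                return nil
            }
            if let cached = cache.object(forKey: url as NSURL) {
                return cached
            }
            // Data(contentsOf:) blocks, so run it on a detached background task.
            let data: Data? = await Task.detached(priority: .userInitiated) {
                try? Data(contentsOf: url)
            }.value
            guard let data, let image = UIImage(data: data) else { return nil }
            cache.setObject(image, forKey: url as NSURL)
            return image
        }
    }

In a cell you would then do something like Task { cell.imageView.image = await ArtworkLoader.shared.image(for: song.artwork, size: CGSize(width: 320, height: 320)) }, cancelling the task on reuse so scrolled-away cells don't keep loading.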
5
0
1.6k
Mar ’25
Issues After Switching to v3
I've been having issues with authorization after switching from v1 to v3, and have tried some of the suggestions provided in a reply to an earlier post of mine. I have tried to reach out to Apple Support a few times as well, but haven't received any support that helped me move forward. I have tested my token at https://jwt.io and get "Signature Verified", and I have tried multiple browsers in private/incognito mode. Now, when I try to sign in to my Apple Account to test my player, I receive an error that states "There is a Problem Connecting. There May be an Issue with Your Network." (which is not the case). I have tried everything I can think of; I'm at a loss and would appreciate any help to get my project moving forward. This is what I am seeing in the browser developer console (using Firefox):

    Authorization failed: AUTHORIZATION_ERROR: Unauthorized
    MKError               https://js-cdn.music.apple.com/musickit/v3/musickit.js:13
    authorize              https://js-cdn.music.apple.com/musickit/v3/musickit.js:28
    asyncGeneratorStep$w    https://js-cdn.music.apple.com/musickit/v3/musickit.js:28
    _next                   https://js-cdn.music.apple.com/musickit/v3/musickit.js:28
    media.mydomain.com:398:19
    https://media.mydomain.com/:398
5
0
709
Feb ’25
iOS AUv3 extension: no Icon shown in host
Hi, I'm working on an AUv3 project. The app itself displays my icon. However, the AUv3 extension does not display any icon in any host app (AUM, Drambo, etc.). I thought that the extension would inherit the host app's icon, but that does not appear to be the case. I tried adding the icon as a 1024x1024 file to the extension target and then updating my extension's Info.plist with a CFBundleIconFile key, but no luck either. It must surely be really easy. What am I missing? Thanks in advance for your help!
5
0
135
May ’25
Playing music with Musickit.js in Chrome and Firefox
I'm unable to play music using MusicKit.js in the Chrome or Firefox browsers. Even using the Apple guide here: https://js-cdn.music.apple.com/musickit/v1/index.html - I've added my Music Developer Token and song/album URL, but it only works in Safari and not in Chrome or Firefox. I'm unsure if this is a global issue, or if there is something I need to do to enable playback in other browsers, but as it stands it's not working for me. Thanks for any help in advance!
5
0
966
Jan ’25
iOS Speech Error on Mobile Simulator (Error fetching voices)
I'm writing a simple app for iOS and I'd like to be able to do some text-to-speech in it. I have a basic audio manager class with a "speak" function:

    import Foundation
    import AVFoundation

    class AudioManager {
        static let shared = AudioManager()
        var audioPlayer: AVAudioPlayer?

        var isPlaying: Bool {
            return audioPlayer?.isPlaying ?? false
        }

        var playbackPosition: TimeInterval = 0

        func playSound(named name: String) {
            guard let url = Bundle.main.url(forResource: name, withExtension: "mp3") else {
                print("Sound file not found")
                return
            }
            do {
                if audioPlayer == nil || !isPlaying {
                    audioPlayer = try AVAudioPlayer(contentsOf: url)
                    audioPlayer?.currentTime = playbackPosition
                    audioPlayer?.prepareToPlay()
                    audioPlayer?.play()
                } else {
                    print("Sound is already playing")
                }
            } catch {
                print("Error playing sound: \(error.localizedDescription)")
            }
        }

        func stopSound() {
            if let player = audioPlayer {
                playbackPosition = player.currentTime
                player.stop()
            }
        }

        func speak(text: String) {
            let synthesizer = AVSpeechSynthesizer()
            let utterance = AVSpeechUtterance(string: text)
            utterance.voice = AVSpeechSynthesisVoice(language: "en-GB")
            synthesizer.speak(utterance)
        }
    }

And my app shows text in a ScrollView:

    ScrollView {
        Text(self.description)
            .padding()
            .foregroundColor(.black)
            .font(.headline)
            .background(Color.gray.opacity(0))
    }.onAppear {
        AudioManager.shared.speak(text: self.description)
    }

However, the text doesn't get read out (in the simulator). I see some output in the console:

    Error fetching voices: Swift.DecodingError.dataCorrupted(Swift.DecodingError.Context(codingPath: [], debugDescription: "Invalid container metadata for _UnkeyedDecodingContainer, found keyedGraphEncodingNodeID", underlyingError: nil)). Using fallback voices.

I'm probably doing something wrong here, but I'm not sure what.
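Separately from the simulator's voice-fetching error, one thing worth checking: the AVSpeechSynthesizer in speak(text:) is a local variable, so it can be deallocated before the utterance is actually spoken. A minimal sketch of the usual fix, keeping the synthesizer alive as a property (the SpeechManager name is just for illustration):

    import AVFoundation

    final class SpeechManager {
        static let shared = SpeechManager()

        // Keep a strong reference; a synthesizer created as a local variable
        // may be deallocated before it starts speaking.
        private let synthesizer = AVSpeechSynthesizer()

        func speak(_ text: String, language: String = "en-GB") {
            let utterance = AVSpeechUtterance(string: text)
            utterance.voice = AVSpeechSynthesisVoice(language: language)
            synthesizer.speak(utterance)
        }
    }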
5
1
323
14h
Metal CIKernel instances with arbitrarily structured data arguments
Hi,

In the iOS 13 and macOS Catalina release notes it says: "Metal CIKernel instances now support arguments with arbitrarily structured data." I've been trying to use this functionality in a CIKernel with mixed results. I'm particularly interested in passing data in the form of a dynamically sized array. It seems to work up to a certain size; beyond that threshold the excess data is discarded and the kernel becomes unstable. I assume there is some kind of memory alignment issue going on, but I've tried various types in my array and always get a similar result. I have not found any documentation or sample code regarding this. It would be great to know how this is intended to work and what the limitations are. In the forums there are two similar unanswered questions about data arguments, so I'm sure there are a few out there with similar issues. Thanks! Michael
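For reference, this is roughly the Swift-side pattern being described: pack the array into a Data value and pass it in the kernel's arguments array. This is only a sketch under assumptions, the library/kernel names and parameter layout are placeholders, and it presumes a Data argument is delivered to the Metal function as a constant buffer:

    import CoreImage

    func applyKernel(to image: CIImage, values: [Float]) throws -> CIImage? {
        // Load the Core Image Metal library (compiled with -fcikernel).
        let url = Bundle.main.url(forResource: "MyKernels", withExtension: "ci.metallib")!
        let libraryData = try Data(contentsOf: url)
        let kernel = try CIKernel(functionName: "myKernel", fromMetalLibraryData: libraryData)

        // Pack the dynamically sized array into a single Data argument.
        let packed = values.withUnsafeBufferPointer { Data(buffer: $0) }

        return kernel.apply(extent: image.extent,
                            roiCallback: { _, rect in rect },
                            arguments: [image, packed, values.count])
    }

Whether padding and alignment of the packed data matter on the Metal side is exactly the open question in this post.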
5
0
440
Oct ’25
How to consume video from an RTSP service?
Hi,

It seems like it's pretty easy to consume HTTP Live Streaming content in an iOS app. Unfortunately, I need to consume media from an RTSP server. It seems to me that this is a very similar thing, and that all of the underpinnings for doing it ought to be present in iOS, but I'm having a devil of a time figuring out how to make it work without doing a lot of programming.

For starters, I know that there are web-based services that can consume an RTSP stream and rebroadcast it as an HTTP Live Stream that can be easily consumed by the media players in iOS. This won't work for me because my application needs to function in an environment where there is no internet access (it's on a private Wi-Fi network where the only other thing on the network is the device that is serving the RTSP stream).

Having read everything I can get my hands on and explored third-party and open-source solutions, I've compiled the following list of ideas:

1. Using an iOS build of the open-source ffmpeg library, which supports RTSP, I've come up with a test app that can receive the RTSP packets, decode them, create UIImages out of the frames, and display those frames on-screen. This provides a crude player, but performance is poor, most likely because ffmpeg can't take advantage of any hardware acceleration. It also doesn't provide me with any way to integrate the video stream into AVFoundation, so I'm on my own as far as saving the stream to a file, transcoding it, etc.

2. I know that the AVURLAsset class doesn't directly support the RTSP scheme. Since I have access to the undecoded RTSP packets via ffmpeg, I've thought it should be possible to implement RTSP support myself via a custom NSURLProtocol, essentially fooling AVFoundation into reading those packets as if they originated in a file. I'm not sure if this would work, since the raw packets coming from the RTSP server might lack the headers that would otherwise be present in data being read from a file. I'm not even sure if AVFoundation would recognize my custom protocol.

3. If a protocol doesn't work, I've considered that I might be able to implement my own local HTTP Live Streaming server that converts the RTSP packets into an HTTP stream that the media players can read. This sounds like a terribly convoluted solution to the problem, at best, and very difficult at worst.

4. Going back to solution (1), if I could speed up the decoding by using some iOS CoreVideo function instead of ffmpeg, this solution might be okay. However, I can't find any documentation for CoreVideo on iOS (Apple only documents it for OS X).

5. I'm certainly willing to license a third-party solution if it works well and provides good performance. Unfortunately, everything I've found so far is pretty crummy and mostly just leverages ffmpeg and/or VLC.

What is most disappointing to me is that nobody seems to be able or willing to provide a solution that neatly integrates with AVFoundation. I really want to make my RTSP stream available as an AVAsset so I can use it with AVFoundation players and other classes -- I don't want to build an app that relies on custom third-party code for everything.

Any ideas, tips, or advice would be greatly appreciated.

Thanks,
Frank
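A side note on ideas (1) and (4): one way to avoid the UIImage round trip for self-decoded frames is to wrap each decoded CVPixelBuffer in a CMSampleBuffer and hand it to AVSampleBufferDisplayLayer, which displays it with hardware assistance. This is a rough sketch only, it assumes you already have decoded pixel buffers with presentation timestamps, and it still does not make the stream an AVAsset:

    import AVFoundation
    import CoreMedia

    final class FrameDisplayer {
        let displayLayer = AVSampleBufferDisplayLayer()
        private var formatDescription: CMVideoFormatDescription?

        func enqueue(pixelBuffer: CVPixelBuffer, presentationTime: CMTime) {
            // Create a format description the first time; reuse it while the format stays constant.
            if formatDescription == nil {
                CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                             imageBuffer: pixelBuffer,
                                                             formatDescriptionOut: &formatDescription)
            }
            guard let format = formatDescription else { return }

            var timing = CMSampleTimingInfo(duration: .invalid,
                                            presentationTimeStamp: presentationTime,
                                            decodeTimeStamp: .invalid)
            var sampleBuffer: CMSampleBuffer?
            CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                                     imageBuffer: pixelBuffer,
                                                     formatDescription: format,
                                                     sampleTiming: &timing,
                                                     sampleBufferOut: &sampleBuffer)
            if let sampleBuffer {
                displayLayer.enqueue(sampleBuffer)
            }
        }
    }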
9
1
16k
Oct ’25
MacOS: AudioUnit packaged as .appex won't load when host app is sandboxed
Hi,

I'm working on an audio mixing app that comes with bundled Audio Units that provide some of the app's core functionality. For the next release of that app, we are planning to make two changes:

- make the app sandboxed
- package the bundled Audio Units as .appex bundles instead of .component bundles, so we don't need to take care of the installation at the correct spot in the file system

When trying this new approach, we run into problems where [[AVAudioUnitEffect alloc] initWithAudioComponentDescription:] crashes when trying to load our audio unit with the exception:

    AVAEInternal.h:109 [AUInterface.mm:468:AUInterfaceBaseV3: (AudioComponentInstanceNew(comp, &_auv2)): error -10863

Our audio unit has the sandboxSafe flag enabled, and it loads fine when the host app is not sandboxed, so I'm guessing I got the bundle ID/code signing requirements for the .appex correct. It seems that my .appex isn't even loaded, and the system rejects it because of its metadata. Maybe there is something wrong with the Info.plist generated by Juice?

    "BuildMachineOSBuild" => "23H222"
    "CFBundleDisplayName" => "elgato_sample_recorder"
    "CFBundleExecutable" => "ElgatoSampleRecorder"
    "CFBundleIdentifier" => "com.iwascoding.EffectLoader.samplerecorderAUv3"
    "CFBundleName" => "elgato_sample_recorder"
    "CFBundlePackageType" => "XPC!"
    "CFBundleShortVersionString" => "1.0.0.0"
    "CFBundleSignature" => "????"
    "CFBundleSupportedPlatforms" => [
      0 => "MacOSX"
    ]
    "CFBundleVersion" => "1.0.0.0"
    "DTCompiler" => "com.apple.compilers.llvm.clang.1_0"
    "DTPlatformBuild" => "24C94"
    "DTPlatformName" => "macosx"
    "DTPlatformVersion" => "15.2"
    "DTSDKBuild" => "24C94"
    "DTSDKName" => "macosx15.2"
    "DTXcode" => "1620"
    "DTXcodeBuild" => "16C5032a"
    "LSMinimumSystemVersion" => "10.13"
    "NSExtension" => {
      "NSExtensionAttributes" => {
        "AudioComponents" => [
          0 => {
            "description" => "Elgato Sample Recorder"
            "factoryFunction" => "elgato_sample_recorderAUFactoryAUv3"
            "manufacturer" => "Manu"
            "name" => "Elgato: Elgato Sample Recorder"
            "sandboxSafe" => 1
            "subtype" => "Znyk"
            "tags" => [
              0 => "Effects"
            ]
            "type" => "aufx"
            "version" => 65536
          }
        ]
      }
      "NSExtensionPointIdentifier" => "com.apple.AudioUnit-UI"
      "NSExtensionPrincipalClass" => "elgato_sample_recorderAUFactoryAUv3"
    }
    "NSHighResolutionCapable" => 1

Any ideas what I am missing?
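One quick diagnostic, a hedged sketch rather than a fix, is to ask the system from inside the sandboxed host whether the component described in the plist above is registered at all; the four-char codes are taken from that dump:

    import AVFoundation
    import AudioToolbox

    // Convert a 4-character code such as "aufx" into an OSType.
    func fourCC(_ code: String) -> OSType {
        code.utf8.reduce(OSType(0)) { ($0 << 8) | OSType($1) }
    }

    let description = AudioComponentDescription(componentType: fourCC("aufx"),
                                                componentSubType: fourCC("Znyk"),
                                                componentManufacturer: fourCC("Manu"),
                                                componentFlags: 0,
                                                componentFlagsMask: 0)

    let matches = AVAudioUnitComponentManager.shared().components(matching: description)
    print("Registered components:", matches.map(\.name))

If the list comes back empty in the sandboxed host but not in the unsandboxed one, that would point at the extension never being registered, rather than at instantiation itself.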
4
0
455
Feb ’25
Is AVPlayerViewController not supported on macCatalyst?
When building an iOS application for Mac Catalyst, a link error like the one below occurs:

    Undefined symbol: OBJC_CLASS$_AVPlayerViewController

The AVPlayerViewController documentation seems to say Mac Catalyst is supported, but what is the reality?
https://developer.apple.com/documentation/avkit/avplayerviewcontroller?language=objc

Each version of the environment is as follows:

- Xcode 16.2
- macOS deployment target: macOS 10.15
- iOS deployment target: iOS 13.0

Thank you for your support.
4
0
429
Mar ’25
Alternative for crashing API MPMediaItemArtwork
When setting the now playing info for playing media in MPNowPlayingInfoCenter we can set artwork. But it seems the Apple API for creating the artwork is crashing on iOS 18 (FB15145734). On iOS 17 this gave the warning that the completion handler was not run on the main thread. I've tried to seek help here: https://stackoverflow.com/questions/78989543/swift-data-race-with-appkit-mpmediaitemartwork-function/78990231?noredirect=1#comment139277425_78990231 but it seems that it's not possible to override the completion handler, and therefore it's up to Apple to fix this issue.

    .task {
        await MainActor.run {
            let nowPlayingInfoCenter = MPNowPlayingInfoCenter.default()
            var nowPlayingInfo = [String: Any]()

            let image = NSImage(named: "image")!

            // warning: data race detected: @MainActor function at MPMediaItemArtwork/ContentView.swift:22 was not called on the main thread
            nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size, requestHandler: { _ in
                // Not on main thread here!
                return image
            })

            nowPlayingInfoCenter.nowPlayingInfo = nowPlayingInfo
        }
    }

I'm wondering if there is an alternative method to set the now playing artwork?
4
0
921
Feb ’25
Raycasting VNFaceLandmarkRegion2D
Hello, Does anyone have a recipe on how to raycast VNFaceLandmarkRegion2D points obtained from a frame's capturedImage? More specifically, how to construct the "from" parameter of the frame's raycastQuery from a VNFaceLandmarkRegion2D point? Do the points need to be flipped vertically? Is there any other transformation that needs to be performed on the points prior to passing them to raycastQuery?
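For what it's worth, here is a rough sketch of the conversion in question. It assumes the landmark points are normalized to the face observation's bounding box (Vision's lower-left-origin convention) and that ARFrame.raycastQuery(from:allowing:alignment:) expects normalized image coordinates with a top-left origin, so the vertical flip at the end is precisely the assumption that needs confirming:

    import ARKit
    import Vision
    import simd

    func raycastQueries(for region: VNFaceLandmarkRegion2D,
                        faceBoundingBox: CGRect,   // VNFaceObservation.boundingBox (normalized)
                        in frame: ARFrame) -> [ARRaycastQuery] {
        let width = CVPixelBufferGetWidth(frame.capturedImage)
        let height = CVPixelBufferGetHeight(frame.capturedImage)

        return region.normalizedPoints.map { point in
            // Landmark point (normalized to the face box) -> pixel coordinates of capturedImage.
            let imagePoint = VNImagePointForFaceLandmarkPoint(
                vector_float2(Float(point.x), Float(point.y)),
                faceBoundingBox, width, height)

            // Re-normalize and flip Y for ARKit's image coordinate space (assumption).
            let normalized = CGPoint(x: imagePoint.x / CGFloat(width),
                                     y: 1.0 - imagePoint.y / CGFloat(height))

            return frame.raycastQuery(from: normalized,
                                      allowing: .estimatedPlane,
                                      alignment: .any)
        }
    }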
4
0
296
Sep ’25
How to use the SpeechDetector Module
I am trying to use the SpeechDetector module in the Speech framework along with SpeechTranscriber, and it is giving me an error:

    Cannot convert value of type 'SpeechDetector' to expected element type 'Array.ArrayLiteralElement' (aka 'any SpeechModule')

Below is how I am using it:

    let speechDetector = Speech.SpeechDetector()
    let transcriber = SpeechTranscriber(locale: Locale.current,
                                        transcriptionOptions: [],
                                        reportingOptions: [.volatileResults],
                                        attributeOptions: [.audioTimeRange])
    speechAnalyzer = try SpeechAnalyzer(modules: [transcriber, speechDetector])
4
2
402
Aug ’25
AUv3 recent "Failed to find component with type..." frequent issues
I've been generating new Audio Unit Extension apps with Xcode 16 (and newer), and although they generally work initially, it is easy (although I'm not sure how to do it reliably) to cause the app to no longer be able to instantiate the audio unit. Generally the call to AVAudioUnit.findComponent fails and SimplePlayEngine hits the fatalError("Failed to find component with type..."). In the most recent project, merely adding files to the extension (without making any use of them) caused it to go off the rails. If I "Archive" the app+plugin, there is no Audio Unit extension in the bundle. If I switch to the Audio Unit extension scheme and build it, it's fine. If I look at the build folder in Library/Developer/Xcode/project_folder, the extension_name.appex is there. Any ideas? If I can coax an unmodified Audio Unit extension project to exhibit this behavior I'll attach it here. Right now what I have has code I don't want to share.
4
1
720
Jan ’25
After playing an HDR video on iPhone for a while, the HDR effect disappears and the screen brightness decreases
When I use AVPlayer to obtain the video frame CVPixelBufferRef of an HDR video, and use AVSampleBufferDisplayLayer to display it on the screen, after a period of time the HDR video content and the screen gradually darken, losing the HDR effect.

Steps to reproduce:

1. Create an AVPlayer to loop an HDR video, specifying the video frame format as kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange.
2. Create a timer to get the video frame CVPixelBufferRef at 30 frames per second.
3. Use AVSampleBufferDisplayLayer to display the CVPixelBufferRef on the screen.
4. Don't operate the phone and wait for a period of time (such as 40 minutes); the HDR effect disappears and the screen darkens.

Notes:

- You need to use an iPhone device running iOS 18.5 or below.
- You need to ensure that the HDR video is played in a loop, that is, that the screen continues to display HDR content, and then wait; depending on the device, this takes 20-40 minutes.
- In the iPhone Photos app, the same problem occurs after playing an HDR video in a loop for a long time.

Expected results: When rendering HDR content for a long time, the HDR effect is always preserved, and the HDR content and screen are not darkened.

Current results: After about 20-40 minutes, the HDR effect disappears and the screen darkens.
4
0
620
Jul ’25