Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

Posts under Media Technologies topic
"Conduct marketing and request technological support."
A few months ago, I had the opportunity to receive a 2018 iMac, and I’ve been using it to create content for my social media. I was truly impressed by the power of its processors. Even with this older model, I’ve been able to grow my presence online—something I couldn’t achieve with newer computers from other brands that I previously purchased. I would love to become a promoter of your brand in the gaming world. All I ask for is technological support with more recent equipment and a minimal payment for collaborating with you. I am genuinely interested in being part of your company and leveraging the potential and reputation of Apple to reach even greater heights.
Replies: 1 · Boosts: 0 · Views: 158 · Mar ’25
Populating Now Playing with Objective-C
Hello. I am attempting to display the music playing inside my app in Now Playing. I've tried a few different methods and keep running into unknown issues. I'm new to Objective-C and Apple development, so I'm at a loss for how to continue. Currently, I have an external call to viewDidLoad upon initialization. Then, when I'm ready to play the music, I call playMusic. I have it hardcoded to play an MP3 called "1". I believe I have all the signing and capabilities set up, since the music keeps playing after I exit the app. However, there is nothing in Now Playing. There are no errors or issues that I can see while the app is running. This is the only file I have in Xcode relating to this feature. Please let me know where I'm going wrong or if there is another object I need to use!

```objc
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#import <MediaPlayer/MediaPlayer.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController <AVAudioPlayerDelegate>
@property (nonatomic, strong) AVPlayer *player;
@property (nonatomic, strong) MPRemoteCommandCenter *commandCenter;
@property (nonatomic, strong) MPMusicPlayerController *controller;
@property (nonatomic, strong) MPNowPlayingSession *nowPlayingSession;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    NSLog(@"viewDidLoad started.");
    [self setupAudioSession];
    [self initializePlayer];
    [self createNowPlayingSession];
    [self configureNowPlayingInfo];
    NSLog(@"viewDidLoad completed.");
}

- (void)setupAudioSession {
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    NSError *setCategoryError = nil;
    if (![audioSession setCategory:AVAudioSessionCategoryPlayback error:&setCategoryError]) {
        NSLog(@"Error setting category: %@", [setCategoryError localizedDescription]);
    } else {
        NSLog(@"Audio session category set.");
    }
    NSError *activationError = nil;
    if (![audioSession setActive:YES error:&activationError]) {
        NSLog(@"Error activating audio session: %@", [activationError localizedDescription]);
    } else {
        NSLog(@"Audio session activated.");
    }
}

- (void)initializePlayer {
    NSString *soundFilePath = [NSString stringWithFormat:@"%@/base/game/%@",
                               [[NSBundle mainBundle] resourcePath], @"bgm/1.mp3"];
    if (!soundFilePath) {
        NSLog(@"Audio file not found.");
        return;
    }
    NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
    self.player = [AVPlayer playerWithURL:soundFileURL];
    NSLog(@"Player initialized with URL: %@", soundFileURL);
}

- (void)createNowPlayingSession {
    self.nowPlayingSession = [[MPNowPlayingSession alloc] initWithPlayers:@[self.player]];
    NSLog(@"Now Playing Session created with players: %@", self.nowPlayingSession.players);
}

- (void)configureNowPlayingInfo {
    MPNowPlayingInfoCenter *infoCenter = [MPNowPlayingInfoCenter defaultCenter];
    CMTime duration = self.player.currentItem.duration;
    Float64 durationSeconds = CMTimeGetSeconds(duration);
    CMTime currentTime = self.player.currentTime;
    Float64 currentTimeSeconds = CMTimeGetSeconds(currentTime);
    NSDictionary *nowPlayingInfo = @{
        MPMediaItemPropertyTitle: @"Example Title",
        MPMediaItemPropertyArtist: @"Example Artist",
        MPMediaItemPropertyPlaybackDuration: @(durationSeconds),
        MPNowPlayingInfoPropertyElapsedPlaybackTime: @(currentTimeSeconds),
        MPNowPlayingInfoPropertyPlaybackRate: @(self.player.rate)
    };
    infoCenter.nowPlayingInfo = nowPlayingInfo;
    NSLog(@"Now Playing info configured: %@", nowPlayingInfo);
}

- (void)playMusic {
    [self.player play];
    [self createNowPlayingSession];
    [self configureNowPlayingInfo];
}

- (void)pauseMusic {
    [self.player pause];
    [self configureNowPlayingInfo];
}

@end
```
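A common cause of an empty Now Playing entry is that no remote command handlers are registered; the system typically won't surface an app in Now Playing until it can also control it. A minimal sketch of the idea (shown in Swift for brevity; with MPNowPlayingSession you would register on the session's own remoteCommandCenter rather than the shared one):

```swift
import MediaPlayer

// Sketch: register play/pause handlers so the system has commands to show.
let commandCenter = MPRemoteCommandCenter.shared()
_ = commandCenter.playCommand.addTarget { _ in
    // Resume your AVPlayer here.
    return .success
}
_ = commandCenter.pauseCommand.addTarget { _ in
    // Pause your AVPlayer here.
    return .success
}
```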
Replies: 2 · Boosts: 0 · Views: 573 · Feb ’25
Notification interruptions
My app Balletrax is a music player for people to use while they teach ballet. It used to be that you could silence notifications while the app was in use, but now the customer seems to have to know how to use Focus mode, remember to turn it on and off, and check which notifications they do and don't want to allow. Is there no way to silence all notifications while the app is in use?
Replies: 0 · Boosts: 0 · Views: 65 · Apr ’25
AVPlayer error: Too many open files
For some users in production, there's a high probability that after launching the app, using AVPlayer to play any local audio resource fails with the following error. Restarting the app doesn't help.

issue: [error: Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (24), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x30311f270 {Error Domain=NSPOSIXErrorDomain Code=24 "Too many open files"}}

I've checked the code, and there aren't actually multiple AVPlayers playing simultaneously. What could be causing this?
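POSIX error 24 is EMFILE ("Too many open files"), which means the whole process has exhausted its file-descriptor limit, so the culprit is often a descriptor leak elsewhere (files, sockets, caches) rather than AVPlayer itself. A small diagnostic sketch for watching the count over time:

```swift
import Darwin

/// Counts the file descriptors currently open in this process
/// by probing every possible descriptor with fcntl.
func openFileDescriptorCount() -> Int {
    var count = 0
    for fd in 0..<getdtablesize() where fcntl(fd, F_GETFD) != -1 {
        count += 1
    }
    return count
}

// Log periodically; a steadily climbing number suggests a leak.
print("Open file descriptors: \(openFileDescriptorCount())")
```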
Replies: 0 · Boosts: 0 · Views: 424 · Feb ’25
[FairPlay Streaming] Time Manipulation Detection in DRM License Servers
Our team conducted security testing and found a potential vulnerability in FairPlay license acquisition. Our QA engineer manually changed the device's system date and time (setting it 4 days into the future) and was still able to obtain a license response and initiate playback on an iOS device. However, on an Android device, the license acquisition failed. Can you please tell us whether time manipulation detection is available in the FairPlay SDK?
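For what it's worth, FairPlay policy (rental/lease windows, expiry) is evaluated by the key server (your KSM), so time-manipulation checks generally have to be implemented server-side. Purely as an illustration, with the endpoint, header name, and spcData all invented: the client could report its clock alongside the SPC so the server can reject requests whose clock drifts too far from its own.

```swift
import Foundation

// Hypothetical sketch only: endpoint, header, and spcData are invented.
// The authoritative clock comparison belongs on the license server.
let spcData = Data() // placeholder for the SPC from AVContentKeySession
var request = URLRequest(url: URL(string: "https://license.example.com/fps")!)
request.httpMethod = "POST"
request.setValue("\(Date().timeIntervalSince1970)", forHTTPHeaderField: "X-Client-Time")
request.httpBody = spcData
// Server side (pseudologic): reject when |serverTime - clientTime| > tolerance,
// and never trust the client clock for expiry decisions.
```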
Replies: 0 · Boosts: 0 · Views: 425 · Feb ’25
AVSpeechSynthesizer & Bluetooth Issues
Hello, I have a CarPlay navigation app and use AVSpeechSynthesizer to speak directions to the user. Everything works great on my CarPlay simulator, as well as when plugged into my GMC truck. However, I found out yesterday that for one of my users with a Ford truck, the audio would cut in and out. After much troubleshooting, I was able to replicate this on my own truck when using Bluetooth to connect to CarPlay. My user was also using Bluetooth. Has anyone else experienced this? Is there a fix for the problem?

```swift
import SwiftUI
import AVFoundation

class TextToSpeechService: NSObject, ObservableObject, AVSpeechSynthesizerDelegate {
    private var speechSynthesizer = AVSpeechSynthesizer()
    static let shared = TextToSpeechService()

    override init() {
        super.init()
        speechSynthesizer.delegate = self
    }

    func configureAudioSession() {
        speechSynthesizer.delegate = self
        do {
            try AVAudioSession.sharedInstance().setCategory(.playback, mode: .voicePrompt, options: [.mixWithOthers, .allowBluetooth])
        } catch {
            print("Failed to set audio session category: \(error.localizedDescription)")
        }
    }

    func speak(_ text: String) {
        Task(priority: .high) {
            let speechUtterance = AVSpeechUtterance(string: text)
            speechUtterance.voice = AVSpeechSynthesisVoice(language: AVSpeechSynthesisVoice.currentLanguageCode())
            try AVAudioSession.sharedInstance().setActive(true, options: .notifyOthersOnDeactivation)
            speechSynthesizer.speak(speechUtterance)
        }
    }

    func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer, didFinish utterance: AVSpeechUtterance) {
        Task {
            stopSpeech()
            try AVAudioSession.sharedInstance().setActive(false)
        }
    }

    func stopSpeech() {
        speechSynthesizer.stopSpeaking(at: .immediate)
    }
}
```
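One thing worth checking: .allowBluetooth enables the low-bandwidth hands-free (HFP) profile, and on some head units the HFP link dropping in and out is a plausible cause of choppy prompts. If the app only needs output, it may be worth trying A2DP instead; a minimal sketch, assuming the option swap is acceptable for this app:

```swift
import AVFoundation

// Sketch: prefer high-quality, output-only A2DP instead of HFP.
// .duckOthers lowers other audio during prompts; adjust to taste.
do {
    try AVAudioSession.sharedInstance().setCategory(
        .playback,
        mode: .voicePrompt,
        options: [.duckOthers, .allowBluetoothA2DP]
    )
} catch {
    print("Audio session setup failed: \(error)")
}
```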
Replies: 0 · Boosts: 0 · Views: 434 · Feb ’25
Apple TV: detecting when the HDMI-connected device is turned off
Hello, we have an HLS streaming app on Apple TV, and our streams are DRM-protected. We have a problem when the connected display is turned off. For example, a user starts watching our HLS DRM-protected content, and after some time turns off the display (a monitor or TV connected via HDMI). Our app does not detect that the HDMI-connected device has been turned off. Is there any way, in Swift, to detect that the HDMI-connected device is turned off?
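I'm not aware of a dedicated tvOS API for sensing HDMI sink power state, but one avenue worth probing (a sketch, not a confirmed solution) is the audio route: if the route to the HDMI output drops when the display powers off, a route-change observer would notice.

```swift
import AVFoundation

// Sketch: watch for audio route changes, which may accompany an HDMI link drop.
// Caveat: this may not fire at all if the sink powers off without renegotiating.
NotificationCenter.default.addObserver(
    forName: AVAudioSession.routeChangeNotification,
    object: nil,
    queue: .main
) { note in
    let reasonValue = note.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt
    let reason = reasonValue.flatMap(AVAudioSession.RouteChangeReason.init) ?? .unknown
    if reason == .oldDeviceUnavailable {
        // The previous output went away; pause playback, release DRM session, etc.
        print("Audio route lost: \(reason)")
    }
}
```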
Replies: 0 · Boosts: 0 · Views: 421 · Jan ’25
PHPhotoLibrary.shared().performChanges raised error 3303
I want to modify a photo's EXIF information. That means writing the original image through CGImageDestinationAddImageFromSource and supplying the modified EXIF dictionary as the properties argument. Here is the complete process:

```swift
static func setImageMetadata(asset: PHAsset, exif: [String: Any]) {
    let options = PHContentEditingInputRequestOptions()
    options.canHandleAdjustmentData = { (adjustmentData) -> Bool in
        return true
    }
    asset.requestContentEditingInput(with: options, completionHandler: { input, map in
        guard let inputNN = input else { return }
        guard let url = inputNN.fullSizeImageURL else { return }
        let output = PHContentEditingOutput(contentEditingInput: inputNN)
        let adjustmentData = PHAdjustmentData(formatIdentifier: AppInfo.appBundleId(),
                                              formatVersion: AppInfo.appVersion(),
                                              data: Data())
        output.adjustmentData = adjustmentData
        let outputURL = output.renderedContentURL
        guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return }
        guard let dest = CGImageDestinationCreateWithURL(outputURL as CFURL, UTType.jpeg.identifier as CFString, 1, nil) else { return }
        CGImageDestinationAddImageFromSource(dest, source, 0, exif as CFDictionary)
        let d = CGImageDestinationFinalize(dest)
        // d is true, and I checked the content of outputURL: the image was written
        // correctly, it converts to UIImage, and the image is OK.
        PHPhotoLibrary.shared().performChanges {
            let changeReq = PHAssetChangeRequest(for: asset)
            changeReq.contentEditingOutput = output
        } completionHandler: { succ, err in
            if !succ {
                print(err) // 3303 here, always!
            }
        }
    })
}
```
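One thing worth ruling out before digging deeper (a diagnostic sketch, not a confirmed fix): performChanges is rejected when the app only has limited photo access, so logging the authorization level just before the edit can help narrow down error 3303.

```swift
import Photos

// Sketch: confirm full read/write access before submitting the change request.
let status = PHPhotoLibrary.authorizationStatus(for: .readWrite)
switch status {
case .authorized:
    print("Full access; change requests should be permitted.")
case .limited:
    print("Limited access; editing arbitrary assets may be rejected.")
default:
    PHPhotoLibrary.requestAuthorization(for: .readWrite) { newStatus in
        print("Requested access, got: \(newStatus.rawValue)")
    }
}
```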
Replies: 1 · Boosts: 0 · Views: 391 · Feb ’25
SystemAudio Capture API Fails with OSStatus error 1852797029 (kAudioCodecIllegalOperationError)
Issue Description: I'm implementing a system audio capture feature using AudioHardwareCreateProcessTap and AudioHardwareCreateAggregateDevice. The app successfully creates the tap and aggregate device, but when starting the IO procedure with AudioDeviceStart, it sometimes fails with OSStatus error 1852797029 ("The operation couldn’t be completed."). The error occurs inconsistently, which makes it particularly difficult to debug and reproduce.
Questions:
- Has anyone encountered this intermittent "nope" error code (0x6e6f7065) when working with system audio capture?
- Are there specific conditions or system states that might trigger this error sporadically?
- Are there any known workarounds for handling this intermittent failure case?
Any insights or guidance would be greatly appreciated.
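Not the root cause, but useful when logging: OSStatus values like this are usually four-character codes, which you can decode to make logs readable. A small sketch:

```swift
import Foundation

/// Decodes an OSStatus as a four-character code when all bytes are printable.
func fourCharCode(_ status: OSStatus) -> String {
    let n = UInt32(bitPattern: status)
    let bytes = [24, 16, 8, 0].map { UInt8((n >> $0) & 0xFF) }
    if bytes.allSatisfy({ $0 >= 0x20 && $0 < 0x7F }) {
        return String(bytes: bytes, encoding: .ascii) ?? "\(status)"
    }
    return "\(status)"
}

print(fourCharCode(1852797029)) // "nope" (kAudioCodecIllegalOperationError)
```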
Replies: 0 · Boosts: 0 · Views: 140 · May ’25
CoreMediaErrorDomain error -12927 with HEVC and DRM
I can't play video content with HEVC and DRM together. Tested HEVC only: OK. Tested DRM+AVC: OK. Tested two players (Clappr/Stevie and BitMovin). The master, variants, and EXT-X-MAPs are downloaded OK, the DRM keys are OK, and then, for instance with BitMovin Player:

[BMP] [Player] [Error] Event: SourceError, Data: {"code":2001,"data":{"message":"The operation couldn’t be completed. (CoreMediaErrorDomain error -12927.)","code":-12927},"message":"Source Error. The operation couldn’t be completed. (CoreMediaErrorDomain error -12927.)","timestamp":1740320663.4505711,"type":"onSourceError"}

code: 2001 [Data code: -12927, message: The operation couldn’t be completed. (CoreMediaErrorDomain error -12927.), underlying error: Error Domain=CoreMediaErrorDomain Code=-12927 "(null)"]

Attachments: 4k-master.m3u8.txt, 4k.m3u8.txt, 4k-audio.m3u8.txt
Replies: 1 · Boosts: 0 · Views: 542 · Feb ’25
Media Library Access not working in my App since iOS 18.2
Hi there, I have an app which accesses the media library to save and load files. Since iOS 18.2, access to the media library has stopped working. I've noticed that our app no longer shows in the list of apps with access to Files (Privacy & Security -> Files & Folders). The weird part is that one iPhone on iOS 18.3.1 can access the files but others can't, on the same iOS version 18.3.1. Tests on the Simulator (Mac) work fine as well. My Info.plist file has had the keys to access the media library for a long time and hasn't changed (at least in the last 4 years), including the key "Privacy - Media Library Usage Description". I've also noticed that the popup requesting access to the media library, shown when using the app for the first time, doesn't appear anymore. We also request access to the network (Wi-Fi), and that message still shows up, but not the media library one. I'm using Visual Studio with Xamarin on a Mac. I'd really appreciate any help, because this is very odd behavior and it started with iOS 18.2.
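If the permission prompt never appears, one diagnostic worth trying (a sketch, assuming the usage-description key really is present in the built Info.plist) is to request media-library authorization explicitly and log the result. Shown in Swift here; the Xamarin bindings call the same MPMediaLibrary API underneath.

```swift
import MediaPlayer

// Sketch: explicitly request media library access and log the outcome.
// Requires the "Privacy - Media Library Usage Description" key in Info.plist.
switch MPMediaLibrary.authorizationStatus() {
case .notDetermined:
    MPMediaLibrary.requestAuthorization { status in
        print("Media library authorization: \(status.rawValue)")
    }
case .denied, .restricted:
    print("Access denied/restricted; direct the user to Settings.")
case .authorized:
    print("Media library access granted.")
@unknown default:
    break
}
```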
Replies: 0 · Boosts: 0 · Views: 392 · Feb ’25
PhotosPicker obtains the result information of the selected video
In our AVP (Apple Vision Pro) project, a picker pops up that should filter to spatial videos only. When one of the spatial videos is selected, the selection result comes back empty. How can we obtain the video the user selected, along with the file's path and URL? The code is as follows:

```swift
PhotosPicker(selection: $selectedItem, matching: .videos) {
    Text("Choose a spatial photo or video")
}

func loadTransferable(from imageSelection: PhotosPickerItem) -> Progress {
    return imageSelection.loadTransferable(type: URL.self) { result in
        DispatchQueue.main.async {
            // guard imageSelection == self.imageSelection else { return }
            print("Loaded selection: \(result)")
            switch result {
            case .success(let url?):
                self.selectSpatialVideoURL = url
                print("Got video URL: \(url)")
            case .success(nil):
                break // Handle the success case with an empty value.
            case .failure(let error):
                print("Spatial video error: \(error)")
            }
        }
    }
}
```
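loadTransferable(type: URL.self) coming back empty for videos is a known pain point; a common workaround is a custom Transferable that imports the picked video by copying it to a file the app owns. A minimal sketch (the Movie type is illustrative, not part of PhotosUI):

```swift
import CoreTransferable
import UniformTypeIdentifiers

// Illustrative helper type: imports the picked video by copying it
// into a temporary location the app owns.
struct Movie: Transferable {
    let url: URL

    static var transferRepresentation: some TransferRepresentation {
        FileRepresentation(contentType: .movie) { movie in
            SentTransferredFile(movie.url)
        } importing: { received in
            let copy = URL.temporaryDirectory.appendingPathComponent(received.file.lastPathComponent)
            try? FileManager.default.removeItem(at: copy)
            try FileManager.default.copyItem(at: received.file, to: copy)
            return Movie(url: copy)
        }
    }
}

// Usage sketch:
// let movie = try await selectedItem?.loadTransferable(type: Movie.self)
// movie?.url is then a stable file URL for playback or inspection.
```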
Replies: 4 · Boosts: 0 · Views: 134 · May ’25
Changing instrument with AVMIDIControlChangeEvent bankSelect
I've been trying to use AVMIDIControlChangeEvent with a bankSelect message type to change the instrument the sequencer uses on an AVMusicTrack, with no luck.

I started with the Apple AVAEMixerSample, converting the initial setup/loading and the portions dealing with the sequencer to Swift. I got that working and playing the "bluesyRiff", and then modified it to play individual notes. So my createAndSetupSequencer looked like:

```swift
func createAndSetupSequencer() {
    sequencer = AVAudioSequencer(audioEngine: engine)
    // guard let midiFileURL = Bundle.main.url(forResource: "bluesyRiff", withExtension: "mid") else {
    //     print("failed guard trying to get URL for bluesyRiff")
    //     return
    // }
    let track = sequencer.createAndAppendTrack()
    var currTime = 1.0
    for i: UInt32 in 0...8 {
        let newNoteEvent = AVMIDINoteEvent(channel: 0, key: 60 + i, velocity: 64, duration: 2.0)
        track.addEvent(newNoteEvent, at: AVMusicTimeStamp(currTime))
        currTime += 2.0
    }
}
```

The notes played, so I then replaced the gs_instruments sound bank with GeneralUser GS MuseScore v1.442, first by trying:

```swift
guard let soundBankURL = Bundle.main.url(forResource: "GeneralUser GS MuseScore v1.442", withExtension: "sf2") else { return }
do {
    try sampler.loadSoundBankInstrument(at: soundBankURL, program: 0x001C, bankMSB: 0x79, bankLSB: 0x08)
} catch {
    // ...
}
```

This appears to work; the instrument (8, which is "Funk Guitar") plays. If I change to bankLSB: 0x00 I get the "Palm Muted Guitar". So I know that the soundfont has these instruments.

Stuff goes off the rails when I try to change the instruments in createAndSetupSequencer. Putting

```swift
let programChange = AVMIDIProgramChangeEvent(channel: 0, programNumber: 0x001C)
let bankChange = AVMIDIControlChangeEvent(channel: 0, messageType: AVMIDIControlChangeEvent.MessageType.bankSelect, value: 0x00)
track.addEvent(programChange, at: AVMusicTimeStamp(1.0))
track.addEvent(bankChange, at: AVMusicTimeStamp(1.0))
```

just before my add-note loop doesn't produce any change. Loading bankLSB 8 (Funk) in sampler.loadSoundBankInstrument and trying to change with bankSelect 0 (Palm Muted) in createAndSetupSequencer results in instrument 8 (Funk) playing, not Palm Muted. Loading bankLSB 0 (Palm Muted) and trying to change with bankSelect 8 (Funk) doesn't work either; 0 (Palm Muted) plays.

I also tried sampler.loadInstrument(at: soundBankURL), and then I always get the first instrument in the soundfont file (piano), no matter what values I put in my programChange/bankChange. I've also changed the time in track.addEvent to 0, 1.0, 3.0, etc., with no success.

sampler.loadSoundBankInstrument specifies two UInt8 parameters, bankMSB and bankLSB, while the AVMIDIControlChangeEvent bankSelect value is UInt32, suggesting it might be some combination of bankMSB and bankLSB. But the documentation makes no mention of what this should look like. I tried various combinations (0x7908, 0x0879, etc.) to no avail.

I will also point out that I am able to successfully execute other control change events. For example, adding

```swift
if i == 1 {
    let portamentoOnEvent = AVMIDIControlChangeEvent(channel: 0, messageType: AVMIDIControlChangeEvent.MessageType.portamento, value: 0xFF)
    track.addEvent(portamentoOnEvent, at: AVMusicTimeStamp(currTime))
    let portamentoRateEvent = AVMIDIControlChangeEvent(channel: 0, messageType: AVMIDIControlChangeEvent.MessageType.portamentoTime, value: 64)
    track.addEvent(portamentoRateEvent, at: AVMusicTimeStamp(currTime))
}
```

does produce a change in the sound. (As an aside, a definition of what portamento time is, other than "the rate of portamento", would be welcome. Is it notes/second? Hz/minute? Beats/hour?)

I was able to get the instrument to change in a different program using MusicPlayer and a series of MusicTrackNewMIDIChannelEvent calls on a track, but those operate on a MusicTrack, not the AVMusicTrack the sequencer uses.

Has anyone been successful in switching instruments through an AVMIDIControlChangeEvent, or have any feedback on how to do this?
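One point of comparison, not a confirmed answer: in raw MIDI, bank select is two controllers (CC0 for the MSB, CC32 for the LSB) followed by a program change, and that trio can be sent straight to the sampler's underlying audio unit, bypassing the sequencer's event types entirely. A sketch of that workaround, assuming an AVAudioUnitSampler named sampler:

```swift
import AVFoundation
import AudioToolbox

// Sketch: send the standard MIDI bank-select sequence
// (CC0 MSB, CC32 LSB, then program change) directly to the sampler.
func selectInstrument(on sampler: AVAudioUnitSampler,
                      program: UInt32, bankMSB: UInt32, bankLSB: UInt32,
                      channel: UInt32 = 0) {
    let unit = sampler.audioUnit
    MusicDeviceMIDIEvent(unit, 0xB0 | channel, 0x00, bankMSB, 0) // CC0: bank select MSB
    MusicDeviceMIDIEvent(unit, 0xB0 | channel, 0x20, bankLSB, 0) // CC32: bank select LSB
    MusicDeviceMIDIEvent(unit, 0xC0 | channel, program, 0, 0)    // program change
}

// Usage sketch:
// selectInstrument(on: sampler, program: 0x1C, bankMSB: 0x79, bankLSB: 0x08)
```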
Replies: 0 · Boosts: 0 · Views: 352 · Mar ’25
AVAudioMixerNode outputVolume range?
According to the header file, the outputVolume property's supported range is 0.0-1.0:

```objc
/*! @property outputVolume
    @abstract The mixer's output volume.
    @discussion This accesses the mixer's output volume (0.0-1.0, inclusive).
*/
@property (nonatomic) float outputVolume;
```

However, when setting the volume to 2.0 the audio does indeed play louder. Is the header file out of date, and if so, what is the supported range for outputVolume? Thanks
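Whatever the documented answer turns out to be, a defensive sketch is to clamp to the header's stated range so playback doesn't depend on undocumented gain behavior:

```swift
import AVFoundation

// Sketch: stay within the documented 0.0...1.0 range; values above 1.0
// may apply extra gain on some OS versions but risk clipping.
func setMixerVolume(_ mixer: AVAudioMixerNode, to value: Float) {
    mixer.outputVolume = min(max(value, 0.0), 1.0)
}
```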
Replies: 0 · Boosts: 0 · Views: 83 · Apr ’25
Apple News Publisher: How To Successfully Apply
Hi there, can anyone tell me how to get approved as an Apple News Publisher in 2025? We attempted in 2024, but received this message from Apple support: "Thank you for your interest in Apple News. At this time, we're not accepting new applications." When I inquired further, I got this second response: "Apple News is no longer accepting unsolicited applications. To learn more about Apple News requirements, visit the Apple News support page. If you have any feedback, please use this form to send us your comments. Keep in mind that while we read all feedback, we are unable to respond to each submission individually." My questions are:
- Is this still the case? (Especially when you are a legit local news outlet.)
- Is there a link to apply as a news publisher? I don't seem to have that option at all.
Thanks for any feedback.
Replies: 0 · Boosts: 0 · Views: 83 · Apr ’25
Playing audio live from Bluetooth headset on iPhone speaker
Hi guys, I am having an issue live-streaming audio from a Bluetooth headset and playing it live on the iPhone speaker. I am able to redirect audio back to the headset, but this is not what I want. The issue happens when I try to override the output: the iPhone switches to the speaker but also switches the microphone. This is an example of the code:

```swift
import AVFoundation

class AudioRecorder {
    let player: AVAudioPlayerNode
    let engine: AVAudioEngine
    let audioSession: AVAudioSession
    let audioSessionOutput: AVAudioSession

    init() {
        self.player = AVAudioPlayerNode()
        self.engine = AVAudioEngine()
        self.audioSession = AVAudioSession.sharedInstance()
        self.audioSessionOutput = AVAudioSession()

        do {
            try self.audioSession.setCategory(AVAudioSession.Category.playAndRecord, options: [.defaultToSpeaker])
            try self.audioSessionOutput.setCategory(AVAudioSession.Category.playAndRecord, options: [.allowBluetooth]) // enables Bluetooth HFP profile
            try self.audioSession.setMode(AVAudioSession.Mode.default)
            try self.audioSession.setActive(true)
            // try self.audioSession.overrideOutputAudioPort(.speaker) // doesn't work
        } catch {
            print(error)
        }

        let input = self.engine.inputNode
        self.engine.attach(self.player)
        let bus = 0
        let inputFormat = input.inputFormat(forBus: bus)
        self.engine.connect(self.player, to: engine.mainMixerNode, format: inputFormat)
        input.installTap(onBus: bus, bufferSize: 512, format: inputFormat) { (buffer, time) -> Void in
            self.player.scheduleBuffer(buffer)
            print(buffer)
        }
    }

    public func start() {
        try! self.engine.start()
        self.player.play()
    }

    public func stop() {
        self.player.stop()
        self.engine.stop()
    }
}
```

I am not sure if this is a bug or not. Can somebody point me in the right direction? Is there a way to design custom audio routing? I would also appreciate some good documentation besides the AVFoundation docs.
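One avenue to try (a sketch, not a guaranteed fix): keep the Bluetooth microphone as the preferred input while overriding only the output to the speaker. Whether iOS honors the split route depends on the headset profile; HFP in particular may force both directions back to the headset.

```swift
import AVFoundation

// Sketch: prefer the Bluetooth (HFP) microphone as input,
// then override only the output route to the built-in speaker.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord, options: [.allowBluetooth])
    try session.setActive(true)
    if let btInput = session.availableInputs?.first(where: { $0.portType == .bluetoothHFP }) {
        try session.setPreferredInput(btInput)
    }
    try session.overrideOutputAudioPort(.speaker)
} catch {
    print("Routing failed: \(error)")
}
```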
Replies: 0 · Boosts: 0 · Views: 326 · Mar ’25
How to detect HDCP support in Safari
I am playing FairPlay + multi-key content (fMP4) in Safari. I want to implement logic that distinguishes between SD and HD video quality: play HD if HDCP is supported, and SD if HDCP is not supported. I have already confirmed that HDCP enforcement is the default, and that a black screen is shown in non-HDCP environments. What I want is to improve the user experience by switching to SD or HD as appropriate, depending on HDCP support, when playing DRM content. Question: Is there an API or method that can detect HDCP support in Safari via JavaScript or other means? Or is there a way to infer it indirectly?
Replies: 0 · Boosts: 0 · Views: 186 · Mar ’25