Dive into the technical aspects of audio on your device, including codecs, format support, and customization options.

Audio Documentation

Posts under the Audio subtopic

Each post below is listed with its reply count, boost count, view count, and most recent activity.
Listener for kAudioProcessPropertyIsRunningOutput
I'm trying to set up a listener for kAudioProcessPropertyIsRunningOutput, but it is never triggered. I do get calls for kAudioProcessPropertyIsRunning and kAudioProcessPropertyDevices, but not for kAudioProcessPropertyIsRunningInput or kAudioProcessPropertyIsRunningOutput.

class MyDelegate: PropertyListenerDelegate {
    func propertiesChanged(properties: [AudioObjectPropertyAddress]) {
        print(properties)
    }
}

var myDelegate = MyDelegate()
var processes = try AudioHardwareSystem.shared.processes
for process in processes {
    process.delegates += [myDelegate]
    try process.addListener(forProperties: [AudioObjectPropertyAddress(
        mSelector: kAudioPropertyWildcardPropertyID,
        mScope: kAudioObjectPropertyScopeWildcard,
        mElement: kAudioObjectPropertyElementWildcard)])
}

Xcode 16.1, macOS 15.0.1
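As a cross-check, the same selector can also be observed through the C-level HAL API. The sketch below is an assumption on my part, not a confirmed workaround: it registers a block listener for kAudioProcessPropertyIsRunningOutput on a process object whose AudioObjectID is already known (looking that ID up, e.g. via kAudioHardwarePropertyProcessObjectList, is left out).

import CoreAudio
import Foundation

// Registers a block listener for kAudioProcessPropertyIsRunningOutput on a process object
// whose AudioObjectID has already been obtained elsewhere.
func listenForIsRunningOutput(on processObjectID: AudioObjectID) -> OSStatus {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioProcessPropertyIsRunningOutput,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain
    )
    // Fires whenever the process starts or stops producing output (selector requires macOS 14.2+).
    return AudioObjectAddPropertyListenerBlock(processObjectID, &address, DispatchQueue.main) { _, _ in
        print("isRunningOutput changed for process \(processObjectID)")
    }
}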
0 replies · 0 boosts · 454 views · Dec ’24
MusicKit lastPlayedDate always nil
I am having trouble accessing the lastPlayedDate for any given album or track using MusicKit. The value is always nil for every album and track I tested. As far as I know, this is not a property that has to be fetched separately, like tracks for example. I am running this on my physical iPhone 12 on iOS 18.1.1 with Xcode 16.1. The albums and tracks have definitely been played multiple times before, and the app has library permission obtained via MusicAuthorization.request(). This post mentions the same problem but offers no solution. Thanks for any help.
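For reference, this is the kind of minimal fetch I would expect to surface the value; a sketch only, assuming a library request is the right access path here (MusicLibraryRequest requires iOS 16 or later):

import MusicKit

func printLastPlayedDates() async throws {
    // Make sure the app is authorized for the music library first.
    guard await MusicAuthorization.request() == .authorized else { return }

    // Fetch library albums and print their last-played dates.
    let request = MusicLibraryRequest<Album>()
    let response = try await request.response()
    for album in response.items {
        print(album.title, album.lastPlayedDate?.description ?? "nil")
    }
}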
0 replies · 0 boosts · 426 views · Dec ’24
PTTFramework w/ AVAudioSession
Hi all, I have spent a lot of time reading the tech note and watching the WWDC video that introduce the PTTFramework on iOS. I currently have a custom setup where I am using AVAudioEngine to schedule and play buffers that are being streamed through a call. I am looking to use the PTTFramework to allow a user to trigger this push-to-talk behavior from the lock screen and the various places within the system UI that it provides. However, I am unsure what the correct behavior is regarding the handling of the audio session.

Right now I am using .playback when there is no active voice transmission, so that devices such as AirPods can stay in A2DP mode where applicable, and I transition to the .playAndRecord category only when the mic input should become active. Following this change in my AVAudioEngine manager, I then manually activate and deactivate the audio session when the engine is either playing/recording or idle. The documentation states that you should not attempt to activate or deactivate your audio session directly, but allow the framework to handle it. Does that mean that I need to either call the request-to-transmit delegate function or set an active participant on the channel manager first, and then wait for the didBecomeActive delegate method to trigger before I actually attempt to play or record any audio? (I am using fullDuplex mode currently.) I noticed that the delegate method will only trigger if the audio session wasn't active before doing one of the above (setting an active participant, requesting transmit).

Lastly, the PTTFramework documentation also mentions support for PTT accessories, and I notice the didBeginTransmittingFrom parameter has a handsfreeButton case. Is there any documentation or resources for what is actually supported out of the box? I am currently handling a lot of the push-to-talk over Bluetooth LE myself and wanted to make sure there isn't overlap with what the system provides. Thank you!
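For what it's worth, the flow I am assuming from the docs is sketched below: ask the framework to begin transmitting, and only start the engine once the framework hands the activated session to the delegate. This is a sketch under that assumption, showing only the relevant PTChannelManagerDelegate callbacks rather than a complete conforming implementation; the class and method names around them are my own placeholders.

import PushToTalk
import AVFAudio

final class PTTAudioCoordinator {
    let engine = AVAudioEngine()
    var channelManager: PTChannelManager?

    // Called from our UI: ask the framework to begin transmitting
    // instead of activating the audio session ourselves.
    func userPressedTalk(on channelUUID: UUID) {
        channelManager?.requestBeginTransmitting(channelUUID: channelUUID)
    }

    // PTChannelManagerDelegate: the framework has activated the session,
    // so it is now safe to start playing/recording.
    func channelManager(_ channelManager: PTChannelManager, didActivate audioSession: AVAudioSession) {
        try? engine.start()
    }

    // PTChannelManagerDelegate: stop the engine once the session is deactivated.
    func channelManager(_ channelManager: PTChannelManager, didDeactivate audioSession: AVAudioSession) {
        engine.stop()
    }
}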
2 replies · 0 boosts · 532 views · Dec ’24
AirPods Audio Sample Rate Issue on macOS Sequoia
I’m experiencing an unusual audio issue with AirPods on macOS Sequoia while developing VoIP applications like Zoom and FaceTime. When AirPods are connected, the other party’s voice sometimes sounds unnaturally stretched (approximately twice as long). This problem can be temporarily fixed by switching the sound output settings from AirPods to speakers and then back to AirPods.

From our analysis, the issue appears to be related to the sample rate provided by AudioObjectGetPropertyData. Here’s what we’ve observed: when the issue occurs, the AudioStreamBasicDescription.sampleRate for AirPods is reported as 48000; under normal conditions it is reported as 24000. It seems like the system is mistakenly returning a sample rate that doesn’t match the AirPods’ actual settings, perhaps defaulting to a system speaker value. Once the output setting is toggled, the correct sampleRate (24000) is retrieved. This discrepancy causes our application to transmit the audio stream at 48000, leading to the distorted playback.

Has anyone encountered a similar issue or know how to resolve it?
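For anyone comparing notes, this is roughly how a device's rate can be read straight from the HAL; a minimal sketch (device lookup and error handling omitted) that queries the device's nominal sample rate rather than a stream's AudioStreamBasicDescription:

import CoreAudio

func nominalSampleRate(of deviceID: AudioObjectID) -> Float64? {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyNominalSampleRate,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain
    )
    var rate: Float64 = 0
    var size = UInt32(MemoryLayout<Float64>.size)
    let status = AudioObjectGetPropertyData(deviceID, &address, 0, nil, &size, &rate)
    return status == noErr ? rate : nil
}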
2 replies · 0 boosts · 557 views · Dec ’24
Delay w/ new AudioTap API when the system device is a Bluetooth device
I'm capturing audio from other applications on macOS to mix it with other sources in a real-time streaming application. I noticed that audio data captured via the new tapping mechanism introduced in macOS 14.2 arrives delayed in my app when the macOS system output device is a Bluetooth headphone, e.g. Apple AirPods. Sometimes this delay is about 300-400 milliseconds, which makes it unusable for live streaming, because the audio is out of sync with the video and with audio captured from other devices. What is confusing to me is that this also happens when my app does not even use that output device. Is this a known issue? Is there a way around it?
1 reply · 0 boosts · 369 views · Dec ’24
Increased delay when AUGraph's output device is system output
I'm using an AUGraph to mix audio from different sources for a real-time streaming application. Whenever the audio device used as the graph's output device is also the Mac's default output device, the measured latency increases by about 35 milliseconds for wired devices. Any idea why this is? Is there a way around this besides nagging the user not to use the system output in our app?
0 replies · 0 boosts · 284 views · Dec ’24
AudioServicesPlaySystemSound not playing through BluetoothA2DP device
Hello, we have an application that plays some sounds via the system sound APIs from the AudioToolbox framework:

AudioServicesCreateSystemSoundID(url as CFURL, &soundID)
AudioServicesPlaySystemSoundWithCompletion(soundID)

We make sure that an active audio session is available before playing the system sound. But when the device is connected to a Bluetooth A2DP device, the sounds are played through the device speaker and not through the Bluetooth A2DP device. Our audio session is configured with the following options: [.allowBluetooth, .defaultToSpeaker, .allowBluetoothA2DP]. Sounds played with AVAudioPlayer using similar code are routed to the Bluetooth A2DP device. Is this a bug in the AudioToolbox framework?
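For comparison, this is the AVAudioPlayer path that does route to the A2DP device for us; a minimal sketch, assuming a bundled file named alert.caf (the file name is only an illustration):

import AVFAudio
import Foundation

var player: AVAudioPlayer?

func playAlertOverA2DP() {
    let session = AVAudioSession.sharedInstance()
    // With the plain .playback category, output follows the current route, including
    // Bluetooth A2DP; the .allowBluetoothA2DP option is only relevant with .playAndRecord.
    try? session.setCategory(.playback)
    try? session.setActive(true)

    guard let url = Bundle.main.url(forResource: "alert", withExtension: "caf") else { return }
    player = try? AVAudioPlayer(contentsOf: url)
    player?.play()
}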
2 replies · 0 boosts · 504 views · Dec ’24
Swift iOS CallKit audio resource contention
I noticed the following behavior with CallKit when receiving a VoIP push notification:

When the app is in the foreground and a CallKit incoming call banner appears, pressing the answer button directly causes the speaker indicator in the CallKit interface to turn on. However, the audio is not actually activated (the iPhone's orange microphone indicator does not light up). In the same foreground scenario, if I expand the CallKit banner before answering the call, the speaker indicator does not turn on, but the orange microphone indicator does light up, and audio works as expected. When the app is in the background or not running, the incoming call banner works as expected: I can answer the call directly without expanding the banner, the speaker does not turn on automatically, and the orange microphone indicator lights up as it should.

Why is there a difference in behavior between answering directly from the banner versus expanding it first when the app is in the foreground? Is there a way to ensure consistent audio activation behavior across these scenarios? I tried reconfiguring the audio when answering a call, but an error occurred during setActive, preventing the configuration from succeeding.

let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setActive(false)
    try audioSession.setCategory(.playAndRecord, mode: .voiceChat, options: [.defaultToSpeaker])
    try audioSession.setActive(true, options: [])
} catch {
    print("Failed to activate audio session: \(error)")
}
action.fulfill()
}

Error Domain=NSOSStatusErrorDomain Code=561017449 "Session activation failed" UserInfo={NSLocalizedDescription=Session activation failed}
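For what it's worth, the pattern I'm now experimenting with is the one commonly described for CallKit (an assumption on my part, not something confirmed above): configure the session when answering, leave activation to CallKit, and only start audio I/O in the didActivate callback. A minimal sketch:

import CallKit
import AVFAudio

final class CallAudioController: NSObject, CXProviderDelegate {
    func providerDidReset(_ provider: CXProvider) {}

    func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {
        // Configure only; do not call setActive here — CallKit activates the session itself.
        let session = AVAudioSession.sharedInstance()
        try? session.setCategory(.playAndRecord, mode: .voiceChat, options: [.allowBluetooth])
        action.fulfill()
    }

    // CallKit has activated the session; start the audio engine / audio unit here.
    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        // start audio I/O
    }

    // CallKit has deactivated the session; stop audio I/O here.
    func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
        // stop audio I/O
    }
}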
1 reply · 0 boosts · 462 views · Dec ’24
Swift iOS CallKit audio resource contention
I noticed the following behavior with CallKit when receiving a VoIP push notification:

When the app is in the foreground and a CallKit incoming call banner appears, pressing the answer button directly causes the speaker indicator in the CallKit interface to turn on. However, the audio is not actually activated (the iPhone's orange microphone indicator does not light up). In the same foreground scenario, if I expand the CallKit banner before answering the call, the speaker indicator does not turn on, but the orange microphone indicator does light up, and audio works as expected. When the app is in the background or not running, the incoming call banner works as expected: I can answer the call directly without expanding the banner, the speaker does not turn on automatically, and the orange microphone indicator lights up as it should.

Why is there a difference in behavior between answering directly from the banner versus expanding it first when the app is in the foreground? Is there a way to ensure consistent audio activation behavior across these scenarios?
0 replies · 0 boosts · 361 views · Dec ’24
AVAudioSession's "availableInputs" not updated in time
// addObserver for routeChangeNotification is called here
func testAudioRoute() {
    // My app is a VoIP app, so I need "playAndRecord" and "allowBluetooth"
    try? AVAudioSession.sharedInstance().setCategory(.playAndRecord, options: [.duckOthers, .allowBluetooth, .allowBluetoothA2DP])
    NotificationCenter.default.addObserver(self, selector: #selector(currentRouteChanged(noti:)), name: AVAudioSession.routeChangeNotification, object: nil)
}

// Print the "availableInputs" once a notification arrives
@objc func currentRouteChanged(noti: Notification) {
    let availableInputs = AVAudioSession.sharedInstance().availableInputs?.compactMap({ $0.portType }) ?? []
    let currentRouteInputs = AVAudioSession.sharedInstance().currentRoute.inputs.compactMap({ $0.portType })
    let currentRouteOutputs = AVAudioSession.sharedInstance().currentRoute.outputs.compactMap({ $0.portType })
    print("willtest: \navailableInputs=\(availableInputs), \ncurrentRouteInputs=\(currentRouteInputs), \ncurrentRouteOutputs=\(currentRouteOutputs)")

    /* When BT (AirPods Pro 2) is CONNECTED, the notification handler prints the following, which is correct:
    ----------------------------------------------------------
    willtest: availableInputs=[__C.AVAudioSessionPort(_rawValue: MicrophoneBuiltIn), __C.AVAudioSessionPort(_rawValue: BluetoothHFP)],
    currentRouteInputs=[],
    currentRouteOutputs=[__C.AVAudioSessionPort(_rawValue: BluetoothA2DPOutput)]
    ----------------------------------------------------------
    When BT (AirPods Pro 2) is DISCONNECTED, the notification handler prints the following, which is wrong:
    ----------------------------------------------------------
    availableInputs=[__C.AVAudioSessionPort(_rawValue: MicrophoneBuiltIn), __C.AVAudioSessionPort(_rawValue: BluetoothHFP)],
    currentRouteInputs=[],
    currentRouteOutputs=[__C.AVAudioSessionPort(_rawValue: Speaker)]
    */
}

So my question is: why does "availableInputs" still contain the __C.AVAudioSessionPort(_rawValue: BluetoothHFP) item even though I have already disconnected the BT device (put the AirPods in the case)? By the way, if I tap the "Manual" button after disconnecting the BT device, it also prints the "wrong" value for "availableInputs", and it becomes normal again after about 3-4 seconds.
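In case it helps narrow down the timing, this is a small sketch (my own addition, not part of the original code) that logs the route-change reason and the previous route from the notification's userInfo; it doesn't fix the stale value, it just makes visible which change each notification corresponds to:

import AVFAudio
import Foundation

// Installs an extra observer that logs the route-change reason and the previous route.
func installRouteChangeLogger() -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: AVAudioSession.routeChangeNotification,
        object: nil,
        queue: .main
    ) { note in
        let rawReason = note.userInfo?[AVAudioSessionRouteChangeReasonKey] as? UInt ?? 0
        let reason = AVAudioSession.RouteChangeReason(rawValue: rawReason)
        let previous = note.userInfo?[AVAudioSessionRouteChangePreviousRouteKey] as? AVAudioSessionRouteDescription
        print("routeChange reason: \(String(describing: reason)), previous outputs: \(previous?.outputs.map(\.portType.rawValue) ?? [])")
    }
}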
4 replies · 0 boosts · 467 views · Dec ’24
No charts genres for some storefronts
Hello, this is about the Get Catalog Top Charts Genres endpoint:

GET https://api.music.apple.com/v1/catalog/{storefront}/genres

I noticed that for some storefronts, no genre is returned. You can try with the following storefront values: France (fr), Poland (pl), Kyrgyzstan (kg), Uzbekistan (uz), Turkmenistan (tm). Is that a bug or is it on purpose? Thank you.
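For reproduction, this is roughly the request I'm making; a minimal sketch, where developerToken stands in for a valid Apple Music developer token:

import Foundation

func fetchTopChartGenres(storefront: String, developerToken: String) async throws {
    let url = URL(string: "https://api.music.apple.com/v1/catalog/\(storefront)/genres")!
    var request = URLRequest(url: url)
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")

    // Print the HTTP status and raw body so empty genre lists are easy to spot.
    let (data, response) = try await URLSession.shared.data(for: request)
    let status = (response as? HTTPURLResponse)?.statusCode ?? -1
    print(status, String(data: data, encoding: .utf8) ?? "")
}

// Example call for one of the affected storefronts:
// try await fetchTopChartGenres(storefront: "fr", developerToken: developerToken)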
0 replies · 0 boosts · 407 views · Dec ’24
aumi AUv3 with AVAudioEngine connectMIDI to multiple nodes
Hi! I am creating an aumi AUv3 extension and I am trying to achieve simultaneous connections to multiple other AVAudioNodes. I would like to know if it is possible to route the MIDI to different outputs inside the render process in the AUv3. I am using connectMIDI(_:to:format:eventListBlock:) to connect the output of the AUv3 to multiple AVAudioNodes. However, when I send MIDI out of the AUv3, it gets sent to all the nodes connected to it. I can't seem to find any documentation on how to route the MIDI to only one of the connected nodes. Is this possible?
3 replies · 0 boosts · 566 views · Nov ’24
AVAudioEngineConfigurationChange Clearing AVPlayerNode
Hi all, I am working on an app where I have live prompts playing, in addition to a voice channel that sometimes becomes active. Right now I am using two different AVAudioSession configurations so that we only switch to a mic-enabled mode when we actually need input from the mic. These are defined below. When just using the device hardware, everything works as expected: the modes change and playback continues as needed. However, when using Bluetooth devices such as AirPods, where the switch from A2DP to HFP is needed, I am getting an AVAudioEngineConfigurationChange notification. In response I am tearing down the engine and creating a new one with the same two player nodes. This does work fine and there are no crashes, except that all the audio I have scheduled on a player node has now been cleared. All the completion blocks marked with .dataPlayedBack return the second this event happens, leaving me in a state where I have a valid engine setup again but no idea what actually played or was errantly marked as such. Is this the expected behavior when getting a configuration change notification?

Adding some information on my audio graph for context. These are the parts of the graph; I disconnect them when getting this event and do the same setup on the new engine:

private var inputEngine: AVAudioEngine
private var audioEngine: AVAudioEngine
private let voicePlayerNode: AVAudioPlayerNode
private let promptPlayerNode: AVAudioPlayerNode

audioEngine.attach(voicePlayerNode)
audioEngine.attach(promptPlayerNode)

audioEngine.connect(
    voicePlayerNode,
    to: audioEngine.mainMixerNode,
    format: voiceNodeFormat
)

audioEngine.connect(
    promptPlayerNode,
    to: audioEngine.mainMixerNode,
    format: nil
)

An example of how I am scheduling playback, and where that completion is firing even if it didn't actually play:

private func scheduleVoicePlayback(_ id: AudioPlaybackSample.Id, buffer: AVAudioPCMBuffer) async throws {
    guard !voicePlayerQueue.samples.contains(where: { $0 == id }) else {
        return
    }

    seprateQueue.append(buffer)

    if !isVoicePlaying {
        activateAudioSession()
    }

    voicePlayerQueue.samples.append(id)

    if !voicePlayerNode.isPlaying {
        voicePlayerNode.play()
    }

    if let convertedBuffer = buffer.convert(to: voiceNodeFormat) {
        await voicePlayerNode.scheduleBuffer(convertedBuffer, completionCallbackType: .dataPlayedBack)
    } else {
        throw AudioPlaybackError.failedToConvert
    }

    voiceSampleHasBeenPlayed(id)
}

And lastly my audio session configuration, if it's useful:

extension AVAudioSession {
    static func setDefaultCategory() {
        do {
            try sharedInstance().setCategory(
                .playback,
                options: [
                    .duckOthers,
                    .interruptSpokenAudioAndMixWithOthers
                ]
            )
        } catch {
            print("Failed to set default category? \(error.localizedDescription)")
        }
    }

    static func setVoiceChatCategory() {
        do {
            try sharedInstance().setCategory(
                .playAndRecord,
                options: [
                    .defaultToSpeaker,
                    .allowBluetooth,
                    .allowBluetoothA2DP,
                    .duckOthers,
                    .interruptSpokenAudioAndMixWithOthers
                ]
            )
        } catch {
            print("Failed to set category? \(error.localizedDescription)")
        }
    }
}
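For context on how I'm reacting to the notification, a minimal sketch of the observer is below (my own simplification, not the production code): it restarts the existing engine in place instead of rebuilding it, which is one of the variants I have tried; whether previously scheduled buffers survive the change is exactly what I'm unsure about.

import AVFAudio
import Foundation

// Observe configuration changes on a given engine and try to restart it in place.
func observeConfigurationChanges(on engine: AVAudioEngine) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: .AVAudioEngineConfigurationChange,
        object: engine,
        queue: .main
    ) { _ in
        // The engine is stopped by the system at this point; restart it if needed.
        if !engine.isRunning {
            try? engine.start()
        }
    }
}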
1 reply · 0 boosts · 627 views · Nov ’24
mp3 audio plays on my device and simulator, but some users have issues
Hi, I just released an app which is live. I have a strange issue: while the audio files in the app play fine on my device, some users are unable to hear them. One friend said it played yesterday but not today. Any idea why? The files are mp3; I see them in the Build Phases and in the project, obviously. Here's the audio view code, thank you!

import SwiftUI
import AVFoundation

struct MeditationView: View {
    @State private var player: AVAudioPlayer?
    @State private var isPlaying = false
    @State private var selectedMeditation: String?
    var isiPad = UIDevice.current.userInterfaceIdiom == .pad
    let columns = [GridItem(.flexible()), GridItem(.flexible())]
    let tracks = ["Intro": "intro.mp3",
                  "Peace": "mysoundbath1.mp3",
                  "Serenity": "mysoundbath2.mp3",
                  "Relax": "mysoundbath3.mp3"]

    var body: some View {
        VStack {
            VStack {
                VStack {
                    Image("dhvani").resizable().aspectRatio(contentMode: .fit)
                        .frame(width: 120)
                    Text("Enter the world of Dhvani soundbath sessions, click lotus icon to play.")
                        .font(.custom("Times New Roman", size: 20))
                        .lineLimit(nil)
                        .multilineTextAlignment(.leading)
                        .fixedSize(horizontal: false, vertical: true)
                        .italic()
                        .foregroundStyle(Color.ashramGreen)
                        .padding()
                }
                LazyVGrid(columns: columns, spacing: 10) {
                    ForEach(tracks.keys.sorted(), id: \.self) { track in
                        Button {
                            self.playMeditation(named: tracks[track]!)
                        } label: {
                            Image("lotus")
                                .resizable()
                                .frame(width: 40, height: 40)
                                .background(Color.ashramGreen)
                                .cornerRadius(10)
                        }
                        Text(track)
                            .font(.custom("Times New Roman", size: 22))
                            .foregroundStyle(Color.ashramGreen)
                            .italic()
                    }
                }
                HStack(spacing: 20) {
                    Button(action: {
                        self.togglePlayPause()
                    }) {
                        Image(systemName: isPlaying ? "playpause.fill" : "play.fill")
                            .resizable()
                            .frame(width: 20, height: 20)
                            .foregroundColor(Color.ashramGreen)
                    }
                    Button(action: {
                        self.stopMeditation()
                    }) {
                        Image(systemName: "stop.fill")
                            .resizable()
                            .frame(width: 20, height: 20)
                            .foregroundColor(Color.ashramGreen)
                    }
                }
            }
            .padding()
            .background(Color.ashramBeige)
            .cornerRadius(20)

            Spacer()

            // video play
            VStack {
                Text("Chant")
                    .font(.custom("Times New Roman", size: 24))
                    .foregroundStyle(Color.ashramGreen)
                    .padding(5)
                WebView(urlString: "https://www.youtube.com/embed/ny3TqP9BxzE")
                    .frame(height: isiPad ? 400 : 200)
                    .cornerRadius(10)
                    .padding()
                Text("Courtesy Sri Ramanasramam").font(.footnote).italic()
            }
        }
        .background(Color.ashramBeige)
    }

    // View helpers
    func playMeditation(named name: String) {
        if let url = Bundle.main.url(forResource: name, withExtension: nil) {
            do {
                player = try AVAudioPlayer(contentsOf: url)
                player?.play()
                isPlaying = true
            } catch {
                print("Error playing meditation")
            }
        }
    }

    func togglePlayPause() {
        if let player = player {
            if player.isPlaying {
                player.pause()
                isPlaying = false
            } else {
                player.play()
                isPlaying = true
            }
        }
    }

    func stopMeditation() {
        player?.stop()
        isPlaying = false
    }
}

#Preview {
    MeditationView()
}
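One thing worth checking (an assumption on my part, not something from the original post): without configuring an AVAudioSession, playback uses the default category and respects the silent switch, so users with the ringer muted hear nothing. A minimal sketch of configuring the session before playing:

import AVFoundation

func configurePlaybackSession() {
    do {
        // .playback ignores the silent switch, so muted ringers still hear the audio.
        try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default)
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print("Failed to configure audio session: \(error)")
    }
}

// Call configurePlaybackSession() before player?.play() in playMeditation(named:).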
1 reply · 0 boosts · 313 views · Nov ’24
Lightning headphone adapter modes
I'm developing an app that plays a WAV file through the Lightning headphone adapter. When I connect the adapter, a prompt appears asking whether to select "Headphones" or "Other Device". What does this setting actually do? I've noticed that it affects the maximum amplitude (volume) of the WAV output. Could you explain the precise difference between these two modes?
0 replies · 0 boosts · 219 views · Nov ’24
Failure of AudioUnitSetProperty when using MacCatalyst (works on macOS)
I was trying to set a custom audio output device for generated audio on Mac Catalyst. While using

let status = AudioUnitSetProperty(outputUnit,
                                  kAudioOutputUnitProperty_CurrentDevice,
                                  kAudioUnitScope_Global,
                                  0,
                                  &outputDeviceID,
                                  UInt32(MemoryLayout<AudioDeviceID>.size))

kAudioOutputUnitProperty_CurrentDevice is reported as invalid and status is -10879 (kAudioUnitErr_InvalidProperty).

STEPS TO REPRODUCE
Set Run Destination to macOS and run the program: "AudioUnitSetProperty: 0" is printed, indicating it works fine.
Set Run Destination to Mac Catalyst and run the program: "Error setting output device: -10879" is printed, indicating an error.
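In case others want to compare notes, this is the direction I'm experimenting with (an assumption on my part, not a confirmed fix): instantiating the HAL output unit explicitly instead of the default output unit before setting the current device. Whether kAudioUnitSubType_HALOutput is even usable from a Mac Catalyst target is exactly what I'm unsure about.

import AudioToolbox

// Create a HAL output audio unit (the macOS-style output unit).
func makeHALOutputUnit() -> AudioUnit? {
    var desc = AudioComponentDescription(
        componentType: kAudioUnitType_Output,
        componentSubType: kAudioUnitSubType_HALOutput,
        componentManufacturer: kAudioUnitManufacturer_Apple,
        componentFlags: 0,
        componentFlagsMask: 0
    )
    guard let component = AudioComponentFindNext(nil, &desc) else { return nil }
    var unit: AudioUnit?
    AudioComponentInstanceNew(component, &unit)
    return unit
}

// Point the output unit at a specific device.
func setOutputDevice(_ deviceID: AudioDeviceID, on unit: AudioUnit) -> OSStatus {
    var device = deviceID
    return AudioUnitSetProperty(
        unit,
        kAudioOutputUnitProperty_CurrentDevice,
        kAudioUnitScope_Global,
        0,
        &device,
        UInt32(MemoryLayout<AudioDeviceID>.size)
    )
}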
4 replies · 1 boost · 615 views · Nov ’24
Any API for AirPods Pro 2
Hi, may I ask if there is any iOS API or similar way to switch between the transparency and ANC modes of AirPods Pro 2? I know there is one way to configure and activate a shortcut in the app, but that requires an inconvenient manual setup. May I ask for any other advice? Thanks in advance!
0 replies · 0 boosts · 227 views · Nov ’24
API to switch the mode of Airpods Pro 2
Hi, may I ask if there is any API or similar way inside an iOS app to set up or switch the transparency and ANC modes of the AirPods Pro 2? One option is to set up a shortcut and activate that shortcut from the app, but it requires manually creating the shortcut, which is not convenient. Thanks for any advice!
0 replies · 1 boost · 194 views · Nov ’24
Microphone access from control center
Unable to Access Microphone in Control Center Widget – Is It Possible?

Hello everyone, I'm attempting to create a widget in the Control Center that accesses the microphone, similar to how Shazam does it. However, I'm running into an issue where the widget always prints "Microphone permission denied." It's worth mentioning that microphone access works fine when I'm using the app itself. Here's the code I'm using in the widget:

func startRecording() async {
    logger.info("Starting recording...")
    print("Starting recording...")
    recognizedText = ""
    isFinishingRecognition = false

    // First, check speech recognition authorization
    let speechAuthStatus = await withCheckedContinuation { continuation in
        SFSpeechRecognizer.requestAuthorization { status in
            continuation.resume(returning: status)
        }
    }
    guard speechAuthStatus == .authorized else {
        logger.error("Speech recognition not authorized")
        return
    }

    // Then, request microphone permission using our manager
    let micPermission = await AudioSessionManager.shared.requestMicrophonePermission()
    guard micPermission else {
        logger.error("Microphone permission denied")
        print("Microphone permission denied")
        return
    }

    // Continue with recording...
}

Issues: the code consistently prints "Microphone permission denied" when run from the widget, while microphone access works without issues when the same code is executed from within the app.

Questions: Is it possible for a Control Center widget to access the microphone? If yes, what might be causing the "Microphone permission denied" error in the widget? Are there additional permissions or configurations required to enable microphone access in a widget?

Any insights or suggestions would be greatly appreciated! Thank you.
0 replies · 0 boosts · 463 views · Nov ’24