Dear Sirs,
I've written an audio driver based on IOUserAudioDevice. In my IOOperationHandler I can receive and send audio samples as expected. Is there any way to configure the number of samples transferred in each call? Currently it seems to be around 512 samples per call, which corresponds to about 10.7 milliseconds at a 48 kHz sample rate. I'd like to achieve something like 48 or 96 samples per call. I did some experiments and tried calls to SetOutputLatency() and similar, but so far I haven't found the right way to change the in_io_buffer_frame_size seen in the callback. I'd like to do this because smaller buffer sizes would allow lower latencies for the subsequent audio processing.
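For illustration, here is the kind of client-side call that might control this (a sketch only — kAudioDevicePropertyBufferFrameSize is a HAL property requested by the host app, and whether it is the right knob for an AudioDriverKit device is exactly what I'm unsure about; looking up the device's AudioObjectID is omitted):

```
import CoreAudio

// Client-side sketch: ask the HAL for a smaller IO buffer (e.g. 96 frames)
// on a given device. My understanding is that the value arriving as
// in_io_buffer_frame_size in the driver's IOOperationHandler follows
// whatever the attached client negotiates here, not a driver-side setting.
func setBufferFrameSize(_ frames: UInt32, on deviceID: AudioObjectID) -> OSStatus {
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioDevicePropertyBufferFrameSize,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain
    )
    var value = frames
    return AudioObjectSetPropertyData(
        deviceID, &address, 0, nil,
        UInt32(MemoryLayout<UInt32>.size), &value
    )
}
```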
Thanks and best regards,
Johannes
We are a tracking company with our own tracking platform, looking for a solution that uses tag devices for animals, similar to AirTags, but running on our platform.
Is there a way for our platform to interface with the Find My network to get the location data of our tags?
I am implementing Siri/Shortcuts support for a radio app on iOS. I have implemented an AppIntent that sends a notification to the app, and the app should then start playing the stream in AVPlayer.
The AppIntent sometimes works and sometimes doesn't. So far I couldn't find a pattern for when or why. Sometimes it works even if the app is killed or in the background; sometimes it fails in exactly those same states.
I have been observing logs in Console, and apparently it sometimes stalls when AVPlayer tries to work out its buffering (I then get AVPlayerWaitingToMinimizeStallsReason in the console, and the AVPlayerItem status stays .unknown). Other times it figures it out quickly (for the same stream) and starts playing.
Sometimes when the app is killed, the AppIntent call launches it in the background (at least I see it as a process in Console), it receives the notification from the AppIntent, and it starts playing. Other times the app is not launched at all, its process is not visible in Console, so it never receives the notification and never plays.
I have set up the session correctly (category .playback without any options, then activated), I set the AVPlayerItem's preferredForwardBufferDuration to 0 (the default), and AVPlayer's automaticallyWaitsToMinimizeStalling to true.
Background processing, Audio, AirPlay, Picture in Picture, and Siri are added in the Signing & Capabilities section of the app project settings.
Here are the code examples:
Play AppIntent (the Stop AppIntent is constructed the same way):
@available(iOS 16, *)
struct PlayStationIntent: AudioPlaybackIntent {
    static let title: LocalizedStringResource = "Start playing"
    static let description = IntentDescription("Plays currently selected radio")

    @MainActor
    func perform() async throws -> some IntentResult {
        NotificationCenter.default.post(name: IntentsNotifications.siriPlayCurrentStationNotificationName, object: nil)
        return .result()
    }
}
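One variant I'm considering (a sketch only — Player.shared is hypothetical, my app currently has no singleton): driving the player directly from perform() instead of posting a notification, since an in-process notification can only reach the SwiftUI scene if the scene has actually been built, which may not be the case when the intent launches the app in the background.

```
import AppIntents

@available(iOS 16, *)
struct PlayStationIntentDirect: AudioPlaybackIntent {
    static let title: LocalizedStringResource = "Start playing"
    static let description = IntentDescription("Plays currently selected radio")

    @MainActor
    func perform() async throws -> some IntentResult {
        // Hypothetical shared instance; adapt to however the app owns Player.
        Player.shared.play()
        return .result()
    }
}
```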
AppShortcutsProvider:
struct RadioTestShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: PlayStationIntent(),
            phrases: [
                "Start station in \(.applicationName)",
            ],
            shortTitle: LocalizedStringResource("Play station"),
            systemImageName: "radio"
        )
    }
}
Player object:
class Player: ObservableObject {
    private let session = AVAudioSession.sharedInstance()
    private let streamURL = URL(string: "http://radio.rockserwis.fm/live")!
    private var player: AVPlayer?
    private var item: AVPlayerItem?
    var cancellables = Set<AnyCancellable>()

    typealias UInfo = [AnyHashable: Any]

    @Published var status: Player.Status = .stopped
    @Published var isPlaying = false

    func setupSession() {
        do {
            try session.setCategory(.playback)
        } catch {
            print("*** Error setting up category audio session: \(error), \(error.localizedDescription)")
        }
        do {
            try session.setActive(true)
        } catch {
            print("*** Error setting audio session active: \(error), \(error.localizedDescription)")
        }
    }

    func setupPlayer() {
        item = AVPlayerItem(url: streamURL)
        item?.preferredForwardBufferDuration = TimeInterval(0)
        player = AVPlayer(playerItem: item)
        player?.automaticallyWaitsToMinimizeStalling = true
        player?.allowsExternalPlayback = false
        let metadataOutput = AVPlayerItemMetadataOutput(identifiers: nil) // note: created but never attached to the item
    }

    func play() {
        setupPlayer()
        setupSession()
        handleInterruption()
        player?.play()
        isPlaying = true
        player?.currentItem?.publisher(for: \.status)
            .receive(on: DispatchQueue.main)
            .sink(receiveValue: { status in
                self.handle(status: status)
            })
            .store(in: &self.cancellables)
    }

    func stop() {
        player?.pause()
        player = nil
        isPlaying = false
        status = .stopped
    }

    func handle(status: AVPlayerItem.Status) {
        ...
    }

    func handleInterruption() {
        ...
    }

    func handle(interruptionType: AVAudioSession.InterruptionType?, userInfo: UInfo?) {
        ...
    }
}

extension Player {
    enum Status {
        case waiting, ready, failed, stopped
    }
}

extension Player {
    func setupRemoteTransportControls() {
        ...
    }
}
Content view:
struct ContentView: View {
    @EnvironmentObject var player: Player

    var body: some View {
        VStack(spacing: 20) {
            Text("AppIntents Radio Test App")
                .font(.title)
            Button {
                if player.isPlaying {
                    player.stop()
                } else {
                    player.play()
                }
            } label: {
                Image(systemName: player.isPlaying ? "pause.circle" : "play.circle")
                    .font(.system(size: 80))
            }
        }
        .padding()
    }
}

#Preview {
    ContentView()
}
Main struct:
import SwiftUI

@main
struct RadioTestApp: App {
    let player = Player()
    let siriPlayCurrentPub = NotificationCenter.default.publisher(for: IntentsNotifications.siriPlayCurrentStationNotificationName)
    let siriStop = NotificationCenter.default.publisher(for: IntentsNotifications.siriStopRadioNotificationName)

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environmentObject(player)
                .onReceive(siriPlayCurrentPub, perform: { _ in
                    player.play()
                })
                .onReceive(siriStop, perform: { _ in
                    player.stop()
                })
        }
    }
}
MPMusicPlayerController's nowPlayingItem no longer seems to be able to change a song. The code used to work but seems to be broken on iOS 16, 17, and now the iOS 18 beta.
When newSong() is triggered, the current song restarts but the player does not change songs. Instead I get the following error: Failed to set now playing item error=<MPMusicPlayerControllerErrorDomain.5 "Unable to play item <MPConcreteMediaItem: 0x9e9f0ef70> 206357861099970620" {}>.
The documentation seems to indicate I'm doing things correctly.
class MusicPlayer {
    var songTwo: MPMediaItem?
    let player = MPMusicPlayerController.applicationMusicPlayer

    func start() async {
        _ = await MPMediaLibrary.requestAuthorization()
        let myPlaylistsQuery = MPMediaQuery.playlists()
        let playlists = myPlaylistsQuery.collections!.filter { $0.items.count > 2 }
        let playlist = playlists.first!
        let songOne = playlist.items.first!
        songTwo = playlist.items[1]
        player.setQueue(with: playlist)
        play(songOne)
    }

    func newSong() {
        guard let songTwo else { return }
        play(songTwo)
    }

    private func play(_ song: MPMediaItem) {
        player.stop()
        player.nowPlayingItem = song
        player.prepareToPlay()
        player.play()
    }
}
I have a user who is reporting an error and has been kind enough to share screen recordings to help diagnose. I am not experiencing this error, nor am I able to replicate it on other devices I've tried, so I'm stuck trying to fix it. His device and the others tested were all running iOS 17.5.1. Any details on the cause of this error, or potential workarounds I could use to resolve it, would be greatly appreciated.
try await ApplicationMusicPlayer.shared.play()
throws:
The operation couldn't be completed (MPMusicPlayerControllerErrorDomain error 6.)
MusicAuthorization.currentStatus is .authorized
ApplicationMusicPlayer.shared.isPreparedToPlay is false
ApplicationMusicPlayer.shared.queue.currentEntry is nil (I've noticed this even when playback succeeds)
The queue was loaded using ApplicationMusicPlayer.shared.queue = [album], but I also tried ApplicationMusicPlayer.shared.queue = ApplicationMusicPlayer.Queue(album:startingAt:) and it made no difference. The album's playParameters are correct. He experiences the error when attempting to play any album.
Any and all help is truly appreciated. The Feedback Assistant report I filed has gone unanswered.
I have an HDR10+ encoded video that plays back on the Apple Vision Pro when loaded as a .mov, but when that video is encoded using the latest (1.23b) Apple HLS tools to generate an fMP4, the resulting m3u8 cannot be played back on the Apple Vision Pro; I only get a "Cannot Open" error.
To generate the m3u8, I'm just calling mediafilesegmenter (with -iso-fragmented) and then variantplaylistcreator. This completes with no errors, and the m3u8 plays back on the Mac using VLC, but not on the Apple Vision Pro.
The relevant part of the m3u8 is:
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=40022507,BANDWIDTH=48883974,VIDEO-RANGE=PQ,CODECS="ec-3,hvc1.1.60000000.L180.B0",RESOLUTION=4096x4096,FRAME-RATE=24.000,CLOSED-CAPTIONS=NONE,AUDIO="audio1",REQ-VIDEO-LAYOUT="CH-STEREO"
{{url}}
Has anyone been able to use the HLS tools to generate fMP4s of MV-HEVC videos with HDR10?
It seems that there’s still no way to get all TIFF tags from a TIFF image, is that right? I've got these GeoTIFF images that have a handful of specialized TIFF tags in them. Calling CGImageSourceCopyPropertiesAtIndex(), I can see basic properties common to all TIFF images, like dimensions and color/pixel information, but no others.
Short of including libtiff, is there another way to get at the metadata? I've tried all of the options in CGImageSourceCopyAuxiliaryDataInfoAtIndex.
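For reference, this is roughly what I'm calling (a minimal sketch); the returned TIFF dictionary contains only the common keys, and the GeoTIFF-specific tags (e.g. 34735, the GeoKeyDirectory) are missing:

```
import Foundation
import ImageIO

// Sketch: read the TIFF dictionary from a GeoTIFF. Only the common
// TIFF keys come back; the GeoTIFF-specific tags are absent.
func tiffProperties(at url: URL) -> [CFString: Any]? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any]
    else { return nil }
    return props[kCGImagePropertyTIFFDictionary] as? [CFString: Any]
}
```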
I've filed a few bug reports about this since 2020, all ignored.
I'm trying to download M3U8 media on watchOS with this code:
let configuration = URLSessionConfiguration.background(withIdentifier: "com.id")
let session = AVAssetDownloadURLSession(
    configuration: configuration,
    assetDownloadDelegate: M3U8DownloadDelegate.shared,
    delegateQueue: .main
)
let asset = AVURLAsset(url: URL(string: mediaLink)!)
let downloadTask = session.makeAssetDownloadTask(downloadConfiguration: .init(asset: asset, title: ""))
downloadTask.resume()

m3u8DownloadObservation = downloadTask.progress.observe(\.fractionCompleted) { progress, _ in
    print(progress)
}
But downloadTask.progress is always zero, and the observation is never called.
How do I get the progress correctly?
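One thing worth trying (a sketch, not a confirmed fix): derive progress from the AVAssetDownloadDelegate callbacks instead of KVO on downloadTask.progress, along these lines:

```
import AVFoundation

final class M3U8DownloadDelegate: NSObject, AVAssetDownloadDelegate {
    static let shared = M3U8DownloadDelegate()

    // Progress reported through the delegate rather than KVO on .progress.
    func urlSession(_ session: URLSession,
                    assetDownloadTask: AVAssetDownloadTask,
                    didLoad timeRange: CMTimeRange,
                    totalTimeRangesLoaded loadedTimeRanges: [NSValue],
                    timeRangeExpectedToLoad: CMTimeRange) {
        let loaded = loadedTimeRanges.reduce(0.0) { total, value in
            total + value.timeRangeValue.duration.seconds
        }
        let expected = timeRangeExpectedToLoad.duration.seconds
        print("Progress:", expected > 0 ? loaded / expected : 0)
    }

    func urlSession(_ session: URLSession,
                    assetDownloadTask: AVAssetDownloadTask,
                    didFinishDownloadingTo location: URL) {
        print("Downloaded to", location)
    }
}
```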
Some iOS apps (depending on their signature or bundle ID) receive the AVAudioSessionInterruptionTypeBegan callback when the headphones are disconnected, but never receive the corresponding AVAudioSessionInterruptionTypeEnded callback. Not all bundle IDs exhibit this issue.
Why do different bundle IDs behave differently here, and what settings tied to a bundle ID affect the delivery of AVAudioSessionInterruptionType notifications?
Hi, I'm developing a simple app to visualize embedded VR180 3D video.
I used a hemisphere and projected the video as its material. The hemisphere sits in the scene at a fixed y value of 1.35, which is good for a seated person but not ideal for a standing person, because the stereoscopic vision is no longer correct. In the Apple TV+ and Kandao applications, I noticed that the position of the video is anchored to the Apple Vision Pro. I tried using an AnchorEntity on the head with trackingMode .once, but then there is a rotation problem: the hemisphere starts out with the rotation of the head.
Is there a solution, for example, to anchor the hemisphere only to the translation and not to the rotation of the head?
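One possible approach (a sketch under assumptions — it presumes an ImmersiveSpace with ARKit world tracking available, and the name placeHemisphere is mine): query the device anchor once and apply only its translation to the entity, discarding the rotation.

```
import ARKit
import QuartzCore
import RealityKit

// Sketch: place the hemisphere at the head's translation, ignoring its
// rotation, sampled once when the immersive scene opens. In practice the
// provider may need a moment before queryDeviceAnchor returns non-nil.
func placeHemisphere(_ hemisphere: Entity) async throws {
    let session = ARKitSession()
    let worldTracking = WorldTrackingProvider()
    try await session.run([worldTracking])

    if let device = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) {
        let transform = Transform(matrix: device.originFromAnchorTransform)
        hemisphere.position = transform.translation // translation only, no rotation
    }
}
```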
Hi!
I am getting AVErrorMediaServicesWereReset (-11819) thrown as an error by AVMutableCompositionTrack.insertTimeRange(_:of:at:) when trying to insert part of an AVAssetTrack into my video track (an AVMutableCompositionTrack) in my AVMutableComposition. This does not happen every time I build an AVMutableComposition, but it happens frequently.
I am also getting this error thrown when trying to export using an AVAssetExportSession.
Is there any insight into what can cause this error in these scenarios?
Thanks!
Hi there community,
First and foremost, a big thank you to everyone who takes the time to read this.
TL;DR: How, if even possible, can I record multiple audio streams simultaneously on an iOS application (iPad/iPhone)?
I'm working on a recorder for the iPad to gather data for a machine learning project focused on speech recognition. Our goal is to capture extensive speech data, which requires recording from multiple microphones. Specifically, I need to record from all mics connected to our Scarlett 4i4 audio interface and, most importantly, also record from the built-in mic on the iPad or iPhone at the same time.
As a newcomer to Swift development, I initially explored AVAudioRecorder. However, I quickly realized that it only supports one active audio input at a time, making multi-channel recording impossible (perhaps you can prove me wrong — that would make my day). Next, I transitioned to AVAudioEngine, but encountered the same limitation: I couldn't get input nodes for both the built-in mic and the Scarlett interface channels simultaneously. The application started behaving oddly, often recording identical audio data across all files.
Determined to find a solution, I delved deeper into the Core Audio framework, specifically using Audio Toolbox. My approach involved creating and configuring multiple Audio Units, each corresponding to a different audio input device. Here's a brief overview of my current implementation:
1. Listing available input devices: I used AVAudioSession to enumerate all available input devices.
2. Creating Audio Units: for each device, I created an Audio Unit and attempted to configure it for recording.
3. Setting up callbacks: I set up input and output callbacks to handle the audio processing.
Despite my efforts over the last few days, I haven't had much success. The callbacks for the Audio Units don't seem to be invoked correctly, and I'm struggling to achieve simultaneous multi-channel recording. Below is a snippet of my latest attempt:
let audioUnitCallback: AURenderCallback = { (
    inRefCon: UnsafeMutableRawPointer,
    ioActionFlags: UnsafeMutablePointer<AudioUnitRenderActionFlags>,
    inTimeStamp: UnsafePointer<AudioTimeStamp>,
    inBusNumber: UInt32,
    inNumberFrames: UInt32,
    ioData: UnsafeMutablePointer<AudioBufferList>?
) -> OSStatus in
    guard let ioData = ioData else {
        return noErr
    }
    print("Input callback invoked")

    let audioUnit = inRefCon.assumingMemoryBound(to: AudioUnit.self).pointee
    var bufferList = AudioBufferList(
        mNumberBuffers: 1,
        mBuffers: AudioBuffer(
            mNumberChannels: 1,
            mDataByteSize: 0,
            mData: nil
        )
    )

    let status = AudioUnitRender(audioUnit, ioActionFlags, inTimeStamp, inBusNumber, inNumberFrames, &bufferList)
    if status != noErr {
        print("AudioUnitRender failed: \(status)")
        return status
    }

    // Copy rendered data to output buffer
    let buffer = UnsafeMutableAudioBufferListPointer(ioData)[0]
    buffer.mData?.copyMemory(from: bufferList.mBuffers.mData!, byteCount: Int(bufferList.mBuffers.mDataByteSize))
    buffer.mDataByteSize = bufferList.mBuffers.mDataByteSize
    print("Rendered audio data")
    return noErr
}

let outputCallback: AURenderCallback = { (
    inRefCon: UnsafeMutableRawPointer,
    ioActionFlags: UnsafeMutablePointer<AudioUnitRenderActionFlags>,
    inTimeStamp: UnsafePointer<AudioTimeStamp>,
    inBusNumber: UInt32,
    inNumberFrames: UInt32,
    ioData: UnsafeMutablePointer<AudioBufferList>?
) -> OSStatus in
    guard let ioData = ioData else {
        return noErr
    }
    print("Output callback invoked")
    // Process the output data if needed
    return noErr
}
In essence, I'm stuck and in need of guidance. Has anyone here successfully implemented multi-channel recording on iOS, especially involving both built-in microphones and external audio interfaces? Any shared experiences, insights, or suggestions on how to proceed would be immensely appreciated.
Thank you once again for your time and assistance!
I'm having issues with volume after installing iOS 18 beta 4. The volume toggle in Control Center is turned up to full volume but is disabled; it only becomes enabled after playing music. As soon as I pause the music it gets disabled again and my iPhone goes mute. I'm also having an issue when playing music on AirPods: as soon as I turn my screen on, the music pauses. It happens every time I do this.
I tried to play music on my iPhone and it keeps skipping over all of the songs and not playing any music.
I'm building an app that will allow users to record voice notes. The functionality of all that is working great; I'm now trying to implement changes to the audio session to manage possible audio streams from other apps. I want it so that if audio is playing from a different app and the user opens my app, that audio keeps playing. When we start recording, any third-party audio should stop, and it can then resume again when we stop recording.
This is my main audio setup code:
private var audioEngine: AVAudioEngine!
private var inputNode: AVAudioInputNode!

func setupAudioEngine() {
    audioEngine = AVAudioEngine()
    inputNode = audioEngine.inputNode
    audioPlayerNode = AVAudioPlayerNode()
    audioEngine.attach(audioPlayerNode)
    let format = AVAudioFormat(standardFormatWithSampleRate: AUDIO_SESSION_SAMPLE_RATE, channels: 1)
    audioEngine.connect(audioPlayerNode, to: audioEngine.mainMixerNode, format: format)
}

private func setupAudioSession() {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker, .allowBluetooth])
        try audioSession.setPreferredSampleRate(AUDIO_SESSION_SAMPLE_RATE)
        try audioSession.setPreferredIOBufferDuration(0.005) // 5 ms buffer for lower latency
        try audioSession.setActive(true)
        // Add observers
        setupInterruptionObserver()
    } catch {
        audioErrorMessage = "Failed to set up audio session: \(error)"
    }
}
This is all called on app startup so we're ready to record whenever the user presses the record button.
However, currently when this runs, any outside audio stops playing.
I isolated the issue to this line: inputNode = audioEngine.inputNode.
When that's commented out, the outside audio keeps playing, but I obviously need the input node for recording.
Is this a bug, or expected behavior?
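For what it's worth, the pattern I'm now experimenting with (a sketch, not a confirmed fix; enterIdleState and beginRecording are my own names): keep the session in a passive, mixable category while idle, and only claim the microphone when recording actually starts, since merely touching audioEngine.inputNode seems to activate the input route.

```
import AVFoundation

// Sketch: passive, mixable session while idle...
func enterIdleState() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, options: [.mixWithOthers])
    try session.setActive(true)
}

// ...and only claim the mic (and touch inputNode) at record time,
// which is the point where other apps' audio is interrupted.
func beginRecording() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default,
                            options: [.defaultToSpeaker, .allowBluetooth])
    try session.setActive(true)
    // build the engine and access audioEngine.inputNode here, then record
}
```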
Hello!
I am working on an app connected to an external streamer.
I would like to display the currently playing song on the Lock Screen.
I tried updating the information in MPNowPlayingInfoCenter, but I need to play a sound on my iPhone for the controls to be displayed.
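For reference, roughly what I'm setting (a sketch; the metadata values are placeholders — in my app they come from the external streamer):

```
import MediaPlayer

// Sketch: publish Lock Screen metadata and register remote commands.
func updateNowPlaying() {
    let info: [String: Any] = [
        MPMediaItemPropertyTitle: "Song title",   // placeholder
        MPMediaItemPropertyArtist: "Artist",      // placeholder
        MPNowPlayingInfoPropertyPlaybackRate: 1.0,
    ]
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info

    // Remote commands must be registered for the controls to appear.
    let commands = MPRemoteCommandCenter.shared()
    _ = commands.playCommand.addTarget { _ in .success }
    _ = commands.pauseCommand.addTarget { _ in .success }
}
```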
Is there a way to do it without playing a sound?
If not, is playing a silent sound the only solution, and would that be validated by Apple? :-/
Thank you,
Frederic
Hello pals,
I investigated a strange bug with a video URL and found out that on iOS 18 the method PHCachingImageManager().requestAVAsset(forVideo:) returns a very weird asset.url with a strange suffix:
"someFileName.MOV#YnBsaXN0MDDRAQJfEBtSZWNvbW1lbmRlZEZvckltbWVyc2l2ZU1vZGUQAAgLKQAAAAAAAAEBAAAAAAAAAAMAAAAAAAAAAAAAAAAAAAAr"
Example:
PHCachingImageManager().requestAVAsset(forVideo: asset, options: options) { asset, _, _ in
    if let asset = asset as? AVURLAsset {
        print(asset.url)
        // prints - file:///.../data/Media/DCIM/100APPLE/IMG_0011.MOV#YnBsaXN0MDDRAQJfEBtSZWNvbW1lbmRlZEZvckltbWVyc2l2ZU1vZGUQAAgLKQAAAAAAAAEBAAAAAAAAAAMAAAAAAAAAAAAAAAAAAAAr
    }
}
On iOS below 18 it returns a regular URL like "...someFile.MOV".
How can I work around this for iOS 18 users?
Please suggest something, or maybe I'm using this method incorrectly?
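One workaround sketch (assuming downstream code only needs the file URL): strip the URL fragment, which appears to be a base64-encoded binary plist (it decodes to something mentioning "RecommendedForImmersiveMode") that the system appends on iOS 18.

```
import Foundation

// Sketch: drop the "#YnBsaXN0..." suffix by clearing the URL fragment.
func normalizedURL(_ url: URL) -> URL {
    var components = URLComponents(url: url, resolvingAgainstBaseURL: false)
    components?.fragment = nil
    return components?.url ?? url
}
```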
I would really like to know if it is possible to get the state of the ringer switch on iOS. In my application I want to replicate Instagram's audio behavior while watching a video:
the video plays with sound; when the ringer switches to mute, the video's audio should be muted too; and when the user presses the volume up or down button, the audio should be unmuted.
I found how to catch the volume up/down buttons with AVAudioSession.sharedInstance().observe(\.outputVolume), but I couldn't find anything that would help me with the ringer state. AVAudioSession.Category can't achieve this effect.
There is also a way to check the ringer state with the Darwin notify library, like this:
var token = NOTIFY_TOKEN_INVALID
notify_register_dispatch(
    "com.apple.springboard.ringerstate",
    &token,
    .main
) { token in
    var state: UInt64 = 0
    notify_get_state(token, &state)
    print("Changed to", state == 1 ? "ON" : "OFF")
}
but I'm not sure this won't lead to the application being rejected; I don't know whether it counts as private API usage.
I would be glad for any advice and suggestions. Thanks!
Is it possible to put a video on loop and have it autoplay on visionOS? We used AVPlayerViewController.
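A sketch of the standard looping approach (assuming videoURL points at your asset; note the AVPlayerLooper must be kept alive for looping to continue):

```
import AVKit

// Sketch: loop with AVPlayerLooper + AVQueuePlayer and autoplay.
let item = AVPlayerItem(url: videoURL) // videoURL: your video asset
let queuePlayer = AVQueuePlayer()
let looper = AVPlayerLooper(player: queuePlayer, templateItem: item) // retain this

let controller = AVPlayerViewController()
controller.player = queuePlayer
queuePlayer.play() // autoplay once presented
```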
Hi,
I'm a photographer, and recently, after upgrading to the iOS 18 beta, I noticed that when I AirDrop my photos (which I took with medium-format cameras at extremely high quality) from my MacBook Pro to my iPhone, they lose a lot of quality: the sharpness is completely gone and there is much less detail. Does anyone know how to solve this?