Hi, I have tried using an RPBroadcastSampleHandler broadcast extension together with RPSystemBroadcastPickerView, but I don't understand how I can mirror my iOS screen to an Android smart TV.
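For context, ReplayKit only delivers the screen content to your broadcast extension as sample buffers; mirroring to an Android TV additionally requires a transport the TV understands (for example a casting SDK or your own streaming server), which ReplayKit does not provide. Here is a minimal sketch of wiring up the picker; the extension bundle ID is a placeholder:
import ReplayKit
import UIKit

// Minimal sketch: present the system broadcast picker.
// "com.example.MyBroadcastExtension" is a hypothetical bundle ID; the
// extension's RPBroadcastSampleHandler then receives the screen's sample
// buffers, which you must forward to the TV yourself.
func addBroadcastPicker(to view: UIView) {
    let picker = RPSystemBroadcastPickerView(frame: CGRect(x: 0, y: 0, width: 60, height: 60))
    picker.preferredExtension = "com.example.MyBroadcastExtension"
    picker.showsMicrophoneButton = false
    view.addSubview(picker)
}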
Since iOS 12 it has become difficult to detect the end of playback using the system music player.
In earlier iOS versions, the now playing item would be set to nil and you would receive a notification that the player stopped.
In iOS 12 and later, nowPlayingItem still contains the current song and the only notification you get is MPMusicPlayerControllerPlaybackStateDidChangeNotification with the playbackState set to MPMusicPlaybackStatePaused.
Pressing pause in my car (or from any other remote control) produces the same conditions, making it difficult to correctly detect the difference.
It would be nice if they added a notification that playback was done (similar to the other players).
Any suggestions?
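The best heuristic I have come up with so far is a sketch like the following. It assumes a track that finished pauses at (or within a second of) its full duration, while a user-initiated pause usually does not; the reported position can reset after the item ends, so this may need tuning:
import MediaPlayer

let player = MPMusicPlayerController.systemMusicPlayer
player.beginGeneratingPlaybackNotifications()

// Heuristic sketch: on a pause, treat a playback position at (or within a
// second of) the item's duration as "playback finished".
NotificationCenter.default.addObserver(
    forName: .MPMusicPlayerControllerPlaybackStateDidChange,
    object: player,
    queue: .main
) { _ in
    guard player.playbackState == .paused,
          let duration = player.nowPlayingItem?.playbackDuration else { return }
    if player.currentPlaybackTime >= duration - 1.0 {
        print("Playback likely finished")
    }
}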
Hi,
I just generated an HDR10 MV-HEVC file; the mediainfo output is below:
Color range : Limited
Color primaries : BT.2020
Transfer characteristics : PQ
Matrix coefficients : BT.2020 non-constant
Codec configuration box : hvcC+lhvC
Then I generated the segment files with the command below:
mediafilesegmenter --iso-fragmented -t 4 -f av_1 av_new_1.mov
Then I uploaded the segment files and prog_index.m3u8 to a web server.
I found that Safari cannot play the HLS stream.
The URL is http://ip/vod/prog_index.m3u8
I also checked that if I remove the Transfer characteristics : PQ tag when generating the MV-HEVC file, run the same mediafilesegmenter command, and upload the files to the web server, the resulting HLS stream does play in Safari.
Is there any way to play HLS PQ video in Safari? Thanks.
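One thing I am unsure about (an assumption on my part, not a confirmed fix): the URL above points Safari directly at the media playlist, so there is no multivariant playlist declaring the variant's dynamic range via the VIDEO-RANGE attribute. A sketch of such a top-level playlist, where the BANDWIDTH and CODECS values are placeholders to be replaced with the stream's real values:
#EXTM3U
#EXT-X-VERSION:7
#EXT-X-STREAM-INF:BANDWIDTH=8000000,CODECS="hvc1.2.4.L120.B0",VIDEO-RANGE=PQ
prog_index.m3u8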
I am using MusicKit's ApplicationMusicPlayer to play music in my app. Everything works fine as long as I'm not playing large playlists that contain hundreds of songs. When I try to play a collection of more than roughly 300 songs, I always get an error message saying:
"Prepare to play failed" UserInfo={NSDebugDescription=Prepare to play failed, NSUnderlyingError=0x121d42dc0 {Error Domain=MPMusicPlayerControllerErrorDomain Code=9 "Remote call timed out" UserInfo={NSDebugDescription=Remote call timed out}}}))
It doesn't matter whether the songs are downloaded to the device or not.
I am aware that there is another initializer for the player's queue that accepts Playlist instances, but in my app users can choose to sort playlist tracks in a different order than the default, which makes that initializer infeasible for me.
I tried everything I could think of; I even fell back on MPMusicPlayerController and passed an array of MPMusicPlayerPlayParameters to it, but the result was the same.
typealias QueueEntry = ApplicationMusicPlayer.Queue.Entry

let player = ApplicationMusicPlayer.shared
let entries: [QueueEntry] = tracks
    .compactMap {
        guard let song = $0 as? Song else { return nil }
        return QueueEntry(song)
    }

Task(priority: .high) { [player] in
    do {
        player.queue = .init(entries, startingAt: nil)
        try await player.play() // prepareToPlay failed
    } catch {
        print(error)
    }
}
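A workaround I am considering, sketched below under the assumption that queue.insert(_:position:) accepts a sequence of songs: start playback with a smaller slice of the collection and append the remainder to the queue's tail afterwards, so the initial prepare-to-play call stays small.
// Sketch: start with the first 100 songs, then append the rest to the
// queue's tail in chunks. The batch size of 100 is arbitrary.
let batchSize = 100
let songs: [Song] = tracks.compactMap { $0 as? Song }
let firstBatch = Array(songs.prefix(batchSize))
let remainder = Array(songs.dropFirst(batchSize))

Task {
    do {
        player.queue = .init(for: firstBatch, startingAt: nil)
        try await player.play()
        var index = 0
        while index < remainder.count {
            let chunk = Array(remainder[index ..< min(index + batchSize, remainder.count)])
            // insert(_:position:) taking a sequence is assumed here.
            try await player.queue.insert(chunk, position: .tail)
            index += batchSize
        }
    } catch {
        print(error)
    }
}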
In SwiftUI there is a built-in view for displaying album artwork, ArtworkImage, but there is no equivalent for UIKit.
My current approach is to use Artwork's url(width:height:) method to get the image's URL and then download the image or read it from disk, but the performance is much worse than it was previously with MPMediaItem's artwork.
let artworkQueue = DispatchQueue(
    label: "MusicKit-ArtworkQueue",
    qos: .default,
    attributes: .concurrent
)
let artworkSemaphore = DispatchSemaphore(value: 5)

extension Song {
    func artworkImage(for size: CGSize, completion: @escaping (UIImage?) -> Void) {
        artworkQueue.async {
            artworkSemaphore.wait()
            defer {
                artworkSemaphore.signal()
            }
            let imageURL = artwork?.url(
                width: Int(size.width),
                height: Int(size.height)
            )
            // I hate doing this as it might very well break in the future
            guard let imageURL, imageURL.scheme == "musicKit" else {
                return completion(nil)
            }
            guard let imageData = try? Data(contentsOf: imageURL),
                  let image = UIImage(data: imageData) else {
                return completion(nil)
            }
            completion(image)
        }
    }
}
I really dislike this approach because it feels hacky, but it somewhat works. You might ask what the semaphore is for: without it, I noticed MusicKit choking after reading too many artworks at once.
Can someone from Apple please provide us with an example of how to use MusicKit with UIKit properly?
Ideally (IMO) we would have a method defined on Song and the other MusicKit structures that returns the image for us, just like MPMediaItem's artwork does. It would make our lives so much easier.
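For what it's worth, here is a slightly cleaner sketch of the same idea using Swift concurrency instead of the queue-plus-semaphore; the musicKit:// scheme check and the use of Data(contentsOf:) are carried over from above and remain undocumented behavior:
extension Song {
    // Sketch: load the artwork image off the main actor via async/await.
    func artworkImage(fitting size: CGSize) async -> UIImage? {
        guard let url = artwork?.url(width: Int(size.width), height: Int(size.height)),
              url.scheme == "musicKit" else {
            return nil
        }
        return await Task.detached(priority: .utility) {
            guard let data = try? Data(contentsOf: url) else { return nil }
            return UIImage(data: data)
        }.value
    }
}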
Hi, is there a way to display animated cover art in an app? Not every album has animated cover art, but for the ones that do, I would like to display it instead of the static artwork.
Thank you :)
I often find that basic actions in MusicKit are incredibly slow compared to Apple's Music app. I've tried different versions, devices, and networks, as well as Apple's sample code, throughout the last several years, and it is always the same. Does anyone else have this issue?
Hello, we are embedding a PHPickerViewController with UIKit (adding the view controller as a child, embedding its view, and calling didMove(toParent:)) in our app, using compact mode. We are disabling the following capabilities: .collectionNavigation, .selectionActions, and .search.
One of our users using iOS 17.2.1 and iPhone 12 encountered a crash with the following stacktrace:
Crashed: com.apple.main-thread
0 libsystem_kernel.dylib 0x9fbc __pthread_kill + 8
1 libsystem_pthread.dylib 0x5680 pthread_kill + 268
2 libsystem_c.dylib 0x75b90 abort + 180
3 PhotoFoundation 0x33b0 -[PFAssertionPolicyCrashReport notifyAssertion:] + 66
4 PhotoFoundation 0x3198 -[PFAssertionPolicyComposite notifyAssertion:] + 160
5 PhotoFoundation 0x374c -[PFAssertionPolicyUnique notifyAssertion:] + 176
6 PhotoFoundation 0x2924 -[PFAssertionHandler handleFailureInFunction:file:lineNumber:description:arguments:] + 140
7 PhotoFoundation 0x3da4 _PFAssertFailHandler + 148
8 PhotosUI 0x22050 -[PHPickerViewController _handleRemoteViewControllerConnection:extension:extensionRequestIdentifier:error:completionHandler:] + 1356
9 PhotosUI 0x22b74 __66-[PHPickerViewController _setupExtension:error:completionHandler:]_block_invoke_3 + 52
10 libdispatch.dylib 0x26a8 _dispatch_call_block_and_release + 32
11 libdispatch.dylib 0x4300 _dispatch_client_callout + 20
12 libdispatch.dylib 0x12998 _dispatch_main_queue_drain + 984
13 libdispatch.dylib 0x125b0 _dispatch_main_queue_callback_4CF + 44
14 CoreFoundation 0x3701c __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 16
15 CoreFoundation 0x33d28 __CFRunLoopRun + 1996
16 CoreFoundation 0x33478 CFRunLoopRunSpecific + 608
17 GraphicsServices 0x34f8 GSEventRunModal + 164
18 UIKitCore 0x22c62c -[UIApplication _run] + 888
19 UIKitCore 0x22bc68 UIApplicationMain + 340
20 WorkAngel 0x8060 main + 20 (main.m:20)
21 ??? 0x1bd62adcc (Missing)
Please share if you have any ideas as to what might have caused this, or what to look at in such a case. Unfortunately, I haven't been able to reproduce it myself.
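For reference, a minimal sketch of the embedding described above (iOS 17 APIs; details such as the selection limit are guessed where the setup isn't specified):
import PhotosUI
import UIKit

final class PickerHostViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        var config = PHPickerConfiguration(photoLibrary: .shared())
        config.selectionLimit = 0
        // Compact mode with navigation/search/selection UI disabled,
        // as described in the post.
        config.mode = .compact
        config.disabledCapabilities = [.collectionNavigation, .selectionActions, .search]

        let picker = PHPickerViewController(configuration: config)
        addChild(picker)
        view.addSubview(picker.view)
        picker.view.frame = view.bounds
        picker.didMove(toParent: self)
    }
}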
Hey there, I'm trying to display all of the user's albums using the MediaPlayer framework. Many albums return nil for artwork, but I know the artwork exists because it shows up in the stock Music app. There doesn't seem to be much rhyme or reason to what shows up and what doesn't: all downloaded albums display artwork, and some cloud album artwork displays as well. Here's the code I'm using to debug this.
let query = MPMediaQuery.albums()
if let albumCollections = query.collections {
    albums = albumCollections
}

for album in albums {
    let artwork = album.representativeItem?.artwork
    print(artwork, artwork?.image(at: CGSize(width: 100, height: 100)))
}
Any help would be greatly appreciated. Thanks!
I'm using iCloud Music Library. I’m using macOS 14.1 (23B74) and iOS 17.1.
I'm using MusicKit to find songs that do not have artwork. On iOS, Song.artwork is nil for items I know do not have artwork. On macOS, Song.artwork is not nil; however, when those songs are shown in Music.app, they have no artwork. Is this expected? Alternatively, is there a more correct way to determine that a Song has no artwork?
I have also filed FB13315721.
Thank you for any tips!
I'm building a UIKit app that reads the user's Apple Music library and displays it. MusicKit has the Artwork structure, which I need to use to display artwork images in the app. Since I'm not using SwiftUI, I cannot use the ArtworkImage view, which is the recommended way of displaying those images, but the Artwork structure has a method that returns a URL for the image, which can be used to fetch it.
The way I have it set up is really simple:
extension MusicKit.Song {
    func imageURL(for cgSize: CGSize) -> URL? {
        return artwork?.url(
            width: Int(cgSize.width),
            height: Int(cgSize.height)
        )
    }

    func localImage(for cgSize: CGSize) -> UIImage? {
        guard let url = imageURL(for: cgSize),
              url.scheme == "musicKit",
              let data = try? Data(contentsOf: url) else {
            return nil
        }
        return .init(data: data)
    }
}
Now, every time I access the .artwork property (so, a lot of times), the main thread gets blocked and the console output gets bombarded with messages like these:
2023-07-26 11:49:47.317195+0200 Plum[998:297199] [Artwork] Failed to create color analysis for artwork: <MPMediaLibraryArtwork: 0x289591590> with error; Error Domain=NSCocoaErrorDomain Code=4099 "The connection to service named com.apple.mediaartworkd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction." UserInfo={NSDebugDescription=The connection to service named com.apple.mediaartworkd.xpc was invalidated: failed at lookup with error 159 - Sandbox restriction.}
2023-07-26 11:49:47.317262+0200 Plum[998:297199] [Artwork] Failed to create color analysis for artwork: file:///var/mobile/Media/iTunes_Control/iTunes/Artwork/Originals/4b/48d7b8d349d2de858413ae4561b6ba1b294dc7
2023-07-26 11:49:47.323099+0200 Plum[998:297013] [Plum] IIOImageWriteSession:121: cannot create: '/var/mobile/Media/iTunes_Control/iTunes/Artwork/Caches/320x320/4b/48d7b8d349d2de858413ae4561b6ba1b294dc7.sb-f9c7943d-6ciLNp'error = 1 (Operation not permitted)
My guess is that the most performance-heavy task here is performing the color analysis for each artwork, but if that's the case, IMO backgroundColor should not be a stored property. I am not planning to use it anywhere, and if it must exist, it should be a computed async property so it doesn't block the caller.
I know I can move the call to a background thread, and that fixes the main-thread blocking, but the loading times for each artwork are still terribly slow, and that hurts the UX.
SwiftUI's ArtworkImage loads the artworks much more quickly and without the errors, so there must be a better way to do it.
ApplicationMusicPlayer is available on the Mac! 🎉🎉🎉 Enormous thanks to @JoeKun and the team. I've already gotten my app up and running through Catalyst, and I've successfully played music! I also got some timeouts, but that was happening on my phone a lot that day too, so maybe my local CDN was just having a bad day.
I wanted to ask this question in a lab this week, but the timing didn't work out: Do you expect the experience to be the same using ApplicationMusicPlayer on a Catalyst vs a macOS target? I'm hoping to reuse much of my iPad app and go the Catalyst route, but I wanted to double check that the new support wasn't just for macOS.
I am calling AVSampleBufferDisplayLayer.flush from a background queue, but this seems to occasionally crash the app. I am calling it from the same queue that I pass to - (void)requestMediaDataWhenReadyOnQueue:(dispatch_queue_t)queue usingBlock:(void (^)(void))block;.
My question is: is this API thread-safe, or do I need to call flush from the main thread? Or is there another issue that I am not considering? It seems strange to me that this API would trigger an Auto Layout pass. (A workaround sketch follows the stack trace below.)
0 CoreFoundation 0x00000001bb384e38 __exceptionPreprocess + 164
1 libobjc.A.dylib 0x00000001b451b8d8 objc_exception_throw + 59
2 CoreAutoLayout 0x00000001d7e09e84 _AssertAutoLayoutOnAllowedThreadsOnly + 327
3 CoreAutoLayout 0x00000001d7e00e60 -[NSISEngine withBehaviors:performModifications:] + 35
4 UIKitCore 0x00000001be58fd40 -[UIView _postMovedFromSuperview:] + 671
5 UIKitCore 0x00000001bd56dfec -[UIView(Internal) _addSubview:positioned:relativeTo:] + 1903
6 UIKitCore 0x00000001bda57ccc -[_UITextLayoutCanvasView textViewportLayoutController:configureRenderingSurfaceForTextLayoutFragment:] + 455
7 UIFoundation 0x00000001c588bc9c __48-[NSTextViewportLayoutController layoutViewport]_block_invoke_4 + 151
8 UIFoundation 0x00000001c5836b50 __80-[NSTextLayoutManager enumerateViewportElementsFromLocation:options:usingBlock:]_block_invoke + 43
9 UIFoundation 0x00000001c580e158 __83-[NSTextLayoutManager enumerateTextLayoutFragmentsFromLocation:options:usingBlock:]_block_invoke_2 + 535
10 CoreFoundation 0x00000001bb385350 __NSARRAY_IS_CALLING_OUT_TO_A_BLOCK__ + 23
11 CoreFoundation 0x00000001bb3b24dc -[__NSSingleObjectArrayI enumerateObjectsWithOptions:usingBlock:] + 91
12 UIFoundation 0x00000001c580de28 __83-[NSTextLayoutManager enumerateTextLayoutFragmentsFromLocation:options:usingBlock:]_block_invoke + 775
13 UIFoundation 0x00000001c57f7504 -[NSTextLayoutManager enumerateTextLayoutFragmentsFromLocation:options:usingBlock:] + 659
14 UIFoundation 0x00000001c57f7264 -[NSTextLayoutManager enumerateViewportElementsFromLocation:options:usingBlock:] + 99
15 UIFoundation 0x00000001c57f6d7c -[NSTextViewportLayoutController layoutViewport] + 1299
16 UIKitCore 0x00000001bd580a3c +[UIView(Animation) performWithoutAnimation:] + 75
17 UIKitCore 0x00000001bd5582d0 -[_UITextLayoutCanvasView layoutSubviews] + 139
18 UIKitCore 0x00000001bd5544c8 -[UIView(CALayerDelegate) layoutSublayersOfLayer:] + 1979
19 QuartzCore 0x00000001bca277fc CA::Layer::layout_if_needed(CA::Transaction*) + 499
20 QuartzCore 0x00000001bca3aeb0 CA::Layer::layout_and_display_if_needed(CA::Transaction*) + 147
21 QuartzCore 0x00000001bca4c234 CA::Context::commit_transaction(CA::Transaction*, double, double*) + 443
22 QuartzCore 0x00000001bca81630 CA::Transaction::commit() + 651
23 MediaToolbox 0x00000001ca8d0da0 videoQueueRemote_SetProperty + 367
24 AVFCore 0x00000001cad191b4 __63-[AVSampleBufferVideoRenderer _setContentLayerOnFigVideoQueue:]_block_invoke + 179
25 libdispatch.dylib 0x00000001c299cf88 _dispatch_client_callout + 19
26 libdispatch.dylib 0x00000001c29ac574 _dispatch_lane_barrier_sync_invoke_and_complete + 55
27 AVFCore 0x00000001cad190d0 -[AVSampleBufferVideoRenderer _setContentLayerOnFigVideoQueue:] + 167
28 AVFCore 0x00000001cad14674 -[AVSampleBufferVideoRenderer _createVideoQueue:errorStep:] + 195
29 AVFCore 0x00000001cad14ac8 -[AVSampleBufferVideoRenderer createVideoQueue:] + 55
30 AVFCore 0x00000001cad179cc -[AVSampleBufferVideoRenderer flushWithRemovalOfDisplayedImage:completionHandler:] + 439
31 App 0x0000000102370214 -[AVSampleBufferDisplayLayer flush] + 51
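For now I am considering this defensive workaround (a sketch, not documented guidance): hop to the main queue before flushing, since the stack above shows the flush recreating the video queue and committing a CoreAnimation transaction that runs Auto Layout.
// Workaround sketch: perform the flush on the main queue, because the
// crash above shows the flush triggering a layout pass.
DispatchQueue.main.async {
    displayLayer.flush()
}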
I have a custom app running on a Mac Studio with Ventura that grabs a snapshot image from a network camera. It then adds some extra information into the EXIF MakerNote field. However, the metadata cannot be read back out of the image when running Ventura; it can, however, be read out of the same image file on a Mac that is not running Ventura.
It would appear Apple has removed support for reading MakerNote in Ventura but still supports writing it.
This code is about 7 years old and written in ObjC and has worked with no issue until Ventura came along.
Calls used
CGImageDestinationAddImageFromSource(); // used to write the image to disk with the extra metadata - Works on Ventura
CGImageSourceCopyPropertiesAtIndex(); // used to read the metadata from an image - does not return MakerNote data
Was a new way to read EXIF MakerNote data from image files introduced with Ventura?
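For reference, here is the read path as a Swift sketch, assuming the MakerNote is looked up from the image's EXIF dictionary via kCGImagePropertyExifMakerNote; this is the same CGImageSourceCopyPropertiesAtIndex route that stopped returning the data on Ventura:
import Foundation
import ImageIO

// Sketch: read the EXIF MakerNote from an image file.
func makerNote(at url: URL) -> Any? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let exif = properties[kCGImagePropertyExifDictionary] as? [CFString: Any] else {
        return nil
    }
    return exif[kCGImagePropertyExifMakerNote]
}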
The two ScreenCaptureKit WWDC22 sessions show how to capture with the new framework, but the retina factor is hardcoded to 2 in SCStreamConfiguration.
When capturing a non-retina display, the captured content floats in the upper-left corner of the image buffer.
There does not seem to be a simple way to retrieve the retina factor from the SCShareableContent data (when configuring the capture).
When processing the streaming output, the SCStreamFrameInfo attachment is supposed to have a scaleFactor property, but .scaleFactor does not return a value.
I have found out that the attachment dictionary contains SCStreamUpdateFrameDisplayResolution. This entry gives me the retina factor, but it is not an official SCStreamFrameInfo key; I have to list the dictionary's keys to access it.
What is the proper way to handle retina scale factors with ScreenCaptureKit?
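The best approach I have found so far is a sketch like the following, which matches the SCDisplay to an NSScreen to read its backingScaleFactor; this is an assumption on my part, not an official recommendation:
import AppKit
import ScreenCaptureKit

// Sketch: find the NSScreen matching the SCDisplay to read its
// backingScaleFactor, then size the stream configuration in pixels.
func makeConfiguration(for display: SCDisplay) -> SCStreamConfiguration {
    let screen = NSScreen.screens.first { screen in
        let key = NSDeviceDescriptionKey("NSScreenNumber")
        return (screen.deviceDescription[key] as? CGDirectDisplayID) == display.displayID
    }
    let scale = screen?.backingScaleFactor ?? 1.0

    let config = SCStreamConfiguration()
    config.width = Int(CGFloat(display.width) * scale)
    config.height = Int(CGFloat(display.height) * scale)
    return config
}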
Hi all,
Apple dropping ongoing development for FireWire devices that were supported with the Core Audio driver standard is a catastrophe for a lot of struggling musicians who need to both keep up to date on the security updates that come with new OS releases and continue to utilise their hard-earned investments in very expensive and still-pristine audio devices that have been reduced to e-waste by Apple's seemingly tone-deaf ignorance of the cries for ongoing support.
I have one of said audio devices, and I'd like to keep using it while keeping my 2019 Intel MacBook Pro up to date with the latest security updates and OS features.
This is probably not the first time you gurus have had someone make the logical leap leading to a request like this, but I was wondering if it might somehow be possible to shoehorn the code that allowed previous versions of macOS to speak to the audio features of such devices into the Ventura version of the OS.
Would it be possible? Would it involve a lot of work? I don't think I'd be the only person willing to pay for a third-party application or utility that restored this functionality.
There have to be hundreds of thousands of people who would be happy to spare some cash to stop their multi-thousand-dollar investments in gear from being so thoughtlessly resigned to the scrap heap.
Any comments or layman-friendly explanations as to why this couldn’t happen would be gratefully received!
Thanks,
em
I'm creating an app that listens to another app's sound; in this use case, screen data is not needed.
But if I don't call SCStream's addStreamOutput(_:type:sampleHandlerQueue:) with the .screen type, the console shows this error:
[ERROR] _SCStream_RemoteVideoQueueOperationHandlerWithError:701 stream output NOT found. Dropping frame
Currently I'm setting SCStreamConfiguration.minimumFrameInterval to a large value (e.g. 0.1 fps) as a workaround, but it would be good if I could completely disable screen capture for best performance.
Is there any way to disable screen capture and capture only an app's audio?
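For completeness, here is the workaround spelled out as a sketch (capturesAudio requires macOS 13; whether the .screen output can be dropped entirely is exactly my open question):
import CoreMedia
import ScreenCaptureKit

// Sketch: capture audio while shrinking the video path as far as possible.
let config = SCStreamConfiguration()
config.capturesAudio = true
config.excludesCurrentProcessAudio = true
config.width = 2   // minimal video frames, since the .screen output can't be skipped
config.height = 2
config.minimumFrameInterval = CMTime(value: 10, timescale: 1) // one frame per 10 s, i.e. 0.1 fps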
Hello!
I am having trouble setting start times for songs when using the ApplicationMusicPlayer.
When I initialize a new MusicPlayer.Queue.Entry using the following constructor, I am seeing strange results:
init(
    _ playableMusicItem: PlayableMusicItem,
    startTime: TimeInterval? = nil,
    endTime: TimeInterval? = nil
)
It appears that any value I provide for startTime is also applied to the endTime. For example:
MusicPlayer.Queue.Entry(playable, startTime: TimeInterval(30), endTime: TimeInterval(183))
provides the following console output:
MusicPlayer.Queue.Entry(id: "3D6A3DA3-595E-4657-8DBA-DDD245BBB7EF", transientItem: PlayableMusicItem, startTime: 30.0, endTime: 30.0)
I have also tried setting the endTime to nil with the same result. Does anyone have any experience setting start times for songs using the MusicKit ApplicationMusicPlayer?
Any feedback is greatly appreciated!
Hello,
I'm new to the Swift MusicKit API and am starting with the implementation in iOS 16.
I'm getting stuck on an issue where there is no background or text color associated with the Artwork object. Is this something you have to make an additional property request for, and if so, how do you do that?
let catalogSearch = MusicCatalogResourceRequest<Album>(matching: \.id, equalTo: item.id)
let catalogResponse = try await catalogSearch.response()

guard let firstItem = catalogResponse.items.first else {
    return
}
In this example, firstItem.artwork only contains the url and what look like incorrect maximum width/height values.
Here's a printout of firstItem.artwork:
Optional(Artwork(
urlFormat: "musicKit://artwork/library/5F37858D-F46B-4F12-BA67-40FA8DD63D87/{w}x{h}?at=item&fat=&id=7718670444435992305&lid=5F37858D-F46B-4F12-BA67-40FA8DD63D87&mt=music&aat=Music122/v4/37/25/f5/3725f515-249f-7b91-77bb-f479cd48201c/22UMGIM32254.rgb.jpg",
maximumWidth: 0,
maximumHeight: 0
))
I've just begun to dip my toes into the iOS16 waters.
One of the first things that I've attempted is to edit a library playlist using:
try await MusicLibrary.shared.edit(targetPlaylist, items: tracksToAdd)
Where targetPlaylist is of type MusicItemCollection<MusicKit.Playlist>.Element and tracksToAdd is of type [Track]
The targetPlaylist was created, using new iOS16 way, here:
let newPlaylist = try await MusicLibrary.shared.createPlaylist(name: name, description: description)
tracksToAdd is derived by performing a MusicLibraryRequest on a specific playlist ID, and then doing something like this:
if let tracksToAdd = try await playlist.with(.tracks).tracks {
    // add tracks to target playlist
}
My problem is that when I attempt the edit, I am faced with a rather sad-looking crash.
libdispatch.dylib`dispatch_group_leave.cold.1:
0x10b43d62c <+0>: mov x8, #0x0
0x10b43d630 <+4>: stp x20, x21, [sp, #-0x10]!
0x10b43d634 <+8>: adrp x20, 6
0x10b43d638 <+12>: add x20, x20, #0xfbf ; "BUG IN CLIENT OF LIBDISPATCH: Unbalanced call to dispatch_group_leave()"
0x10b43d63c <+16>: adrp x21, 40
0x10b43d640 <+20>: add x21, x21, #0x260 ; gCRAnnotations
0x10b43d644 <+24>: str x20, [x21, #0x8]
0x10b43d648 <+28>: str x8, [x21, #0x38]
0x10b43d64c <+32>: ldp x20, x21, [sp], #0x10
-> 0x10b43d650 <+36>: brk #0x1
I assume that I must be doing something wrong, but I frankly have no idea how to troubleshoot this.
Any help would be most appreciated. Thanks. @david-apple?
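One alternative I may try, sketched below under the assumption that Track is accepted by MusicLibrary's add(_:to:) method: adding the tracks to the target playlist one at a time instead of in a single bulk edit.
// Sketch: add tracks individually rather than in one bulk edit call.
for track in tracksToAdd {
    _ = try await MusicLibrary.shared.add(track, to: targetPlaylist)
}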