The list of certificates on the Apple Developer web console shows the expiry of my FairPlay Streaming certificate as 'Never'.
However, if I download the same certificate and import it into my Keychain, the certificate details show the expiry as 11 OCT 2023.
Which of these is correct? If the expiry in the certificate is correct, how do I renew it safely?
In my app, the code below fails when calling streamingContentKeyRequestData:
guard
    let contentIdData = (loadingRequest.request.url?.host ?? "").data(using: .utf8),
    let spcData = try? loadingRequest.streamingContentKeyRequestData(
        forApp: certificate!, // This certificate is expired
        contentIdentifier: contentIdData,
        options: nil
    )
else {
    print("Error: Failed to generate SPC data due to expired certificate.")
    loadingRequest.finishLoading(with: NSError(domain: "com.example.error", code: -3, userInfo: nil))
    return false
}
Streaming
Deep dive into the technical specifications that influence seamless playback for streaming services, including bitrates, codecs, and caching mechanisms.
Hi
I'm trying to stream an H.264 video feed coming from a Uniview IP camera in a browser, but the stream just doesn't display. Either I get a single frame or just a black screen. I get the same issue in Safari on the Mac and in any browser on an iPhone. However, the video stream works just fine using hls.js on Windows or Android.
We are grabbing the RTSP stream from the camera and using nginx to serve the .m3u8 URL. However, even if we save the stream to a file and try to play it on the iPhone, it has the same issue (unless we use a separate media player like VLC).
I know that if we use ffmpeg to re-encode the video as H.264, rather than just copying it, it will play. My guess is that there is an incompatibility between how Uniview encodes the video and what Apple can accept.
I've asked Uniview and they are not sure what the problem is either.
Is there a way to get more debug information on why a particular HLS stream is failing in Safari on the Mac or iPhone?
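If you can load the same .m3u8 inside a small test app, AVFoundation's per-item logs often say why a rendition can't be decoded; running mediastreamvalidator against the playlist is also worth trying. Below is a minimal sketch (the playlist URL is a placeholder for the nginx-served one) that plays the stream and dumps the error and access logs.

import AVFoundation
import Foundation

// Minimal sketch: point AVPlayer at the problematic playlist and dump its logs.
// The URL is a placeholder for the nginx-served .m3u8.
let url = URL(string: "https://example.com/camera/stream.m3u8")!
let item = AVPlayerItem(url: url)
let player = AVPlayer(playerItem: item)
player.play()

// Give playback a chance to fail, then inspect the logs.
DispatchQueue.main.asyncAfter(deadline: .now() + 10) {
    for event in item.errorLog()?.events ?? [] {
        print("error:", event.errorDomain, event.errorStatusCode, event.errorComment ?? "", event.uri ?? "")
    }
    if let access = item.accessLog()?.events.last {
        print("indicated bitrate:", access.indicatedBitrate,
              "dropped video frames:", access.numberOfDroppedVideoFrames)
    }
}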
Topic: Media Technologies
SubTopic: Streaming
Hi,
I am writing to seek any help or workaround regarding an issue I have encountered while implementing Low-Latency HLS playback using the AVAssetResourceLoaderDelegate.
I have been successfully loading playlists during HLS live playback using the AVAssetResourceLoaderDelegate. However, after introducing Low-Latency HLS, I have run into a problem.
When AVPlayer loads low-latency content natively, everything works fine. But when I use the delegate for loading, I encounter the following error from AVPlayer's status observer:
CoreMediaErrorDomain -15410
Low Latency: Server must support http2 ECN and SACK
Playback does not stop, so at first it seems there is no problem, but a critical piece is missing: playback never achieves the expected low latency and behaves similarly to standard HLS.
Additionally, this behavior only occurs on iOS 16 devices and simulators. On iOS 17 simulators and devices, the error message does not appear, and the latency remains low as expected.
Therefore, I suspect that something is being misjudged in the verification step inside AVPlayer's internal implementation.
Since our app needs to support iOS 16, I would appreciate any solutions, methods to try, or workarounds that you could share regarding this issue.
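For reference, this is roughly the observation where the error surfaces; a minimal sketch, assuming playerItem is the AVPlayerItem created with the custom resource loader (the delegate-based loading itself is omitted):

import AVFoundation

// Minimal sketch: watch the item's status and dump the error log, which is
// where the CoreMediaErrorDomain -15410 entry shows up on iOS 16.
var statusObservation: NSKeyValueObservation?

func observe(_ playerItem: AVPlayerItem) {
    statusObservation = playerItem.observe(\.status, options: [.new]) { item, _ in
        if item.status == .failed {
            print("item failed:", item.error ?? "unknown error")
        }
        for event in item.errorLog()?.events ?? [] {
            print(event.errorDomain, event.errorStatusCode, event.errorComment ?? "")
        }
    }
}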
Thank you.
Hi there,
After upgrading to iOS 18, I noticed that ApplicationMusicPlayer.Queue behavior breaks if at least one song added to the queue is also in the Apple Music library on the device.
The resulting behavior is that the queue does not accept all the items, and only the items that are in the library are playable in the queue.
The expected behavior, which was also the behavior on iOS 17, was that all the items would be added to the queue successfully. I confirmed this behavior on a separate test device running iOS 17.7.
The items added are all fetched via MusicCatalogResourceRequest<Song>, so I would expect that a requested song being present in the library would have no effect.
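For context, this is roughly the pattern in use; a minimal sketch, with placeholder catalog IDs and assuming MusicKit authorization has already been granted:

import MusicKit

// Minimal sketch of the flow described above: fetch catalog songs by ID and
// hand them to ApplicationMusicPlayer. The IDs are placeholders.
func playCatalogSongs() async throws {
    let ids: [MusicItemID] = ["1440857781", "1440857782"]
    let request = MusicCatalogResourceRequest<Song>(matching: \.id, memberOf: ids)
    let response = try await request.response()

    let player = ApplicationMusicPlayer.shared
    player.queue = ApplicationMusicPlayer.Queue(for: response.items)
    // On iOS 18, only the items also present in the library appear to play.
    try await player.play()
}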
I use AVPlayer to play HLS video successfully on macOS Sonoma, but I encountered this error on macOS Sequoia. Please help me:
Error Domain=AVFoundationErrorDomain Code=-11833 ‘Cannot Decode’ UserInfo={NSUnderlyingError=0x600001e57330 {Error Domain=CoreMediaErrorDomain Code=-12906 ‘(null)’}, NSLocalizedFailureReason=The decoder required for this media cannot be found., AVErrorMediaTypeKey=vide, NSLocalizedDescription=Cannot Decode}
Thanks!
Case-ID: 9391388
Our application uses timed metadata as part of a rating control system.
We noticed a problem in production, and diagnosis shows that we stop receiving timed metadata on iOS 18 only.
Our live streams are primed with metadata at least once per second, but we are seeing extended gaps in receiving this content, in excess of 10 minutes.
We have also observed that this happens more as the player climbs the bitrate ladder, and it doesn't happen if we cap to a low resolution, i.e. a preferredMaximumResolution of 768x432.
Furthermore, if we throttle network conditions after we stop receiving metadata, we start receiving it again.
The following is a simple example that demonstrates the behaviour above. Unfortunately I cannot share the live stream endpoint that is primed with metadata publicly, but I can provide it privately to Apple to reproduce the problem.
import UIKit
import AVKit

class ViewController: UIViewController, AVPlayerItemMetadataOutputPushDelegate {
    var player: AVPlayer?
    var itemMetadataOutput: AVPlayerItemMetadataOutput?

    override func viewDidAppear(_ animated: Bool) {
        guard let url = URL(string: "endpoint redacted") else {
            return
        }
        let player = AVPlayer(url: url)
        let controller = AVPlayerViewController()
        controller.player = player
        self.player = player
        present(controller, animated: true) {
            player.play()
            let currentItem = player.currentItem
            let itemMetadataOutput = AVPlayerItemMetadataOutput(identifiers: nil)
            self.itemMetadataOutput = itemMetadataOutput
            self.itemMetadataOutput?.setDelegate(self, queue: .main)
            currentItem?.add(itemMetadataOutput)
        }
    }

    public func metadataOutput(_ output: AVPlayerItemMetadataOutput,
                               didOutputTimedMetadataGroups groups: [AVTimedMetadataGroup],
                               from track: AVPlayerItemTrack?) {
        print("received metadata \(Date())")
    }
}
Hello,
I submitted a Request for a Deployment Package. However, I have not received any emails yet. Could you please let me know when I can expect a confirmation message?
Thank you.
I cannot mirror or extend my screen from a Mac mini M2 to a 10th-generation iPad. Whenever I click "mirror or extend screen", my external Mac display shows "no signal", refreshes, and comes back on, while my iPad locks and mirroring or extending fails. However, I can mirror my iPad screen to the Mac mini M2. Everything was working earlier; suddenly it is not.
iOS 18 added support for the WebRTC HEVC RFC 7789 RTP Payload Format (112001659). How can I determine whether iOS 18 WebRTC uses hardware or software decoding for HEVC?
Hello, we have an HLS streaming app and we use AVPlayer for HLS playback. We want to implement a dynamic resolution feature based on the user's selection. For example, if the user selects 1080p, they should only ever be shown 1080p. We have tried the preferredMaximumResolution and preferredPeakBitRate parameters, but AVPlayer does not enforce them as an exact rendition: setting preferredMaximumResolution = CGSize(width: 1920, height: 1080) does not force the 1080p profile only, and the player drops the resolution to 720p, which we do not want when the user selected 1080p. Is there any method to force it, even if the stream stalls? Thank you in advance.
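For reference, a minimal sketch of the capping properties mentioned above (the URL and bitrate are placeholders); note that they declare upper bounds rather than pinning an exact rendition, which matches the behaviour described:

import AVFoundation
import CoreGraphics

// Minimal sketch: cap the variants AVPlayer may select.
// These are upper bounds only; AVPlayer can still switch to a lower rendition
// when bandwidth drops, which is the behaviour described above.
let item = AVPlayerItem(url: URL(string: "https://example.com/master.m3u8")!)
item.preferredMaximumResolution = CGSize(width: 1920, height: 1080)
item.preferredPeakBitRate = 8_000_000 // bits per second, placeholder value

let player = AVPlayer(playerItem: item)
player.play()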
Topic: Media Technologies
SubTopic: Streaming
Tags: FairPlay Streaming, Media Player, Video, HTTP Live Streaming
We are experiencing an issue with our HLS MPEG-TS streams on Apple devices, where the AVPlayer in our iOS app and in Safari jumps back to the start when the player automatically changes quality. This occurs despite the stream still indicating that it is live, and there is no change in the seek bar. After testing our streams with the Apple HLS Validator, the only problem that occurred was a "Measured peak bitrate compared to multivariant playlist declared value exceeds error tolerance" error.
On Chrome and in our Android app this playback bug does not happen. Has anyone else experienced similar issues with AVPlayer?
We're integrating a web-based group calling application within a native iOS application, and we're finding that every time a CallKit session gets fully established, the web-based media streams break, rendering as gray with no audio.
Up to iOS 18 we worked around it by not fulfilling the call start action, but that's no longer an option because the audio stopped being automatically redirected to the speaker. We would now need the CXProvider's didActivateAudioSession callback, but that breaks the video.
The sample project loads up a simple webpage in a WKWebView which contains a video tag streaming the media from the device's camera.
At the same time it sets up a new CallKit session by requesting and fulfilling a CXStartCallAction transaction.
You will notice that the media doesn't render and, if you follow the warnings we left, you will find that not fulfilling the CXStartCallAction fixes it.
Unfortunately that's not a workaround we can use, as we need the CXProvider delegate to inform us about audio session changes so we can redirect the audio to the speaker (so the proximity sensor doesn't activate and locking the screen doesn't end the call).
Any insights or workarounds would be greatly appreciated.
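For context, a minimal sketch of the CallKit flow described above (class and method names are ours for illustration); fulfilling the CXStartCallAction is the step that coincides with the WKWebView media breaking:

import CallKit
import AVFAudio

// Minimal sketch of the CallKit flow described above: request a start-call
// action, fulfill it in the provider delegate, and listen for audio session
// activation so the audio route can be adjusted.
final class CallManager: NSObject, CXProviderDelegate {
    private let provider = CXProvider(configuration: CXProviderConfiguration())
    private let callController = CXCallController()

    override init() {
        super.init()
        provider.setDelegate(self, queue: nil)
    }

    func startCall(handle: String) {
        let action = CXStartCallAction(call: UUID(), handle: CXHandle(type: .generic, value: handle))
        callController.request(CXTransaction(action: action)) { error in
            if let error { print("start call failed:", error) }
        }
    }

    func provider(_ provider: CXProvider, perform action: CXStartCallAction) {
        // Fulfilling here is what coincides with the web media breakage described above.
        action.fulfill()
    }

    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        // Route audio to the speaker once CallKit activates the session.
        try? audioSession.overrideOutputAudioPort(.speaker)
    }

    func providerDidReset(_ provider: CXProvider) {}
}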
Hello,
I use AVPlayer in my project to play network movies.
Most movies play normally, but I found that the sound sometimes disappears when I play one specific 4K network video stream.
The video keeps playing, but the audio stops after the video has been playing for a while.
If I pause the player and then resume, the sound comes back but disappears again after several seconds.
Checking the AVPlayerItem status:
isPlaybackLikelyToKeepUp == true
isPlaybackBufferEmpty == false
player.volume > 0
According to the values above, it does not seem to be caused by an empty playback buffer or a volume issue. I am very confused by this situation.
Movie information
Video
Format : AVC
Format/Info : Advanced Video Codec
Format profile : High L5.1
Codec ID : avc1
Codec ID/Info : Advanced Video Coding
Bit rate mode : Variable
Bit rate : 100.0 Mb/s
Width : 3 840 pixels
Height : 2 160 pixels
Display aspect ratio : 16:9
Frame rate mode : Constant
Frame rate : 29.970 (30000/1001) FPS
Audio
Format : AAC LC
Format/Info : Advanced Audio Codec Low Complexity
Codec ID : mp4a-40-2
Duration : 5 min 19 s
Bit rate mode : Constant
Bit rate : 192 kb/s
Nominal bit rate : 48.0 kb/s
Channel(s) : 2 channels
Channel layout : L R
Sampling rate : 48.0 kHz
Frame rate : 46.875 FPS (1024 SPF)
Does anyone know if AVPlayer has this limitation when playing high-bitrate movie streams, and are there any solutions?
HLS live streaming of 4K is not displaying any video, but audio is streaming.
I'm getting the following errors in the console, which show that it is failing to decode every frame.
Can I get some help as to what these error codes refer to and why decoding would fail?
08:30:42.675879-0800 videocodecd AppleAVD: AppleAVDDecodeFrameInternal(): avdDec - Frame# 3588, DecodeFrame failed with error: 0x196
08:30:42.675908-0800 videocodecd AppleAVD: AppleAVDDisplayCallback(): Asking fig to drop frame # 3588 with err -12909 - internalStatus: 315
08:30:42.697412-0800 videocodecd AppleAVD: AppleAVDDecodeFrameResponse(): Frame# 3589 DecodeFrame failed with error 0x00000196
08:30:42.697876-0800 videocodecd AppleAVD: AppleAVDDecodeFrameInternal(): failed - error: 406
Hi,
I have a use case where I'd like to handle and prevent automatic retries whenever certain errors occur during FairPlay content key requests.
Here's the current flow:
1. The FairPlay certificate is requested and obtained from my server.
2. makeStreamingContentKeyRequestData is called on the keyRequest.
3. The license server returns a 403 along with a body containing a JSON payload with the detailed code and message.
4. The error is caught and handled properly by calling AVContentKeyRequest.processContentKeyResponseError.
5. The AVContentKeySession automatically retries up to 8 times by providing a new key request through public func contentKeySession(_ session: AVContentKeySession, didProvide keyRequest: AVContentKeyRequest).
6. My license server gets hit with 8 requests that will always result in a 403; these retries are useless.
7. My custom error is successfully caught later down the line through AVPlayerItem.observe(\.status), which is great.
The thing is, I'd like to catch the 403 error and prevent any retry from being made at step 5, ideally through
public func contentKeySession(_ session: AVContentKeySession, contentKeyRequest keyRequest: AVContentKeyRequest, didFailWithError err: Error)
I've looked for quite a while and just can't seem to find any way of achieving this. Is this not supported at all?
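For reference, a minimal sketch of the delegate flow in steps 2-5 above, including the didFailWithError callback I would like to use to stop the retries (the certificate, asset identifier, and license call are placeholders):

import AVFoundation

// Sketch of the current delegate flow (steps 2-5 above). The certificate,
// asset identifier, and license-request helper are placeholders.
final class KeyDelegate: NSObject, AVContentKeySessionDelegate {
    var appCertificate = Data()
    var assetIDData = Data()

    func requestLicense(spcData: Data, completion: @escaping (Result<Data, Error>) -> Void) {
        // Placeholder for the POST to our license server, which returns the 403.
        completion(.failure(NSError(domain: "LicenseServer", code: 403, userInfo: nil)))
    }

    func contentKeySession(_ session: AVContentKeySession, didProvide keyRequest: AVContentKeyRequest) {
        keyRequest.makeStreamingContentKeyRequestData(forApp: appCertificate,
                                                      contentIdentifier: assetIDData,
                                                      options: nil) { spcData, error in
            if let error {
                keyRequest.processContentKeyResponseError(error)
                return
            }
            self.requestLicense(spcData: spcData!) { result in
                switch result {
                case .success(let ckcData):
                    keyRequest.processContentKeyResponse(
                        AVContentKeyResponse(fairPlayStreamingKeyResponseData: ckcData))
                case .failure(let serverError):
                    // The 403 described above; AVContentKeySession then retries via didProvide (step 5).
                    keyRequest.processContentKeyResponseError(serverError)
                }
            }
        }
    }

    // This is the callback I would like to use to stop further retries.
    func contentKeySession(_ session: AVContentKeySession,
                           contentKeyRequest keyRequest: AVContentKeyRequest,
                           didFailWithError err: Error) {
        print("key request failed:", err)
    }
}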
Hi,
I have an iOS app and we are using FairPlay DRM to play videos. In the iOS app we allow offline download of the videos, and hence we obtain a persistent FairPlay license. In the iOS app everything is working fine.
Now we have taken the same app and built it with Mac Catalyst. In the Mac Catalyst app we are not able to play the video and get error code -42650.
We are able to get the persistent license from the server, but when we play the video with that license we get the error. Below are the logs:
2024-12-06 22:05:48.911266+0530 0x4dffe2 Default 0x0 85505 0 teachonline: (MediaToolbox) [com.apple.coremedia:] <<<< FigPKDKeyManager >>>> keyManager_processOfflineKeyInternal: 0x600000322000 160D4519-C60B-4FD0-B69A-20B2A4597017 created decrypt context:0x0 with offline key; updated offline key:0x0 err:-42650
2024-12-06 22:05:48.911369+0530 0x4dffe2 Default 0x0 85505 0 teachonline: (MediaToolbox) [com.apple.coremedia:player] <<<< FigStreamPlayer >>>> fpfs_ensureDecryptorHasStarted: [0x7fc44e4dc520|P/NW] <0x7fc44fa44000|I/SRA.01>: track 1 latching decryptorFailure -42650
85505 0 teachonline: (MediaToolbox) [com.apple.coremedia:player] <<<< FigStreamPlayer >>>> fpfs_StopPlayingItem: [0x7fc44e4dc520|P/NW] <0x7fc44fa44000|I/SRA.01>: Pausing, err=Error Domain=CoreMediaErrorDomain Code=-42650 "(null)"
I have copied only the lines which has errors. You can download the full logs from https://drive.google.com/file/d/1feb9pKZERUr--PMt6m-6IrO_mDvoFbjO/view?usp=sharing
Can you please help me fix this issue?
In an m3u8 manifest, audio EXT-X-MEDIA tags usually contain a CHANNELS attribute with the audio channel count, like so:
#EXT-X-MEDIA:TYPE=AUDIO,URI="audio_clear_eng_stereo.m3u8",GROUP-ID="default-audio-group",LANGUAGE="en",NAME="stream_5",AUTOSELECT=YES,CHANNELS="2"
Is it possible to get this info from AVPlayer, AVMediaSelectionOption or some related API?
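In the meantime, a fallback sketch that fetches the multivariant playlist directly and extracts CHANNELS from the EXT-X-MEDIA lines with a regular expression; this assumes the playlist URL is reachable outside of AVPlayer and does not answer whether AVPlayer itself exposes the value:

import Foundation

// Fallback sketch: parse CHANNELS out of the multivariant playlist directly.
// Assumes the playlist URL is reachable outside of AVPlayer.
func audioChannelCounts(fromPlaylist url: URL) async throws -> [String: Int] {
    let (data, _) = try await URLSession.shared.data(from: url)
    let playlist = String(decoding: data, as: UTF8.self)

    var channelsByName: [String: Int] = [:]
    for line in playlist.split(separator: "\n")
    where line.hasPrefix("#EXT-X-MEDIA:") && line.contains("TYPE=AUDIO") {
        let name = firstMatch(of: #"NAME="([^"]+)""#, in: String(line))
        let channels = firstMatch(of: #"CHANNELS="(\d+)"#, in: String(line))
        if let name, let channels, let count = Int(channels) {
            channelsByName[name] = count
        }
    }
    return channelsByName
}

// Tiny regex helper returning the first capture group, if any.
private func firstMatch(of pattern: String, in text: String) -> String? {
    guard let regex = try? NSRegularExpression(pattern: pattern),
          let match = regex.firstMatch(in: text, range: NSRange(text.startIndex..., in: text)),
          let range = Range(match.range(at: 1), in: text) else { return nil }
    return String(text[range])
}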
Hi, I'm trying to decode an HLS livestream with VideoToolbox. The CMSampleBuffer is successfully created (OSStatus == noErr). When I enqueue the CMSampleBuffer to an AVSampleBufferDisplayLayer, the view doesn't display anything, and the status of the AVSampleBufferDisplayLayer is 1 (rendering).
When I use a VTDecompressionSession to convert the CMSampleBuffer to a CVPixelBuffer, the VTDecompressionOutputCallback returns -8969 (bad data error).
What do I need to fix in my code? Am I incorrectly parsing the data from the segment for the CMSampleBuffer?
let segmentData = try await downloadSegment(from: segment.url)
let (sps, pps, idr) = try parseH264FromTSSegment(tsData: segmentData)
if self.formatDescription == nil {
    self.formatDescription = try CMFormatDescription(h264ParameterSets: [sps, pps])
}
if let sampleBuffer = try createSampleBuffer(from: idr, segment: segment) {
    try self.decodeSampleBuffer(sampleBuffer)
}
func parseH264FromTSSegment(tsData: Data) throws -> (sps: Data, pps: Data, idr: Data) {
    let tsSize = 188
    var pesData = Data()
    for i in stride(from: 0, to: tsData.count, by: tsSize) {
        let tsPacket = tsData.subdata(in: i..<min(i + tsSize, tsData.count))
        guard let payload = extractPayloadFromTSPacket(tsPacket) else { continue }
        pesData.append(payload)
    }
    let nalUnits = parseNalUnits(from: pesData)
    var sps: Data?
    var pps: Data?
    var idr: Data?
    for nalUnit in nalUnits {
        guard let firstByte = nalUnit.first else { continue }
        let nalType = firstByte & 0x1F
        switch nalType {
        case 7: // SPS
            sps = nalUnit
        case 8: // PPS
            pps = nalUnit
        case 5: // IDR
            idr = nalUnit
        default:
            break
        }
        if sps != nil, pps != nil, idr != nil {
            break
        }
    }
    guard let validSPS = sps, let validPPS = pps, let validIDR = idr else {
        throw NSError()
    }
    return (validSPS, validPPS, validIDR)
}
func extractPayloadFromTSPacket(_ tsPacket: Data) -> Data? {
    let syncByte: UInt8 = 0x47
    guard tsPacket.count == 188, tsPacket[0] == syncByte else {
        return nil
    }
    let payloadStart = (tsPacket[1] & 0x40) != 0
    let adaptationFieldControl = (tsPacket[3] & 0x30) >> 4
    var payloadOffset = 4
    if adaptationFieldControl == 2 || adaptationFieldControl == 3 {
        let adaptationFieldLength = Int(tsPacket[4])
        payloadOffset += 1 + adaptationFieldLength
    }
    guard adaptationFieldControl == 1 || adaptationFieldControl == 3 else {
        return nil
    }
    let payload = tsPacket.subdata(in: payloadOffset..<tsPacket.count)
    return payloadStart ? payload : nil
}
func parseNalUnits(from h264Data: Data) -> [Data] {
    let startCode = Data([0x00, 0x00, 0x00, 0x01])
    var nalUnits: [Data] = []
    var searchRange = h264Data.startIndex..<h264Data.endIndex
    while let range = h264Data.range(of: startCode, options: [], in: searchRange) {
        let nextStart = h264Data.range(of: startCode, options: [], in: range.upperBound..<h264Data.endIndex)?.lowerBound ?? h264Data.endIndex
        let nalUnit = h264Data.subdata(in: range.upperBound..<nextStart)
        nalUnits.append(nalUnit)
        searchRange = nextStart..<h264Data.endIndex
    }
    return nalUnits
}
private func createSampleBuffer(from data: Data, segment: HLSSegment) throws -> CMSampleBuffer? {
    var blockBuffer: CMBlockBuffer?
    let alignedData = UnsafeMutableRawPointer.allocate(byteCount: data.count, alignment: MemoryLayout<UInt8>.alignment)
    data.copyBytes(to: alignedData.assumingMemoryBound(to: UInt8.self), count: data.count)
    let blockStatus = CMBlockBufferCreateWithMemoryBlock(
        allocator: kCFAllocatorDefault,
        memoryBlock: alignedData,
        blockLength: data.count,
        blockAllocator: nil,
        customBlockSource: nil,
        offsetToData: 0,
        dataLength: data.count,
        flags: 0,
        blockBufferOut: &blockBuffer
    )
    guard blockStatus == kCMBlockBufferNoErr, let validBlockBuffer = blockBuffer else {
        alignedData.deallocate()
        throw NSError()
    }
    var sampleBuffer: CMSampleBuffer?
    var timing = [calculateTiming(for: segment)]
    var sampleSizes = [data.count]
    let sampleStatus = CMSampleBufferCreate(
        allocator: kCFAllocatorDefault,
        dataBuffer: validBlockBuffer,
        dataReady: true,
        makeDataReadyCallback: nil,
        refcon: nil,
        formatDescription: formatDescription,
        sampleCount: 1,
        sampleTimingEntryCount: 1,
        sampleTimingArray: &timing,
        sampleSizeEntryCount: sampleSizes.count,
        sampleSizeArray: &sampleSizes,
        sampleBufferOut: &sampleBuffer
    )
    guard sampleStatus == noErr else {
        alignedData.deallocate()
        throw NSError()
    }
    return sampleBuffer
}
private func decodeSampleBuffer(_ sampleBuffer: CMSampleBuffer) throws {
    guard let formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer) else {
        throw NSError()
    }
    if decompressionSession == nil {
        try setupDecompressionSession(formatDescription: formatDescription)
    }
    guard let session = decompressionSession else {
        throw NSError()
    }
    let flags: VTDecodeFrameFlags = [._EnableAsynchronousDecompression, ._EnableTemporalProcessing]
    var flagOut = VTDecodeInfoFlags()
    let status = VTDecompressionSessionDecodeFrame(
        session,
        sampleBuffer: sampleBuffer,
        flags: flags,
        frameRefcon: nil,
        infoFlagsOut: &flagOut)
    if status != noErr {
        throw NSError()
    }
}
private func setupDecompressionSession(formatDescription: CMFormatDescription) throws {
    self.formatDescription = formatDescription
    if let session = decompressionSession {
        VTDecompressionSessionInvalidate(session)
        self.decompressionSession = nil
    }
    var decompressionSession: VTDecompressionSession?
    var callback = VTDecompressionOutputCallbackRecord(
        decompressionOutputCallback: decompressionOutputCallback,
        decompressionOutputRefCon: Unmanaged.passUnretained(self).toOpaque())
    let status = VTDecompressionSessionCreate(
        allocator: kCFAllocatorDefault,
        formatDescription: formatDescription,
        decoderSpecification: nil,
        imageBufferAttributes: nil,
        outputCallback: &callback,
        decompressionSessionOut: &decompressionSession
    )
    if status != noErr {
        throw NSError()
    }
    self.decompressionSession = decompressionSession
}
let decompressionOutputCallback: VTDecompressionOutputCallback = { (
    decompressionOutputRefCon,
    sourceFrameRefCon,
    status,
    infoFlags,
    imageBuffer,
    presentationTimeStamp,
    presentationDuration
) in
    guard status == noErr else {
        print("Callback: \(status)")
        return
    }
    if let imageBuffer = imageBuffer {
    }
}
I have a low-latency HLS setup with fragmented MP4. When I try to validate it with the mediastreamvalidator tool, it gives the following error:
Detail: '(null)' is not a valid URL
Source: media playlisturl - segment url in that playlist
What does that error mean?
Topic: Media Technologies
SubTopic: Streaming
I am trying to validate a low-latency HLS fragmented MP4 setup with mediastreamvalidator. I get the following error:
Error: Invalid URL
Detail: '(null)' is not a valid URL
Source: mediaplaylistURL.m3u8 - segmentURL.mp4
The mediastreamvalidator version is 1.23.14.
What does that error mean?
Topic: Media Technologies
SubTopic: Streaming