Render advanced 3D graphics and perform data-parallel computations on graphics processors with Metal.

Posts under the Metal tag

185 Posts
Metal useResource vs. MTLFence
Hello, I'm tracking down a bug where useResource doesn't seem to apply proper synchronization when a resource is produced by a render pass and then consumed by a compute pass; but when I use an MTLFence to signal and wait between the render and compute encoders, the artifact goes away. The resource is created with MTLHazardTrackingModeTracked, and useResource is called on the compute encoder after the render pass. Metal API Validation doesn't report any warnings or errors. Am I misunderstanding the difference between the two APIs? I dug through the Metal documentation, and it looks like useResource should handle synchronization given that the resource has MTLHazardTrackingModeTracked; on the other hand, MTLFence should be used to ensure proper synchronization between command encoders. Can someone clarify the difference between the two APIs and when to use each?
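For reference, here is a minimal sketch of the fence pattern described above that makes the artifact go away; the device, command buffer, and texture names are placeholders rather than code from the original project:

import Metal

let fence = device.makeFence()!

let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor)!
// ... draw calls that write the shared texture ...
renderEncoder.updateFence(fence, after: .fragment) // signal once fragment work has finished
renderEncoder.endEncoding()

let computeEncoder = commandBuffer.makeComputeCommandEncoder()!
computeEncoder.waitForFence(fence)                 // block the compute pass until the render pass is done
computeEncoder.useResource(texture, usage: .read)  // declares residency/usage for this pass
// ... dispatch that reads the texture ...
computeEncoder.endEncoding()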
3 replies · 0 boosts · 80 views · 1d ago
WWDC25: Combining Metal and ML
The WWDC25 session "Combine Metal 4 machine learning and graphics" demonstrated a way to combine neural networks with the graphics pipeline directly through shaders, using texture compression as an example. However, there is no mention of which ML technique is used to compress the texture. Can anyone point me to some well-known models for this particular use case shown at WWDC25?
2 replies · 0 boosts · 291 views · 4d ago
Xcode 26 Beta 3: missing Metal Toolchain
Hello community, after being unable to use the Metal Toolchain with Xcode 26 beta 1 and 2 even after following the workarounds, I am still unable to use the downloaded Metal compiler on my device under Xcode 26 beta 3. Building any project with a .metal file results in:

warning: Could not read serialized diagnostics file: error("Failed to open diagnostics file") (in target '<target>' from project '<project>')
error: error: cannot execute tool 'metal' due to missing Metal Toolchain; use: xcodebuild -downloadComponent MetalToolchain
Command CompileMetalFile failed with a nonzero exit code

Here is my build environment:

$ xcodebuild -version
Xcode 26.0
Build version 17A5276g

I have also checked the downloaded Metal toolchain:

$ xcodebuild -downloadComponent metalToolchain -exportPath /tmp/MyMetalExport/
2025-07-13 13:16:17.293 xcodebuild[2153:34019] IDEDownloadableMetalToolchainCoordinator: Failed to remount the Metal Toolchain: The file “com.apple.MobileAsset.MetalToolchain-v17.1.5276.7.3KEJwX” couldn’t be opened because you don’t have permission to view it.
Beginning asset download...
2025-07-13 13:16:17.427 xcodebuild[2153:34022] IDEDownloadableMetalToolchainCoordinator: Failed to remount the Metal Toolchain: The file “com.apple.MobileAsset.MetalToolchain-v17.1.5276.7.3KEJwX” couldn’t be opened because you don’t have permission to view it.
Downloaded asset to: /System/Library/AssetsV2/com_apple_MobileAsset_MetalToolchain/47af11e2964f385d510c6a9d1a49c8165f334a51.asset/AssetData/Restore/022-19457-052.dmg
Beginning asset export...
Done exporting: /tmp/MyMetalExport/MetalToolchain-17A5276g.exportedBundle
Done downloading: Metal Toolchain 17A5276g.

The ExportMetadata.plist has a buildUpdateVersion of 17A5276g. Any suggestions are greatly appreciated.
0 replies · 2 boosts · 87 views · 6d ago
Xcode 26 beta 3 - Metal toolchain installed but not working
Under Xcode 26 beta 3 I'm trying to build a project which uses Metal. I've installed Metal Toolchain 26.0 under Settings > Components, but when I start a build it fails during the "Prepare build" step with the following error (repeated many times):

stat(/var/run/com.apple.security.cryptexd/mnt/com.apple.MobileAsset.MetalToolchain-v17.1.5276.7.Pb9SLL/Metal.xctoolchain/usr/bin/clang): No such file or directory (2)

I've confirmed that there is in fact no 'clang' binary in that directory. I've tried using xcode-select to set the Xcode 26 beta app as the active developer directory, and xcodebuild -version shows:

Xcode 26.0
Build version 17A5276g

Any ideas on other things to try?
4 replies · 1 boost · 136 views · 1w ago
How to define vertex and fragment shader code for visionOS
Hello experts, I have implemented 3D Gaussian splatting rendering through the CompositorServices framework, but I want to port it to the RealityKit framework, and I am not sure whether RealityKit meets the requirements. In order to port the code, I need to render a large number of instanced planes (instanced mesh support is necessary) with completely custom vertex and fragment shader code, including using instance_id and discarding pixels (calling the discard_fragment function). Some Metal code snippets:

vertex ColorInOut splatVertexShader(uint vertexID [[vertex_id]],
                                    uint instanceID [[instance_id]],
                                    ushort amp_id [[amplification_id]],
                                    constant Splat* splatArray [[buffer(BufferIndexSplat)]],
                                    constant UniformsArray& uniformsArray [[buffer(BufferIndexUniforms)]],
                                    constant int32_t* splatIndices [[buffer(BufferIndexSplatIndex)]])
{
    // ... (do some calculations)
}

fragment half4 splatFragmentShader(ColorInOut in [[stage_in]])
{
    // ...
    if (/* a certain condition */) {
        discard_fragment();
    }
}

I have learned that CustomMaterial does not support visionOS, and Shader Graph does not support custom vertex/fragment shader code. I also noticed the LowLevelMesh interface, but I didn't find any sample code with custom shader code, so it's not clear whether it would be able to achieve what I need. Can these requirements be met using RealityKit?
0 replies · 0 boosts · 321 views · 1w ago
CAMetalLayer nextDrawable crash
Hi, my application hits the crash backtrace below at a very low repro rate among public users; I don't see it correlate with a specific iOS version or iPhone model. The last line from my application's code is a call to the CAMetalLayer nextDrawable API. From some basic study, I suppose it may relate to a wrong CAMetalLayer configuration, such as:

frame property w or h <= 0.0
bounds property w or h <= 0.0
drawableSize w or h <= 0.0, or w or h > the maximum value (like 16384)

Is my thinking above right or not? Could the UIView that my CAMetalLayer is attached to also cause such a nextDrawable crash? Thanks a lot.

Main Thread - Crashed
libsystem_kernel.dylib __pthread_kill
libsystem_c.dylib abort
libsystem_c.dylib __assert_rtn
Metal MTLReportFailure.cold.1
Metal MTLReportFailure
Metal _MTLMessageContextEnd
Metal -[MTLTextureDescriptorInternal validateWithDevice:]
AGXMetalA13 0x245b1a000 + 4522096
QuartzCore allocate_drawable_texture(id<MTLDevice>, __IOSurface*, unsigned int, unsigned int, MTLPixelFormat, unsigned long long, CAMetalLayerRotation, bool, NSString*, unsigned long)
QuartzCore get_unused_drawable(_CAMetalLayerPrivate*, CAMetalLayerRotation, bool, bool)
QuartzCore CAMetalLayerPrivateNextDrawableLocked(CAMetalLayer*, CAMetalDrawable**, unsigned long*)
QuartzCore -[CAMetalLayer nextDrawable]
SpaceApp -[MetalRender renderFrame:] MetalRenderer.mm:167
SpaceApp -[FrameBuffer acceptFrame:] VideoRender.mm:173
QuartzCore CA::Display::DisplayLinkItem::dispatch_(CA::SignPost::Interval<(CA::SignPost::CAEventCode)835322056>&)
QuartzCore CA::Display::DisplayLink::dispatch_items(unsigned long long, unsigned long long, unsigned long long)
QuartzCore CA::Display::DisplayLink::dispatch_deferred_display_links(unsigned int)
UIKitCore _UIUpdateSequenceRun
UIKitCore schedulerStepScheduledMainSection
UIKitCore runloopSourceCallback
CoreFoundation __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__
CoreFoundation __CFRunLoopDoSource0
CoreFoundation __CFRunLoopDoSources0
CoreFoundation __CFRunLoopRun
CoreFoundation CFRunLoopRunSpecific
GraphicsServices GSEventRunModal
UIKitCore -[UIApplication _run]
UIKitCore UIApplicationMain
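As a sanity check for the layer-configuration theory above, a minimal defensive sketch might look like this; the 16384 limit is the suspected maximum from the question, not a confirmed value:

import QuartzCore

func safeNextDrawable(from layer: CAMetalLayer) -> CAMetalDrawable? {
    let size = layer.drawableSize
    let maxDimension: CGFloat = 16384 // assumed device texture limit
    guard size.width > 0, size.height > 0,
          size.width <= maxDimension, size.height <= maxDimension else {
        return nil // skip the frame instead of tripping the texture-descriptor assertion
    }
    return layer.nextDrawable()
}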
3 replies · 0 boosts · 258 views · 1w ago
Metal Compositor Service & Persona (visionOS)
Hello, I'm currently trying to make a collaborative app, but it only works in a RealityView; when I try to use a CompositorLayer as below, the Personas disappear.

ImmersiveSpace(id: "ImmersiveSpace-Metal") {
    CompositorLayer(configuration: MetalLayerConfiguration()) { layerRenderer in
        SpatialRenderer_InitAndRun(layerRenderer)
    }
}

Is there any potential solution to see Personas in a Metal view? Thanks in advance!
1 reply · 0 boosts · 636 views · 3d ago
How to customize shader code for visionOS?
Hello experts, I'm trying to implement a material with custom shader code, but I saw that visionOS doesn't allow you to inject custom Metal functions or use CustomMaterial like iOS/macOS, nor can you directly write Metal Shading Language (.metal) and use it through ShaderGraphMaterial. So my question is: if I want to implement my own shader code, how should I do it?
1 reply · 0 boosts · 374 views · 2w ago
After playing an HDR video on iPhone for a while, the HDR effect disappears and the screen brightness decreases
When I use AVPlayer to obtain the video frames (CVPixelBufferRef) of an HDR video and use AVSampleBufferDisplayLayer to display them on the screen, after a period of time the HDR video content and the screen gradually darken, losing the HDR effect.

Steps to reproduce:
1. Create an AVPlayer to loop an HDR video, specifying the video frame format as kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange.
2. Create a timer to get the video frame CVPixelBufferRef at 30 frames per second.
3. Use AVSampleBufferDisplayLayer to display the CVPixelBufferRef on the screen.
4. Don't operate the phone; wait for a period of time (such as 40 minutes). The HDR effect disappears and the screen darkens.

Note: You need an iPhone device running iOS 18.5 or below, and you need to ensure that the HDR video plays in a loop, i.e., that the screen continues to display HDR content; depending on the device, you may need to wait 20-40 minutes. In the iPhone Photos app, the same problem occurs after playing an HDR video in a loop for a long time.

Expected results: When rendering HDR content for a long time, the HDR effect is always preserved, and neither the HDR content nor the screen darkens.

Current results: After about 20-40 minutes, the HDR effect disappears and the screen darkens.
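A rough sketch of the reproduction path described in the steps above; hdrVideoURL, the polling setup, and all names are assumptions, and looping and error handling are omitted:

import AVFoundation
import QuartzCore

let hdrVideoURL = URL(fileURLWithPath: "/path/to/hdr-video.mov") // placeholder asset

// Step 1: player with 10-bit BiPlanar video-range output.
let attrs: [String: Any] = [
    kCVPixelBufferPixelFormatTypeKey as String:
        kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange
]
let output = AVPlayerItemVideoOutput(pixelBufferAttributes: attrs)
let item = AVPlayerItem(url: hdrVideoURL)
item.add(output)
let player = AVPlayer(playerItem: item)
player.play()

let displayLayer = AVSampleBufferDisplayLayer()

// Steps 2-3: poll ~30 times per second and enqueue the latest frame.
let timer = Timer.scheduledTimer(withTimeInterval: 1.0 / 30.0, repeats: true) { _ in
    let time = output.itemTime(forHostTime: CACurrentMediaTime())
    guard output.hasNewPixelBuffer(forItemTime: time),
          let pixelBuffer = output.copyPixelBuffer(forItemTime: time,
                                                   itemTimeForDisplay: nil) else { return }
    var formatDesc: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreateForImageBuffer(allocator: kCFAllocatorDefault,
                                                 imageBuffer: pixelBuffer,
                                                 formatDescriptionOut: &formatDesc)
    guard let desc = formatDesc else { return }
    var timing = CMSampleTimingInfo(duration: .invalid,
                                    presentationTimeStamp: time,
                                    decodeTimeStamp: .invalid)
    var sampleBuffer: CMSampleBuffer?
    CMSampleBufferCreateReadyWithImageBuffer(allocator: kCFAllocatorDefault,
                                             imageBuffer: pixelBuffer,
                                             formatDescription: desc,
                                             sampleTiming: &timing,
                                             sampleBufferOut: &sampleBuffer)
    if let sb = sampleBuffer { displayLayer.enqueue(sb) }
}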
4 replies · 0 boosts · 507 views · 1w ago
VisionOS 26 - threadsPerThreadgroup limit causing crash on device (but not in simulator)
Hi all, I'm running into an issue with an app that previously worked fine on device using visionOS 2.0. After updating to visionOS 26, the same code runs fine in the simulator but crashes on the device with the following error:

-[MTLDebugComputeCommandEncoder _validateThreadsPerThreadgroup:]:1330: failed assertion `(threadsPerThreadgroup.width(32) * threadsPerThreadgroup.height(32) * threadsPerThreadgroup.depth(1))(1024) must be <= 832. (kernel threadgroup size limit)`

Is there any documented way to check or increase the allowed threadsPerThreadgroup size on Apple Vision Pro? Or any recommended workaround for this regression? Thanks in advance!
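One common workaround is to query the pipeline state at runtime and derive the threadgroup size from it instead of hard-coding 32x32. A minimal sketch, where pipelineState, computeEncoder, and the grid dimensions are placeholders:

import Metal

let maxThreads = pipelineState.maxTotalThreadsPerThreadgroup // 832 in the assertion above
let width = pipelineState.threadExecutionWidth
let height = maxThreads / width
let threadsPerThreadgroup = MTLSize(width: width, height: height, depth: 1)
computeEncoder.dispatchThreads(MTLSize(width: gridWidth, height: gridHeight, depth: 1),
                               threadsPerThreadgroup: threadsPerThreadgroup)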
3 replies · 0 boosts · 94 views · 2w ago
App crashes after enabling Show HUD Graphics
hi everyone, we found a Metal-related crash. Our app uses Metal APIs, and during performance testing we enabled Settings > Developer > Show HUD Graphics. After launching the app, the HUD displayed normally, but the app crashed soon afterwards. The main log information is as follows:

Incident Identifier: 1F093635-2DB8-4B29-9DA5-488A6609277B
CrashReporter Key: 233e54398e2a0266d95265cfb96c5a89eb3403fd
Hardware Model: iPhone14,3
Process: waimai [16584]
Path: /private/var/containers/Bundle/Application/CCCFC0AE-EFB8-4BD8-B674-ED089B776221/waimai.app/waimai
Identifier:
Version: 61488 (8.53.0)
Code Type: ARM-64
Parent Process: ? [1]
Date/Time: 2025-06-12 14:41:45.296 +0800
OS Version: iOS 18.0 (22A3354)
Report Version: 104
Monitor Type: Mach Exception
Exception Type: EXC_BAD_ACCESS (SIGBUS)
Exception Codes: KERN_PROTECTION_FAILURE at 0x000000014fffae00
Crashed Thread: 57

Thread 57 Crashed:
0 libMTLHud.dylib esfm_GenerateTriangesForString + 408
1 libMTLHud.dylib esfm_GenerateTriangesForString + 92
2 libMTLHud.dylib Renderer::DrawText(char const*, int, unsigned int) + 204
3 libMTLHud.dylib Overlay::onPresent(id<CAMetalDrawable>) + 1656
4 libMTLHud.dylib CAMetalDrawable_present(void (*)(), objc_object*, objc_selector*) + 72
5 libMTLHud.dylib invocation function for block in void replaceMethod<void>(objc_class*, objc_selector*, void (*)(void (*)(), objc_object*, objc_selector*)) + 56
6 Metal __45-[_MTLCommandBuffer presentDrawable:options:]_block_invoke + 104
7 Metal MTLDispatchListApply + 52
8 Metal -[_MTLCommandBuffer didScheduleWithStartTime:endTime:error:] + 312
9 IOGPU IOGPUNotificationQueueDispatchAvailableCompletionNotifications + 136
10 IOGPU __IOGPUNotificationQueueSetDispatchQueue_block_invoke + 64
11 libdispatch.dylib _dispatch_client_callout4 + 20
12 libdispatch.dylib _dispatch_mach_msg_invoke + 464
13 libdispatch.dylib _dispatch_lane_serial_drain + 368
14 libdispatch.dylib _dispatch_mach_invoke + 456
15 libdispatch.dylib _dispatch_lane_serial_drain + 368
16 libdispatch.dylib _dispatch_lane_invoke + 432
17 libdispatch.dylib _dispatch_lane_serial_drain + 368
18 libdispatch.dylib _dispatch_lane_invoke + 380
19 libdispatch.dylib _dispatch_root_queue_drain_deferred_wlh + 288
20 libdispatch.dylib _dispatch_workloop_worker_thread + 540
21 libsystem_pthread.dylib _pthread_wqthread + 288

We tested several different device models, and only the iPhone 13 Pro Max crashes.
Q1: Why does this crash happen?
Q2: With the same logic, why does the crash only occur on the iPhone 13 Pro Max?
Looking forward to your answers.
1 reply · 0 boosts · 58 views · 1w ago
'tangents' was deprecated in visionOS 2.0: Use cp_drawable_compute_projection instead
I'm using a class with tangents to render on RealityKit for visionOS, but on visionOS 26 it causes a crash in the app, and there is no documentation on how to implement cp_drawable_compute_projection. I have tried a few options without success. Could you help me implement it? The relevant part of the code is:

return drawable.views.map { view in
    let userViewpointMatrix = (simdDeviceAnchor * view.transform).inverse
    let projectionMatrix = ProjectiveTransform3D(
        leftTangent: Double(view.tangents[0]),
        rightTangent: Double(view.tangents[1]),
        topTangent: Double(view.tangents[2]),
        bottomTangent: Double(view.tangents[3]),
        nearZ: Double(drawable.depthRange.y),
        farZ: Double(drawable.depthRange.x),
        reverseZ: true
    )
    let screenSize = SIMD2(x: Int(view.textureMap.viewport.width),
                           y: Int(view.textureMap.viewport.height))
    return ModelRendererViewportDescriptor(viewport: view.textureMap.viewport,
                                           projectionMatrix: .init(projectionMatrix),
                                           viewMatrix: userViewpointMatrix * translationMatrix * rotationMatrix * scalingMatrix * commonUpCalibration,
                                           screenSize: screenSize)
}
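In case it helps anyone answering: on the Swift side, cp_drawable_compute_projection appears to be surfaced as a computeProjection method on the drawable. Treat the line below as an unverified sketch, since the exact signature may differ between SDK seeds:

// Unverified sketch: would replace the ProjectiveTransform3D(tangents...) construction above.
let projectionMatrix = drawable.computeProjection(viewIndex: viewIndex)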
1 reply · 0 boosts · 64 views · 3w ago
MapKit with MKTileOverlay Crashes After a Time
I'm building a weather map that shows the rain on the map. I'm able to retrieve PNG images that are used as tiles to put onto the map, and I reload all the tiles on the map for each timeframe (a tile set for every 10 minutes). I'm able to get the map loaded, place the tiles, and reload the data for each time slot. I preload all the PNG data needed for the tiles and keep the NSData for them in memory so that they are quick to load and show on the map. I have timers set to reload the overlay with the next set of tiles for each time slot, giving the view of a moving precipitation map over time (just like you'd see on any weather map). I have 12 time slots (timestamps), one every 10 minutes for the past 2 hours, shown in sequence and then repeated. Over time I get a crash with a Thread 1: signal SIGABRT and this error:

Failed to acquire drawable, rendering to temporary texture
validateRenderPassDescriptor:782: failed assertion `RenderPass Descriptor Validation MTLRenderPassAttachmentDescriptor MTLStoreActionMultisampleResolve store action at attachment 0 requires resolve texture'
validateRenderPassDescriptor:782: failed assertion `RenderPass Descriptor Validation MTLRenderPassAttachmentDescriptor MTLStoreActionMultisampleResolve store action at attachment 0 requires resolve texture'

Through some searching I've discovered that this seems to be console output from Metal. I assume Metal is used by MapKit to render the overlay tiles? I'm using the same custom overlay, where I set the timestamp on it and then tell it to reload. I also reuse the same MKOverlayRenderer, as shown here:

- (MKOverlayRenderer*)mapView:(MKMapView*)mapView rendererForOverlay:(id<MKOverlay>)overlay {
    if ([overlay isKindOfClass:[MKTileOverlay class]]) {
        if (!self.rainRenderer) {
            self.rainRenderer = [[MKTileOverlayRenderer alloc] initWithTileOverlay:overlay];
            self.rainRenderer.alpha = 0.5;
        }
        return self.rainRenderer;
    }
    return nil;
}

And here's the function that reloads the overlay:

- (void)updateRainFrame {
    self.currentFrameIndex = (self.currentFrameIndex + 1) % self.timestamps.count;
    if ((self.currentFrameIndex >= 0) && (self.timestamps.count > self.currentFrameIndex)) {
        NSLog(@"self.currentFrameIndex = %lu", self.currentFrameIndex);
        NSString *timestamp = self.timestamps[self.currentFrameIndex];
        [self.overlay setTimestamp:timestamp];
        [self.rainRenderer reloadData];
    }
}

The time it takes to crash seems arbitrary: sometimes less than a minute, but usually several minutes, 10 or 20 minutes or more. It feels like some sort of race condition. Perhaps ARC is not able to release the tile images quickly enough for each overlay reload? That's a wild guess, but I think it's something deeper in Metal, as I'd expect to see other errors if memory availability were the problem. Some of my searches point to MSAA needing to be turned off in Metal to resolve this; however, I have no idea how I would do that through MapKit. Any suggestions? Let me know if there is a way to capture more from the crash to give more insight.
2 replies · 0 boosts · 60 views · Jun ’25
[CoreImage] OS 26 breaks Metal kernels for CIFilters
I maintain a couple of Core Image libraries that provide custom Metal kernel backed CIFilters. In iOS/iPadOS 26, the CIColorKernel.apply() method invoked in the CIFilter subclass fails to add the coreimage::destination parameter to the Metal function call:

-[CIColorKernel applyWithExtent:arguments:options:] argument count mismatch for kernel 'FractalNoise3D', expected 13 but saw 12.

I've compiled the code with Xcode 26 and deployed to iOS 18 devices without any breakage, so this is definitely an iOS problem, not an Xcode problem.

Library here: https://github.com/JoshuaSullivan/SimplexNoiseFilter
Feedback ID: FB17874311
1 reply · 0 boosts · 75 views · 4w ago
HDR video & screen brightness
When I play an HDR video in the iPhone Photos app, I can see the HDR effect clearly. But if an HDR video plays continuously for more than 30-40 minutes, the HDR effect disappears and the brightness is compressed to the SDR range. This issue appears on any iPhone; depending on the phone, it may take 20-30 minutes, 30-40 minutes, or even just a few minutes, for example on an iPhone 12 mini. Similarly, if I use AVPlayer to play and preview an HDR video for more than 30-40 minutes, the HDR effect disappears and the screen brightness dims, and currentEDRHeadroom gradually decreases to 1. Note: test with an HDR video longer than 1 hour; if the video is short, loop it. My question is how to avoid losing the HDR effect after 30-40 minutes when I use CAMetalLayer to render HDR video.
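For context, a minimal sketch of the CAMetalLayer EDR configuration plus the headroom monitoring mentioned above (iOS 16+ APIs; the chosen pixel format and color space are assumptions, not a confirmed fix for the dimming):

import UIKit
import QuartzCore

// metalLayer: an existing CAMetalLayer driving the HDR rendering.
metalLayer.wantsExtendedDynamicRangeContent = true
metalLayer.pixelFormat = .rgba16Float // assumed EDR-capable format
metalLayer.colorspace = CGColorSpace(name: CGColorSpace.extendedLinearDisplayP3)

// Observe how much headroom the system currently grants; the report above
// describes currentEDRHeadroom decaying toward 1.0 during long playback.
print("EDR headroom:", UIScreen.main.currentEDRHeadroom,
      "of", UIScreen.main.potentialEDRHeadroom)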
1 reply · 0 boosts · 71 views · 1w ago
Unable to compile Core Image filter on Xcode 26 due to missing Metal toolchain
I have a Core Image filter in my app that uses Metal. I cannot compile it because it complains that the executable tool metal is not available, even though I have installed it in Xcode. The "Components" section of Xcode Settings shows it as downloaded, and if I run the suggested command, it also shows as installed. Any advice?

Xcode version: Version 26.0 beta (17A5241e)

Build output (showing all errors only):

Build target Lessons of project StudyJapanese with configuration Light
RuleScriptExecution /Users/chris/Library/Developer/Xcode/DerivedData/StudyJapanese-glbneyedpsgxhscqueifpekwaofk/Build/Intermediates.noindex/StudyJapanese.build/Light-iphonesimulator/Lessons.build/DerivedSources/OtsuThresholdKernel.ci.air /Users/chris/Code/SerpentiSei/Shared/iOS/CoreImage/OtsuThresholdKernel.ci.metal normal undefined_arch (in target 'Lessons' from project 'StudyJapanese')
cd /Users/chris/Code/SerpentiSei/StudyJapanese
/bin/sh -c xcrun\ metal\ -w\ -c\ -fcikernel\ \"\$\{INPUT_FILE_PATH\}\"\ -o\ \"\$\{SCRIPT_OUTPUT_FILE_0\}\"' '
error: error: cannot execute tool 'metal' due to missing Metal Toolchain; use: xcodebuild -downloadComponent MetalToolchain
/Users/chris/Code/SerpentiSei/StudyJapanese/error:1:1: cannot execute tool 'metal' due to missing Metal Toolchain; use: xcodebuild -downloadComponent MetalToolchain
Build failed 6/9/25, 8:31 PM 27.1 seconds

Result of xcodebuild -downloadComponent MetalToolchain (after switching to Xcode-beta.app with xcode-select):

xcodebuild -downloadComponent MetalToolchain
Beginning asset download...
Downloaded asset to: /System/Library/AssetsV2/com_apple_MobileAsset_MetalToolchain/4d77809b60771042e514cfcf39662c6d1c195f7d.asset/AssetData/Restore/022-19457-035.dmg
Done downloading: Metal Toolchain (17A5241c).

Result of "Copy Information" (from the Xcode Components screenshot):

Metal Toolchain 26.0 [com.apple.MobileAsset.MetalToolchain: 17.0 (17A5241c)] (Installed)
16 replies · 0 boosts · 579 views · 2d ago