Subject: Handling Z-Up Blender USDZ Models in RealityKit (visionOS) for Transform Updates

Hello everyone,

I'm working on a visionOS application using RealityKit and am encountering a common coordinate system challenge when integrating 3D models created in Blender.

My goal is to display and dynamically update the Transform (position, rotation, scale) of models created in Blender within RealityKit.

The issue arises because Blender's default coordinate system is Z-up, and when exporting to USD/USDZ I don't have a reliable "Y-up" export option that correctly reorients the model and its transform data for RealityKit's Y-up convention. This means I'm essentially exporting models whose "up" direction is along the Z axis.

When I load these Z-up exported models into RealityKit, they are often oriented incorrectly. To then programmatically update their Transform (e.g., move them, rotate them based on game logic, or apply physics), I need to ensure that the Transform values I set align with RealityKit's Y-up system, even though the original model data was authored in a Z-up context.

My questions are:

What is the recommended transformation process (e.g., using simd_quatf or simd_float4x4) to convert a Transform that was conceptually defined in a Z-up coordinate system to RealityKit's Y-up coordinate system? Specifically, when I have a Transform (or its translation, rotation, scale components) from a Z-up context, how should I apply this to a RealityKit Entity so it appears and behaves correctly in a Y-up world?

Are there any existing convenience APIs or helper functions within RealityKit, simd, or other Apple frameworks that simplify this Z-up to Y-up Transform conversion process? Or is a manual application of a transformation quaternion (e.g., simd_quatf(angle: -.pi / 2, axis: [1, 0, 0])) the standard approach?
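To illustrate what I mean by a manual conversion, here is a rough sketch of the approach I have in mind (I'm not at all sure this is correct, hence the question): treat the -90 degree X rotation as a change of basis from Z-up to Y-up and apply it to each Transform component.

import RealityKit

// Rough sketch only: treat the -90° X rotation as a change of basis
// from Blender's Z-up to RealityKit's Y-up.
let zUpToYUp = simd_quatf(angle: -.pi / 2, axis: SIMD3<Float>(1, 0, 0))

/// Converts a Transform authored in a Z-up context into RealityKit's Y-up convention.
func convertToYUp(_ zUp: Transform) -> Transform {
    var yUp = zUp
    // Rotate the translation vector into the Y-up basis.
    yUp.translation = zUpToYUp.act(zUp.translation)
    // Conjugate the rotation so it is expressed in the Y-up basis.
    yUp.rotation = zUpToYUp * zUp.rotation * zUpToYUp.inverse
    // Non-uniform scale would also need its Y and Z components swapped;
    // I'm ignoring that here for simplicity.
    return yUp
}

// Usage (blenderTransform comes from my Z-up game data):
// entity.transform = convertToYUp(blenderTransform)

Is something along these lines the right idea, or is there a better-supported path?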

Any guidance, code examples, or best practices from those who have faced similar challenges would be greatly appreciated!

Thank you.

Hello @u_kudo! You've encountered a common issue!

Blender and some other DCCs use a different up axis by default (Blender uses Z for "up"; RealityKit and Reality Composer Pro (RCP) use Y). These DCCs usually have export options that override this behavior, and that is an option you can try. In Blender, the option is in the USD export dialog under "Convert Orientation". When a USD is exported this way from Blender and then imported into RCP, it will appear rotated as you'd expect...

However! If you inspect the Transform component in RCP, you'll notice that all RCP has done is apply a -90 degree rotation on the X axis to the imported model. When Blender exports a USD with the converted orientation, all it really does is insert metadata into the USD file specifying the up axis; the primitives in your model, such as the individual vertices, are unmodified.
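You can also apply that same fix programmatically when loading the asset instead of relying on RCP. A minimal sketch, assuming a Blender-authored USDZ in the app bundle (the asset name below is hypothetical):

import RealityKit

// Minimal sketch: the same fix RCP applies on import, done at load time.
// "BlenderAsset" is a hypothetical asset name.
func loadBlenderAsset() async throws -> Entity {
    let model = try await Entity(named: "BlenderAsset")
    // -90° about X so the Z-up asset stands upright in RealityKit's Y-up world.
    model.orientation = simd_quatf(angle: -.pi / 2, axis: SIMD3<Float>(1, 0, 0))
    return model
}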

In my experience, it is just as easy to apply a -90 degree rotation manually in RCP as it is to do it in Blender (via the Convert Orientation option in the export dialog), but this is heavily dependent on your workflow and asset pipeline. Maybe it makes more sense for artists to apply the rotation in Blender than for a designer to apply it in RCP. The end result is exactly the same; it is more a matter of organizational efficiency. I would lean toward applying the rotation in RCP, rather than placing the burden on the person not working in RCP.

Short of rotating everything and applying all transforms in Blender before export, there is no solution that will export a USD from a DCC with a different up axis (like Blender) so that RealityKit and RCP ingest the asset with the up axis you expect. You will always need to apply some rotation to your source asset (usually -90 degrees on the X axis) to fix the issue. I understand this can lead to some gotchas when you forget about this rotation. For example, when using generateStaticMesh(from:) with a mesh created in Blender, the generated collision shape will also need to be offset by the -90 degree rotation. For this reason, I recommend detailing your use case and submitting feedback about the current state of the API using Feedback Assistant.
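To make that gotcha concrete, here is a sketch of one way to compensate, assuming the corrective rotation is not already on the entity that owns the collider (for example, it lives on a parent), and that meshResource comes from the Blender-authored, Z-up model:

import RealityKit

// Sketch only: generate a collision shape from a Z-up MeshResource and offset it
// by the same -90° rotation used for the visuals, so the collider lines up with
// what you see.
func addStaticCollision(to entity: Entity, from meshResource: MeshResource) async throws {
    let zUpFix = simd_quatf(angle: -.pi / 2, axis: SIMD3<Float>(1, 0, 0))
    let shape = try await ShapeResource.generateStaticMesh(from: meshResource)
    let rotated = shape.offsetBy(rotation: zUpFix, translation: .zero)
    entity.components.set(CollisionComponent(shapes: [rotated]))
}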

I recommend reviewing the section "Structure your project for development" in the Petite Asteroids sample code article. For that project, we organized our assets into separate Source Assets and Game Assets folders. In Source Assets, we placed unmodified USDs, textures, etc. In Game Assets, we placed USDAs that bring together the individual pieces with their materials and game logic, and you'll notice we apply the -90 degree rotation to the meshes in these assets. We then assemble the game assets into our final scenes, so the developer organizing those scenes never has to worry about this awkward rotation; it's all handled in the game assets layer.
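You can mirror the same layering idea at runtime (a sketch, not taken from the sample): park the corrective rotation on a child entity once, and drive all game-logic transforms on the parent in plain Y-up space. The asset name below is hypothetical:

import RealityKit

// Sketch only: the child carries the Z-up fix once, and all game logic
// drives the parent in Y-up space. "BlenderAsset" is a hypothetical name.
func makeGameEntity() async throws -> Entity {
    let blenderModel = try await Entity(named: "BlenderAsset")
    blenderModel.orientation = simd_quatf(angle: -.pi / 2, axis: SIMD3<Float>(1, 0, 0))

    let gameEntity = Entity()
    gameEntity.addChild(blenderModel)

    // Later transform updates are plain Y-up and never need to know about the fix:
    gameEntity.position = SIMD3<Float>(0, 1, -2)
    gameEntity.orientation = simd_quatf(angle: .pi / 4, axis: SIMD3<Float>(0, 1, 0))
    return gameEntity
}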

There are many ways to solve this problem, so I wouldn't claim this is the "best" solution, but what I've described here worked for us during the development of Petite Asteroids. Thank you for your question!
