visionOS + Unity PolySpatial: Is 15,970 MeshFilters the True Upper Limit for Industrial Scenes?

Breaking Through PolySpatial's ~8k Object Limit – Seeking Alternative Approaches for Large-Scale Digital Twins

Confirmed: PolySpatial Mirroring Doubles MeshFilter Count – Hard Limit at ~8k Active Objects (15.9k Total)

Project Context & Research Goals

I’m developing an industrial digital twin application for Apple Vision Pro using Unity’s PolySpatial framework (RealityKit rendering in Unbounded volume mode). The scene contains complex factory environments with:

  • Production line equipment (many fragmented mesh objects that need to be merged)
  • Dynamic product racks (state-switchable assets)
  • Animated worker avatars

To optimize performance, I’m systematically testing visionOS’s rendering capacity limits. Through controlled stress tests, I’ve identified a critical threshold:

Key Finding

When the total MeshFilter count reaches 15,970 (system baseline + 7,985 user-created objects × 2 due to PolySpatial cloning), the application crashes consistently. This suggests:

  • PolySpatial’s mirroring mechanism effectively doubles GameObject overhead
  • An apparent hard limit exists at around ~8k active mesh objects in practice
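
For anyone reproducing the stress test, here is a minimal sketch of the kind of spawner/counter involved. The prefab, batch size, and cap are placeholder values, not validated thresholds, and the ×2 only shows up in the count where PolySpatial’s mirrored backing objects exist as Unity GameObjects (e.g. in the Editor preview):

    using UnityEngine;

    // Minimal stress-test sketch (illustrative only): spawns batches of a test
    // prefab and logs the live MeshFilter count each frame. batchSize and
    // maxObjects are placeholder values, not validated thresholds.
    public class MeshFilterStressTest : MonoBehaviour
    {
        public GameObject meshPrefab;   // any prefab with a single MeshFilter
        public int batchSize = 250;
        public int maxObjects = 8000;

        int spawned;

        void Update()
        {
            if (spawned >= maxObjects) return;

            for (int i = 0; i < batchSize && spawned < maxObjects; i++, spawned++)
            {
                Instantiate(meshPrefab, Random.insideUnitSphere * 10f, Quaternion.identity);
            }

            // Counts every active MeshFilter currently loaded, including any
            // PolySpatial backing objects that exist as GameObjects here.
            int total = Object.FindObjectsByType<MeshFilter>(FindObjectsSortMode.None).Length;
            Debug.Log($"User objects: {spawned}, total MeshFilters: {total}");
        }
    }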

Objectives for This Discussion

  1. Verify if others have encountered similar limits with PolySpatial/RealityKit

  2. Understand whether this is a:

  • Memory constraint (per-app allocation)
  • Render pipeline limit (Metal draw calls)
  • Unity-specific PolySpatial behavior
  3. Explore optimization strategies beyond brute-force object reduction

Why This Matters

Industrial metaverse applications require rendering thousands of interactive objects. Confirming these limits will help our team:

  • Design safer content guidelines
  • Prioritize GPU instancing/LOD investments
  • Potentially contribute back to PolySpatial’s optimization

I’d appreciate insights from engineers who’ve:

  • Pushed similar large-scale scenes in visionOS
  • Worked around PolySpatial’s cloning overhead
  • Discovered alternative capacity limits (vertices/draw calls)

SceneComplexityValidationSystem Crash on xrOS - Testing Mesh Limits in Industrial Digital Twin

Hi everyone,

I’m encountering a crash in my Unity app when running on Vision Pro (xrOS 2.5), specifically when adding a large number of mesh objects to the scene. Here’s what I’ve gathered from the crash log—would appreciate any insights or suggestions!

Crash Overview

  • Thread: Main thread (com.apple.main-thread)
  • Exception: EXC_CRASH (SIGABRT) (app terminated by system)
  • Trigger:
    re::ecs2::SceneComplexityValidationSystem::update(re::ecs2::Scene*, re::ecs2::System::UpdateContext) const
    
    This suggests the crash occurred during system-level scene complexity validation, likely because a system-enforced complexity threshold was exceeded.

Key Details

  • Device: RealityDevice14,1 (Vision Pro)
  • OS: xrOS 2.5
  • Unity Version: 2.3.1 (Beta)
  • Memory Usage:
    • Virtual: ~3.7GB (857.8MB resident, per vmSummary).

Possible Causes

  1. Scene Complexity Limits:

    • The SceneComplexityValidationSystem (part of CoreRE.framework) seems to enforce thresholds for:
      • Mesh/polygon counts.
      • Rendering load (e.g., draw calls, lighting).
      • Memory/GPU resource usage.
    • Adding too many meshes may exceed these limits.
  2. Beta-Specific Issues:

    • Both Unity 2.3.1 (Beta) and xrOS 2.5 might have unoptimized resource checks.
  3. Memory Pressure:

    • The app’s high memory footprint (~3.7GB virtual) could trigger OOM (out-of-memory) safeguards.

Debugging Steps Taken

  • Optimizations Attempted:
    • Reduced mesh complexity (LODs, combining meshes; a sketch of the combining step is after this list).
    • Simplified materials/shaders.
    • Tested with fewer objects → crash disappears.
  • Unity Profiler:
    • No obvious bottlenecks in Editor (non-Vision Pro), suggesting the issue is xrOS-specific.
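
For reference on the “combining meshes” step above, the idea is to merge static fragments that share a single material into one mesh so only one MeshFilter has to be mirrored. A minimal sketch, assuming static fragments, one shared material, and a MeshRenderer already on the combiner object (the helper name is mine, not a Unity or PolySpatial API):

    using UnityEngine;
    using UnityEngine.Rendering;

    // Minimal mesh-combining sketch (hypothetical helper): merges all child
    // MeshFilters into one mesh on this GameObject so PolySpatial mirrors a
    // single MeshFilter instead of dozens. Assumes the fragments are static
    // and share one material, and that a MeshRenderer with that material is
    // already present on this object.
    public class StaticMeshCombiner : MonoBehaviour
    {
        void Start()
        {
            MeshFilter[] filters = GetComponentsInChildren<MeshFilter>();
            var combine = new CombineInstance[filters.Length];

            for (int i = 0; i < filters.Length; i++)
            {
                combine[i].mesh = filters[i].sharedMesh;
                // Bake each fragment's transform relative to this combiner.
                combine[i].transform = transform.worldToLocalMatrix *
                                       filters[i].transform.localToWorldMatrix;
                filters[i].gameObject.SetActive(false); // retire the fragment
            }

            var combined = new Mesh { indexFormat = IndexFormat.UInt32 }; // >65k vertices
            combined.CombineMeshes(combine, mergeSubMeshes: true, useMatrices: true);
            gameObject.AddComponent<MeshFilter>().sharedMesh = combined;
        }
    }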

Questions for the Community

  1. Has anyone else hit SceneComplexityValidationSystem crashes on Vision Pro? Are there documented limits?
  2. Are there known workarounds (e.g., disabling certain features in Unity’s XR Plug-in)?
  3. Could this be a bug in Unity’s Vision Pro support or xrOS itself?

Next Steps

  • Will file feedback with Apple (xrOS) and Unity (Beta version).
  • Exploring dynamic loading (additive scenes) to split scene complexity; see the sketch below.
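
The dynamic-loading idea above is plain additive scene streaming: keep each factory area in its own scene and swap areas in and out so only part of the plant is resident at once. A minimal sketch, with hypothetical scene and class names:

    using System.Collections;
    using UnityEngine;
    using UnityEngine.SceneManagement;

    // Minimal additive-streaming sketch (scene names are hypothetical): swaps one
    // factory area for another so only a slice of the plant's meshes is loaded
    // at any time, keeping the total MeshFilter count well under the threshold.
    public class AreaStreamer : MonoBehaviour
    {
        public IEnumerator SwapArea(string areaToUnload, string areaToLoad)
        {
            if (!string.IsNullOrEmpty(areaToUnload) &&
                SceneManager.GetSceneByName(areaToUnload).isLoaded)
            {
                yield return SceneManager.UnloadSceneAsync(areaToUnload);
            }
            yield return SceneManager.LoadSceneAsync(areaToLoad, LoadSceneMode.Additive);
        }
    }

    // Usage, e.g. from a zone trigger:
    // StartCoroutine(streamer.SwapArea("Area_Assembly", "Area_Packaging"));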

Log Snippet:

[re::ecs2::SceneComplexityValidationSystem] Abort: Scene complexity threshold exceeded.

Thanks in advance for any advice!

Hello @Factory-DTS, thanks for taking the time to put this together!

You mentioned you'd like to contribute back to the development of PolySpatial. I highly recommend bringing that discussion to Unity's developer forums, since PolySpatial is a technology owned and developed by Unity, not Apple. That said, PolySpatial does use RealityKit to render content, and there is some relevant context I can add here.

Yes, you are encountering an upper limit on the number of meshes allowed in a RealityKit scene. However, I strongly advise against using this limit (~8,000 meshes) to gauge the performance of your app. This number is subject to change, and our official guidance is to target orders of magnitude fewer total meshes in a scene at one time. See Reducing the rendering cost of RealityKit content on visionOS for more information.

I should mention that in visionOS 26, a new component is being added to RealityKit: MeshInstancesComponent. This component leverages GPU instancing to efficiently render the same mesh+material pair many times in your scene. This would certainly help your use case of visualizing a complex industrial scene. I recommend reaching out to Unity about their plans to support this API in PolySpatial!

Thank you!
