RealityKit


Simulate and render 3D content for use in your augmented reality apps using RealityKit.


Posts under the RealityKit subtopic


RealityKit and USDZ: Winding Order Issue with Negatively Scaled Meshes
Hi all, I've encountered a potential issue with how the winding order of geometry is handled when transformations involve negative scaling. I created a simple test asset, a single triangle, to demonstrate this. The triangle's vertices are defined in a counter-clockwise ("right-handed") winding order, and its transform has a negative scale on the X-axis. According to the OpenUSD specification, this negative determinant in the transformation matrix should effectively reverse the winding order of the geometry:

"However, any given gprim's local-to-world transformation can flip its effective orientation, when it contains an odd number of negative scales. This condition can be reliably detected using the (Jacobian) determinant of the local-to-world transform: if the determinant is less than zero, then the gprim's orientation has been flipped, and therefore one must apply the opposite handedness rule when computing its surface normals (or just flip the computed normals) for the purposes of hidden surface detection and lighting calculations."

When I view the asset in tools like Blender or Preview on macOS, it behaves as expected: the triangle's effective orientation is flipped to CW. However, when the same asset is viewed in Reality Composer Pro or with QuickLook on iOS, its effective orientation remains CCW. In other words, the triangle faces the opposite direction.

My questions for the community and Apple:
1. Is this behavior in RealityKit a known issue?
2. If so, is there official guidance for DCC tools on how to export USDZ assets so they appear correctly in the Apple ecosystem?

Any insights or recommendations would be greatly appreciated.
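For anyone who wants to check an asset programmatically, here is a minimal Swift sketch of the detection rule the spec describes; the function name is mine, for illustration:

```swift
import RealityKit
import simd

/// Returns true when an entity's local-to-world transform contains an
/// odd number of negative scales, i.e. its effective orientation is
/// flipped per the OpenUSD rule quoted above.
func orientationIsFlipped(_ entity: Entity) -> Bool {
    let m = entity.transformMatrix(relativeTo: nil)
    // The upper-left 3x3 block carries rotation and scale; the sign of
    // its determinant says whether handedness was flipped.
    let linear = float3x3(
        SIMD3(m.columns.0.x, m.columns.0.y, m.columns.0.z),
        SIMD3(m.columns.1.x, m.columns.1.y, m.columns.1.z),
        SIMD3(m.columns.2.x, m.columns.2.y, m.columns.2.z)
    )
    return linear.determinant < 0
}
```

If RealityKit does ignore this rule at render time, a check like this at least makes it easy to confirm which assets are affected.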
Replies: 5 · Boosts: 0 · Views: 921 · Activity: Nov ’25
visionOS Spatial Accessory inputs help
The “Explore spatial accessory input on visionOS” presentation from WWDC25 interests me. I bought both the Logitech MUSE stylus and the PS VR2 Sense controllers to try out with the sculpting app presented by the author, engineer Amanda Han. Unfortunately the app itself was not included. Could the app be made available for download, along with the Xcode project? I appreciate any assistance the author and your team could provide. Thank you.
Replies: 0 · Boosts: 0 · Views: 153 · Activity: Nov ’25
Skybox for iOS/SwiftUI not working
I have tried every combination of suggestions to get a skybox to appear using SwiftUI and RealityKit on iOS, in a non-immersive environment. Does anyone have code that works to display a skybox? When I use a do/catch block I get an "environmentResource not found" error. I have checked the syntax, ensured the folder is referenced by the target, used the same name for the folder as the file, and the file is a .hdr (I assume this is supported). I have also moved the file and folder to the top level, with no change.
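For comparison, here is a minimal sketch of the shape this usually takes, assuming iOS 18's RealityView camera content API and an HDR image named "MySkybox" (a placeholder name) bundled with the app target:

```swift
import SwiftUI
import RealityKit

struct SkyboxView: View {
    var body: some View {
        // RealityView's make closure is async, so the resource can be
        // awaited directly here.
        RealityView { content in
            content.camera = .virtual  // non-AR; use .spatialTracking for AR
            do {
                // "MySkybox" is a placeholder asset name.
                let skybox = try await EnvironmentResource(named: "MySkybox")
                content.environment = .skybox(skybox)
            } catch {
                print("Skybox failed to load: \(error)")
            }
        }
    }
}
```

If the catch branch still reports the resource as not found, the asset name passed to EnvironmentResource is the first thing to compare against what the bundle actually contains.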
Replies: 2 · Boosts: 0 · Views: 481 · Activity: 3w
RealityKit, Attachments - not working
The simplest RealityView { content, attachments in ... } causes "Contextual closure expects 1 argument, but 2 were used in closure body". I have checked every example and I cannot understand why I get this error regardless of the content. Note: I have added Attachment(id: "test") to the attachments closure and get "Attachment not in scope". I imported both RealityKit and SwiftUI.
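For reference, a working attachments setup looks roughly like this on visionOS. As far as I know, the two-argument make closure and the Attachment type are visionOS-only, so compiling the same code for an iOS or macOS target produces exactly these two errors; it is worth checking which platform the target builds for:

```swift
import SwiftUI
import RealityKit

// A minimal visionOS sketch of RealityView with attachments.
struct AttachmentExample: View {
    var body: some View {
        RealityView { content, attachments in
            // Look up the entity generated for the attachment view.
            if let panel = attachments.entity(for: "test") {
                panel.position = [0, 1, -1]
                content.add(panel)
            }
        } attachments: {
            Attachment(id: "test") {
                Text("Hello from an attachment")
            }
        }
    }
}
```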
Replies: 2 · Boosts: 0 · Views: 217 · Activity: 3w
Walking an entity around an immersive space in visionOS like the window drag bar
I'm trying to understand how Apple handles dragging windows around in an immersive space. 3D gestures seem to be only half of the solution: they are great if you're standing still and want to move the window an exaggerated amount around the environment, but if you start walking while dragging, the amplified gesture sends the entity flying off into the distance. It seems Apple quickly transitions from one coordinate system to another depending on whether the user is physically moving: if you drag a window and start walking, the movement suddenly matches your speed, and when you stop moving, you can push and pull the windows around again like a superhero. Am I missing something obvious in how to copy this behavior? Hello World, which uses the 3D gesture, has the same problem: you can move the world around, but if you walk with it, it flies off. Are they tracking the head movement and, once it has moved more than a certain amount, using that offset instead? Is there anything out of the box that can do this before I try to hack my own solution?
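Purely as a speculative sketch, and not a claim about Apple's actual implementation: one approach is to track the device pose with ARKit during the drag and add the un-amplified head displacement to the drag target, so walking carries the entity at body speed while hand motion stays amplified. All type and method names below are illustrative:

```swift
import ARKit
import QuartzCore
import simd

// Illustrative helper: blends 1:1 body movement into a dragged entity's
// target position. Not Apple's implementation.
final class WalkAwareDrag {
    private let session = ARKitSession()
    private let worldTracking = WorldTrackingProvider()
    private var headAtDragStart: SIMD3<Float>?

    func start() async throws {
        try await session.run([worldTracking])
    }

    private func headPosition() -> SIMD3<Float>? {
        guard let device = worldTracking.queryDeviceAnchor(
            atTimestamp: CACurrentMediaTime()) else { return nil }
        let m = device.originFromAnchorTransform
        return SIMD3(m.columns.3.x, m.columns.3.y, m.columns.3.z)
    }

    func dragBegan() {
        headAtDragStart = headPosition()
    }

    // Call on every drag update with the amplified gesture translation.
    func target(entityStart: SIMD3<Float>,
                amplifiedTranslation: SIMD3<Float>) -> SIMD3<Float> {
        guard let start = headAtDragStart, let now = headPosition() else {
            return entityStart + amplifiedTranslation
        }
        // Adding the un-amplified body displacement since the drag began
        // keeps the entity alongside a walking user instead of flinging it.
        return entityStart + amplifiedTranslation + (now - start)
    }
}
```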
Replies: 2 · Boosts: 0 · Views: 1.1k · Activity: 3w
visionOS VideoMaterial on 3D Mesh
I'm trying to get a video material to work on an imported 3D asset, a USDC file. There's actually an example of this in a WWDC video from Apple: you can see it running on the flag of the airplane at 10:34. But there's no sample code for it, and I haven't found other examples on the internet. Does anybody know how to do this? https://developer.apple.com/documentation/realitykit/videomaterial
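I can't confirm this is exactly what the WWDC demo does, but the usual recipe is to load the model, find the mesh part that should show the video, and swap its material for a VideoMaterial driven by an AVPlayer. A minimal sketch, where "Airplane" and "Flag" are placeholder names for your own asset:

```swift
import RealityKit
import AVFoundation

// Minimal sketch: play a video on one part of an imported USDC model.
func applyVideoMaterial(videoURL: URL) async throws -> Entity {
    let root = try await Entity(named: "Airplane")

    let player = AVPlayer(url: videoURL)
    let videoMaterial = VideoMaterial(avPlayer: player)

    // Find the sub-entity whose mesh should show the video and replace
    // its materials with the video material.
    if let flag = root.findEntity(named: "Flag"),
       var model = flag.components[ModelComponent.self] {
        model.materials = [videoMaterial]
        flag.components.set(model)
    }

    player.play()
    return root
}
```

The video plays on whatever UV layout the mesh part already has, so the texture mapping comes from the asset itself.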
Replies: 2 · Boosts: 0 · Views: 941 · Activity: 3w
PhotogrammetrySession crashes after update from iOS 18 to iOS 26
After updating iPad/iPhone devices from iOS 18 to iOS 26, PhotogrammetrySession intermittently crashes during photogrammetry processing. The same workflow was stable on iOS 18 with no code changes to the app.

Environment:
- OS versions: works on iOS 18, crashes on iOS 26
- Device: iPad/iPhone (reproducible across devices)
- Source images: ~170-200 JPG files at 2160 x 3840 resolution

Reproduction: the crash occurs consistently on the second or third sequential run of the photogrammetry session with the same image set. The first run typically succeeds.

Crash details: Xcode shows an uncaught exception during image processing:

terminating due to uncaught exception of type std::bad_alloc: std::bad_alloc
VTPixelTransferSession 420f sid 269 (2160.00 x 3840.00) [0.00 0.00 2160 3840] rowbytes( 2160, 2160 ) Color( (null), 0x0, (null), (null), ITU_R_601_4 ) => 24 sid 19 (2160.00 x 3840.00) [0.00 0.00 2160 3840] rowbytes( 6528 ) Color( 0x0, (null), (null), (null) )

This appears to be a memory allocation failure in VTPixelTransferSession during color space conversion. Has anyone else experienced similar crashes with CorePhotogrammetry on iOS 26, or found workarounds?
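For context, a minimal invocation of the API in question looks like the sketch below. The URLs, detail level, and sample ordering are placeholders rather than the poster's actual settings; per the report, it is the second or third sequential run of something like this that crashes:

```swift
import RealityKit

// Minimal sketch of a single photogrammetry run.
func reconstruct(imagesFolder: URL, outputModel: URL) async throws {
    var configuration = PhotogrammetrySession.Configuration()
    configuration.sampleOrdering = .sequential

    let session = try PhotogrammetrySession(input: imagesFolder,
                                            configuration: configuration)
    try session.process(requests: [
        .modelFile(url: outputModel, detail: .reduced)
    ])

    // Drain the output stream until the session finishes or is cancelled.
    for try await output in session.outputs {
        switch output {
        case .processingComplete, .processingCancelled:
            return
        default:
            continue  // progress updates, warnings, etc.
        }
    }
}
```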
Replies: 0 · Boosts: 0 · Views: 236 · Activity: 2w
ParticleEmitterComponent Position Offset Issue After iOS 26.1 Update – Seeking Solutions & Workarounds
Problem Summary

After upgrading to iOS 26.1 and 26.2, I'm experiencing a particle positioning bug in RealityKit where ParticleEmitterComponent particles render at an incorrect offset relative to their parent entity. This behavior does not occur on iOS 18.6.2 or earlier versions, suggesting a regression introduced in the newer OS builds.

Environment Details
- Operating System: iOS 26.1 & iOS 26.2
- Framework: RealityKit
- Xcode Version: 16.2 (16C5032a)

Expected vs. Actual Behavior
- Expected: particles should render at the position of the entity to which the ParticleEmitterComponent is attached, matching the behavior on iOS 18.6.2 and earlier.
- Actual: particles appear away from their parent entity, creating a visual misalignment that breaks the intended AR experience.

Steps to Reproduce
1. Create or open an AR application with RealityKit that uses particle components.
2. Attach a ParticleEmitterComponent to an entity via a custom system.
3. Run the application on iOS 26.1 or iOS 26.2.
4. Observe that particles render at an offset position away from the entity.

Minimal Code Example

Here's the setup from my test case.

Custom component & system:

```swift
struct SparkleComponent4: Component {}

class SparkleSystem4: System {
    static let query = EntityQuery(where: .has(SparkleComponent4.self))

    required init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.scene.performQuery(Self.query) {
            // Only add once
            if entity.components.has(ParticleEmitterComponent.self) { continue }
            var newEmitter = ParticleEmitterComponent()
            newEmitter.mainEmitter.color = .constant(.single(.red))
            entity.components.set(newEmitter)
        }
    }
}
```

AR setup:

```swift
let material = SimpleMaterial(color: .gray, roughness: 0.15, isMetallic: true)
let model = Entity()
model.components.set(ModelComponent(mesh: boxMesh, materials: [material]))
model.components.set(SparkleComponent4())
model.position = [0, 0.05, 0]
model.name = "MyCube"

let anchor = AnchorEntity(.plane(.horizontal,
                                 classification: .any,
                                 minimumBounds: [0.2, 0.2]))
anchor.addChild(model)
arView.scene.addAnchor(anchor)
```

Questions for the Community
1. Has anyone else encountered this particle positioning issue after updating to iOS 26.1/26.2?
2. Are there known workarounds or configuration changes to ParticleEmitterComponent that restore correct positioning?
3. Is this a confirmed bug, or could there be a change in coordinate system handling or transform inheritance that I'm missing?

Additional Information

I've already submitted this issue via Feedback Assistant (FB21346746).
Replies: 0 · Boosts: 0 · Views: 273 · Activity: 2w
Particles rendered in the wrong order: back last instead of back first
I've tried out a ParticleEmitter in Reality Composer Pro to produce a burst of particles that don't move (i.e. speed close to zero). When viewing from different angles, it clearly looks like the particles are rendered in exactly the wrong order, that is, front first and back last. In other words, back particles obscure front particles. I would prefer it the correct way around. I've only tried this interactively in Reality Composer Pro, not programmatically, but I assume I would get the same result.

My Reality Composer Pro file (zipped): https://gert-rieger-edv.de/Posts/Post-1/RealityParticles.zip

To reproduce (see the screenshot): click on the ParticleEmitter object, then on its Play button, then select the Particles tab and click "Burst" a few times to get a few random particles.

Mac Studio 2025, Apple M4 Max
macOS 15.7.2 (24G325)
Reality Composer Pro Version 2.0 (494.60.2)
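For the programmatic half of the question, a rough sketch of the same test in code might look like this; the property values are my guesses at a near-stationary burst, not values taken from the zipped file:

```swift
import RealityKit

// Rough sketch: a burst of near-stationary particles, to check whether
// the draw order in code matches what Reality Composer Pro shows.
func makeBurstEmitter() -> Entity {
    var emitter = ParticleEmitterComponent()
    emitter.emitterShape = .sphere
    emitter.speed = 0.001              // particles that barely move
    emitter.mainEmitter.birthRate = 0  // no continuous emission
    emitter.mainEmitter.lifeSpan = 60  // keep particles around to inspect
    emitter.burstCount = 100

    let entity = Entity()
    entity.components.set(emitter)
    return entity
}

// Later, trigger the burst on demand:
// entity.components[ParticleEmitterComponent.self]?.burst()
```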
Replies: 0 · Boosts: 0 · Views: 432 · Activity: 1w