RealityKit

Simulate and render 3D content for use in your augmented reality apps using RealityKit.

Posts under RealityKit subtopic

USDZ File Crashes in QuickLook on iPad 9th Gen (iPadOS 18.3) – Urgent Help Needed
Hi everyone,

I'm experiencing a critical issue with USDZ files created in Reality Composer on an iPad 9th generation (iPadOS 18.3). The files work perfectly on iPads from the 10th generation onwards and on iPad Pros. However, on older devices such as the iPad 9th generation and older iPhones, QuickLook (file preview) crashes when opening them.

This is a major issue because these USDZ files are part of an exhibition where artworks are extended with AR elements via a web page. If some visitors cannot view the 3D content, it significantly impacts the experience.

What's puzzling is that two years ago we exported USDZ files from Reality Composer, made them available via a website, and they worked flawlessly on all devices, including older iPads and iPhones. Now, with the latest iPadOS, they consistently crash on older devices.

Has anyone encountered a similar issue? Are there known limitations with QuickLook on older devices, or is there a way to optimize the USDZ files to prevent crashes? Could this be related to changes in iPadOS or RealityKit? Any advice or workaround would be greatly appreciated! Thanks in advance!
1 reply · 0 boosts · 601 views · Feb ’25
Animating a RealityComposerPro shader's uniform input value
I'm trying to build a shader in Reality Composer Pro that updates from a start time. Initially I tried the following: the idea was that when StartTime was 0 the output would be 0, but then I would set StartTime from code, it would be compared with the current GPU time, and the difference would drive another part of the shader graph:

```swift
if let testEntity = root.findEntity(named: "Test"),
   var shaderGraphMaterial = testEntity.components[ModelComponent.self]?.materials.first as? ShaderGraphMaterial {
    let time = CFAbsoluteTimeGetCurrent()
    try! shaderGraphMaterial.setParameter(name: "StartTime", value: .float(Float(time)))
    testEntity.components[ModelComponent.self]?.materials[0] = shaderGraphMaterial
}
```

However, I haven't found a reference to the time the shader would be using. So now I'm trying to write an EntityAction to achieve the same effect: instead of comparing a start time to the GPU's time, I'm trying to animate one of the shader's uniform inputs. However, I'm not sure how to specify the bind target. Here's my attempt so far:

```swift
import RealityKit

struct ShaderAction: EntityAction {
    let startValue: Float
    let targetValue: Float

    var animatedValueType: (any AnimatableData.Type)? { Float.self }

    static func registerEntityAction() {
        ShaderAction.subscribe(to: .updated) { event in
            guard let animationState = event.animationState else { return }
            let value = simd_mix(event.action.startValue,
                                 event.action.targetValue,
                                 Float(animationState.normalizedTime))
            animationState.storeAnimatedValue(value)
        }
    }
}

extension Entity {
    func updateShader(from startValue: Float, to targetValue: Float, duration: Double) {
        let fadeAction = ShaderAction(startValue: startValue, targetValue: targetValue)
        if let shaderAnimation = try? AnimationResource.makeActionAnimation(for: fadeAction,
                                                                            duration: duration,
                                                                            bindTarget: .material(0).customValue) {
            playAnimation(shaderAnimation)
        }
    }
}
```

Currently when I run this I get an assertion failure: 'Index out of range (operator[]:line 797) index = 260, max = 8'. Furthermore, even if it didn't crash, I don't understand how to pass a binding to the custom shader value "startValue". Any clues on how to achieve this effect, even via a completely different approach, would be appreciated.
1 reply · 0 boosts · 611 views · Feb ’25
Alternative for CustomMaterial in visionOS
Following the documentation at https://developer.apple.com/documentation/realitykit/custommaterial, it's simple to use a shader for materials and receive uniforms and per-vertex parameters. However, CustomMaterial is not available on visionOS. Is there an alternative I can use in this case? I want to write the shader that fills the material myself. (I have shader experience from the web and am familiar with fragment shaders.)
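The usual pointer on visionOS is ShaderGraphMaterial, whose node graphs (authored in Reality Composer Pro) can expose inputs you drive from code. A minimal sketch, assuming a Reality Composer Pro package named RealityKitContent containing a hypothetical material "MyMaterial" with a promoted input "Progress":

```swift
import RealityKit
import RealityKitContent  // hypothetical Reality Composer Pro package

// Load the shader-graph material authored in Reality Composer Pro.
var material = try await ShaderGraphMaterial(named: "/Root/MyMaterial",
                                             from: "Materials.usda",
                                             in: realityKitContentBundle)

// Drive a promoted graph input from code, e.g. per frame or on user input.
try material.setParameter(name: "Progress", value: .float(0.5))

// Apply it to an entity's model component.
let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1), materials: [material])
```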
1 reply · 0 boosts · 456 views · Feb ’25
Issue with FBX import -- models are ultra small
When importing FBX animations (generated by Cinema 4D or Blender), the models come in very far away and cannot be resized or zoomed in on. I have tried every setting in both programs to no avail. Is there a secret to choosing the right export options? When importing without animations and rigging, the model imports fine and at the correct size, but once motion is included, something is awry. I also tried changing the base units in the converter, but that did not work. I have attached my model hierarchy in C4D as well as the imported result. The animation does appear to be imported, as I can see it move, but I can barely see the model :)
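One hedged workaround, which doesn't fix the exporter settings themselves, is to rescale the imported entity in code after loading, assuming the mismatch is a base-unit disagreement such as centimeters vs. meters. The file name below is a placeholder:

```swift
import RealityKit

// Load the converted asset (placeholder name) and compensate for a
// centimeters-to-meters unit mismatch by scaling it up 100x.
let character = try await Entity(named: "Character")
character.setScale(SIMD3<Float>(repeating: 100), relativeTo: nil)
```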
3 replies · 0 boosts · 563 views · Feb ’25
EntityAction for MaterialBaseTint - incorrect colours
Hello,

I'm writing an EntityAction that animates a material base tint between two different colours. However, the colour that actually gets set differs in RGB values from the one requested. For example, trying to set an end target of R0.5, G0.5, B0.5 results in a value of R0.735357, G0.735357, B0.735357. I can also see during the animation cycle that the intermediate tint values are likewise incorrect versus those being set.

My understanding is that the values of the material base colour are passed as a SIMD4. Therefore I have a couple of helper extensions to convert a UIColor into this format and mix between two colours. Note, however, that I don't think the issue is with these functions: even if their outputs were wrong, the final value of the base tint still doesn't match the value being set. I wondered if this was a colour space issue?

```swift
import simd
import RealityKit
import UIKit

typealias Float4 = SIMD4<Float>

extension Float4 {
    func mixedWith(_ value: Float4, by mix: Float) -> Float4 {
        Float4(
            simd_mix(x, value.x, mix),
            simd_mix(y, value.y, mix),
            simd_mix(z, value.z, mix),
            simd_mix(w, value.w, mix)
        )
    }
}

extension UIColor {
    var float4: Float4 {
        var r: CGFloat = 0.0
        var g: CGFloat = 0.0
        var b: CGFloat = 0.0
        var a: CGFloat = 0.0
        getRed(&r, green: &g, blue: &b, alpha: &a)
        return Float4(Float(r), Float(g), Float(b), Float(a))
    }
}

struct ColourAction: EntityAction {
    let startColour: SIMD4<Float>
    let targetColour: SIMD4<Float>

    var animatedValueType: (any AnimatableData.Type)? { SIMD4<Float>.self }

    init(startColour: UIColor, targetColour: UIColor) {
        self.startColour = startColour.float4
        self.targetColour = targetColour.float4
    }

    static func registerEntityAction() {
        ColourAction.subscribe(to: .updated) { event in
            guard let animationState = event.animationState else { return }
            let interpolatedColour = event.action.startColour.mixedWith(event.action.targetColour,
                                                                        by: Float(animationState.normalizedTime))
            animationState.storeAnimatedValue(interpolatedColour)
        }
    }
}

extension Entity {
    func updateColour(from currentColour: UIColor, to targetColour: UIColor, duration: Double) {
        let colourAction = ColourAction(startColour: currentColour, targetColour: targetColour)
        if let colourAnimation = try? AnimationResource.makeActionAnimation(for: colourAction,
                                                                            duration: duration,
                                                                            bindTarget: .material(0).baseColorTint) {
            playAnimation(colourAnimation)
        }
    }
}
```

The EntityAction can only be applied to an entity with a ModelComponent (because of the material), so it can be called like so:

```swift
guard let modelComponent = entity.components[ModelComponent.self],
      let material = modelComponent.materials.first as? PhysicallyBasedMaterial else { return }
let currentColour = material.baseColor.tint
let targetColour = UIColor(_colorLiteralRed: 0.5, green: 0.5, blue: 0.5, alpha: 1.0)
entity.updateColour(from: currentColour, to: targetColour, duration: 2)
```
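For what it's worth, 0.735357 is exactly what you get when linear 0.5 is re-encoded as sRGB (1.055 × 0.5^(1/2.4) − 0.055 ≈ 0.7354), which supports the colour-space theory: the SIMD4 fed to the bind target appears to be interpreted as linear RGB, while UIColor components are sRGB. A minimal sketch of a conversion helper, assuming that interpretation is correct:

```swift
import UIKit

// Decode an sRGB component to linear light (inverse of the sRGB transfer curve).
func linearFromSRGB(_ c: Float) -> Float {
    c <= 0.04045 ? c / 12.92 : pow((c + 0.055) / 1.055, 2.4)
}

extension UIColor {
    // Hypothetical replacement for the `float4` helper above: converts the
    // colour's sRGB components to linear before building the SIMD4.
    var linearFloat4: SIMD4<Float> {
        var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
        getRed(&r, green: &g, blue: &b, alpha: &a)
        return SIMD4(linearFromSRGB(Float(r)),
                     linearFromSRGB(Float(g)),
                     linearFromSRGB(Float(b)),
                     Float(a))
    }
}
```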
1 reply · 0 boosts · 571 views · Feb ’25
RealityKit particleEmitter delay starting when toggling isEmitting
I have a scene built in Reality Composer Pro, in which I've added a ParticleEmitter with isEmitting set to false and Loop set to true. In my app, when I toggle isEmitting to true, there can be a delay of a few seconds before the ParticleEmitter starts. However, if I programmatically add the emitter in code at that point, it starts immediately. To be clear, I'm seeing this on the visionOS simulator; I don't have access to a device at this time. Am I misunderstanding how to control the ParticleEmitter when I need precise control over when it starts?
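Building on the observation that adding the emitter in code starts immediately, one hedged workaround is to re-set a fresh copy of the component rather than mutating the one already in the scene. Here `emitterEntity` is a placeholder for the entity authored in Reality Composer Pro:

```swift
import RealityKit

// Re-setting the component (rather than toggling the existing one in place)
// appears to restart emission without the delay described above.
if var emitter = emitterEntity.components[ParticleEmitterComponent.self] {
    emitter.isEmitting = true
    emitterEntity.components.set(emitter)
}
```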
1 reply · 0 boosts · 547 views · Feb ’25
Photogrammetry requiring lidar-capable phones, curious why
Hello! I'm currently building an app where I feed images into a PhotogrammetrySession to create a USDZ. Pretty straightforward, and it works great. We've recently started testing on older devices and have discovered that photogrammetry requires devices that have LiDAR (we've seen console logs referencing LiDAR when we stumble through a photogrammetry process without checking isSupported first).

Judging from @swredcam's post about ReefScan from November '24 (https://developer.apple.com/forums/thread/769221), it looks like photogrammetry did work on non-LiDAR devices. In my own testing on an iPhone 12 mini with iOS 17, PhotogrammetrySession says it's not supported.

Since we're only feeding in a sequence of photos that have never had depth data, and they process fine on Pro/Max devices, we're curious why this would require a LiDAR sensor when it seems to have worked without LiDAR in the past. Or is there some other limitation of non-Pro devices that causes photogrammetry to be unsupported (especially on today's really powerful hardware)?

Thanks! ++md
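For reference, the capability check mentioned above is the static isSupported flag; a minimal sketch of gating session creation on it (the folder URL is a placeholder):

```swift
import Foundation
import RealityKit

// Returns a session only on devices the framework supports; on unsupported
// hardware (e.g. the iPhone 12 mini above) isSupported is false.
func makeSessionIfSupported(imagesFolder: URL) throws -> PhotogrammetrySession? {
    guard PhotogrammetrySession.isSupported else {
        print("PhotogrammetrySession is not supported on this device")
        return nil
    }
    return try PhotogrammetrySession(input: imagesFolder)
}
```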
4 replies · 0 boosts · 615 views · Mar ’25
Gestures not working correctly when setting the fov orientation to .horizontal
Hi there, I've discovered an issue with gesture handling in RealityKit when setting the camera’s fieldOfViewOrientation to horizontal. For instance, if I render a simple box at the center of the view with a collision shape that exactly matches its dimensions, the actual hit area behaves as if it's smaller than the box. Additionally, when attempting to drag the box away from the center, the hit area appears misaligned—offset slightly towards the center. Since the default fieldOfViewOrientation is vertical and everything works as expected in that mode, it seems that the gesture system might be assuming a vertical FOV. Given that the API allows setting it to horizontal, perhaps gestures should function correctly regardless of the orientation? Thank you!
1 reply · 0 boosts · 469 views · Mar ’25
ModelEntity(named:in:) fails to load USD file from RealityKitContent bundle with misleading error?
My experience has been that ModelEntity(named:in:) can be used to load a USD file with a simple structure consisting of entities and model entities, and, critically, it will flatten the entity hierarchy down to a single ModelEntity, presumably reducing the number of draw calls.

However, can anyone verify that the following is true? If ModelEntity(named:in:) is used to load a USD file from a RealityKit content bundle, it may fail when the USD file contains more complex data, such as shader graph material definitions, or perhaps for some other reason (I am not sure). AND the error that ModelEntity(named:in:) throws in this case is

Cannot load RealityKitContent entity: Failed to find resource with name "<name>" in bundle

which would literally suggest that the file does not exist, instead of what I assume the error actually is: "the file exists but its entity hierarchy could not be flattened to a single ModelEntity"?

Is that an accurate description of the known behavior of ModelEntity(named:in:)? I understand that I could use Entity(named:in:) instead, without the flattening feature. My question is really more about the seemingly misleading error message. Thank you for any clarification you can provide.
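A hedged sketch of the fallback pattern this behaviour suggests, assuming a Reality Composer Pro package exposing realityKitContentBundle; the misleading "resource not found" message would surface in the catch:

```swift
import RealityKit
import RealityKitContent  // hypothetical Reality Composer Pro package

// Try the flattening loader first; fall back to the plain Entity loader when
// it throws (the error text may claim the file is missing even when it exists).
func loadFlattenedIfPossible(named name: String) async -> Entity? {
    do {
        return try await ModelEntity(named: name, in: realityKitContentBundle)
    } catch {
        print("ModelEntity load failed (\(error)); retrying as Entity")
        return try? await Entity(named: name, in: realityKitContentBundle)
    }
}
```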
2 replies · 0 boosts · 175 views · May ’25
RealityKit VideoMaterial renders pink on iOS 18
Our app is live, and it appears that since the iOS 18 update the VideoMaterial renders a pink/purple colour instead of the video (picture attached). The audio is rendered properly. We found that it occurs on older devices: iPhone 11 and iPhone SE (2020). I found a thread by Andy Jazz on Stack Overflow describing the same problem.

Steps to reproduce:
Create a plane for the video screen.
Apply a VideoMaterial using AVPlayerItem.
Anchor the model entity to an ARImageAnchor.

Expected outcome: the video plays as a material on the plane in RealityKit.
Actual outcome: on iOS 18, the plane appears pink, indicating the VideoMaterial isn't applied.

What I've tried:
- Verified the video URL is correct.
- Checked that the AVPlayerItem and VideoMaterial are initialised correctly.
- Ensured the AVPlayer is playing the video.
- Tried different formats (mov/mp4/m4v) and verified that the video's status is readyToPlay.

Any suggestions?
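For context, a minimal version of the setup being described, with placeholder names (videoURL is assumed to point at a playable asset):

```swift
import AVFoundation
import RealityKit

// Build a video screen: a plane whose material streams from an AVPlayer.
let videoURL = URL(fileURLWithPath: "/path/to/video.mp4")  // placeholder
let player = AVPlayer(url: videoURL)
let screen = ModelEntity(mesh: .generatePlane(width: 1.6, height: 0.9),
                         materials: [VideoMaterial(avPlayer: player)])
player.play()
```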
1 reply · 0 boosts · 206 views · Jun ’25
Can spatial scene function be used outside of the Photo App?
I'm new here, so I don't know which topic this feature belongs to; sorry about that! I watched the WWDC stream and I'm really interested in this feature, and I'm wondering whether it can be used in my own apps. I looked at the documentation and it seems to only support visionOS (I'm not sure about that, but the demo I saw was based on visionOS).
2 replies · 0 boosts · 211 views · Jun ’25
'tangents' was deprecated in visionOS 2.0: Use cp_drawable_compute_projection instead
I'm using a class with tangents to render on RealityKit for visionOS, but on visionOS 26 it causes a crash in the app, and there is no documentation on how to implement cp_drawable_compute_projection. I have tried a few options but without success. Could you help me implement it? The relevant part of the code is:

```swift
return drawable.views.map { view in
    let userViewpointMatrix = (simdDeviceAnchor * view.transform).inverse
    let projectionMatrix = ProjectiveTransform3D(
        leftTangent: Double(view.tangents[0]),
        rightTangent: Double(view.tangents[1]),
        topTangent: Double(view.tangents[2]),
        bottomTangent: Double(view.tangents[3]),
        nearZ: Double(drawable.depthRange.y),
        farZ: Double(drawable.depthRange.x),
        reverseZ: true
    )
    let screenSize = SIMD2(x: Int(view.textureMap.viewport.width),
                           y: Int(view.textureMap.viewport.height))
    return ModelRendererViewportDescriptor(viewport: view.textureMap.viewport,
                                           projectionMatrix: .init(projectionMatrix),
                                           viewMatrix: userViewpointMatrix * translationMatrix * rotationMatrix * scalingMatrix * commonUpCalibration,
                                           screenSize: screenSize)
}
```
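An unverified sketch of the migration the deprecation message points at: in Swift, the Compositor Services counterpart is reported to be LayerRenderer.Drawable's computeProjection(viewIndex:), which returns the projection matrix directly, so the tangent-based ProjectiveTransform3D construction goes away. Treat the exact signature as an assumption to check against the current SDK:

```swift
// Unverified sketch: assumes computeProjection(viewIndex:) returns simd_float4x4
// and that ModelRendererViewportDescriptor accepts that matrix type.
return drawable.views.enumerated().map { index, view in
    let userViewpointMatrix = (simdDeviceAnchor * view.transform).inverse
    let projectionMatrix = drawable.computeProjection(viewIndex: index)
    let screenSize = SIMD2(x: Int(view.textureMap.viewport.width),
                           y: Int(view.textureMap.viewport.height))
    return ModelRendererViewportDescriptor(viewport: view.textureMap.viewport,
                                           projectionMatrix: projectionMatrix,
                                           viewMatrix: userViewpointMatrix * translationMatrix * rotationMatrix * scalingMatrix * commonUpCalibration,
                                           screenSize: screenSize)
}
```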
1 reply · 0 boosts · 139 views · Jun ’25
How to load a USDZ inside a Reality Composer Pro package as a ModelEntity in RealityKit?
I need a MeshResource from a ModelEntity to generate a box collider, but ModelEntity fails to load USDZ files from the Reality Composer Pro (RCP) bundle.

This code works for loading an Entity:

```swift
// Successfully loads as generic Entity
previewEntity = try await Entity(named: fileName, in: realityKitContentBundle)
```

But this fails when trying to load as a ModelEntity:

```swift
// Fails to load as ModelEntity
modelEntity = try await ModelEntity(named: fileName, in: realityKitContentBundle)
```

I found a thread mentioning: "You'll likely go from USDZ to Entity which contains a MeshResource when you load/init the USDZ file." But it doesn't explain how to actually extract the MeshResource. Could anyone advise:

How to properly load USDZ files as ModelEntity from RCP bundles?
How to extract a MeshResource from a successfully loaded Entity?
Alternative approaches to generate box colliders if direct extraction isn't possible?

Sample code for extraction or workarounds would be greatly appreciated!
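A hedged sketch of two possible approaches: generating a box collider from the loaded entity's visual bounds (sidestepping MeshResource entirely), and walking the hierarchy for the first ModelComponent when the mesh itself is needed:

```swift
import RealityKit

// Option 1: build a box collider from the entity's visual bounds; no
// MeshResource extraction required.
func addBoxCollider(to entity: Entity) {
    let bounds = entity.visualBounds(relativeTo: entity)
    entity.components.set(CollisionComponent(shapes: [.generateBox(size: bounds.extents)]))
}

// Option 2: recursively search the hierarchy for the first ModelComponent
// and return its mesh.
func firstMeshResource(in entity: Entity) -> MeshResource? {
    if let model = entity.components[ModelComponent.self] {
        return model.mesh
    }
    for child in entity.children {
        if let mesh = firstMeshResource(in: child) { return mesh }
    }
    return nil
}
```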
1 reply · 0 boosts · 182 views · Jun ’25
ShaderGraphMaterial.getParameter(handle:) always return nil, is this expected behavior?
I've loaded a ShaderGraphMaterial from a RealityKit content bundle and I'm attempting to access the initial values of its parameters using getParameter(handle:), but this method appears to always return nil:

```swift
let shaderGraphMaterial = try await ShaderGraphMaterial(named: "MyMaterial", from: "MyFile")

let namedParameterValue = shaderGraphMaterial.getParameter(name: "myParameter")
// This prints the value of the `myParameter` parameter, as expected.
print("namedParameterValue = \(namedParameterValue)")

let handle = ShaderGraphMaterial.parameterHandle(name: "myParameter")
let handleParameterValue = shaderGraphMaterial.getParameter(handle: handle)
// Expected behavior: prints the value of the `myParameter` parameter, as above.
// Observed behavior: prints `nil`.
print("handleParameterValue = \(handleParameterValue)")
```

Is this expected behavior? Based on the documentation at https://developer.apple.com/documentation/realitykit/shadergraphmaterial/getparameter(handle:) I'd expect getParameter(handle:) to return the value of the parameter, just as getParameter(name:) does. I've tested this on iOS 18.5 and iOS 26.0 beta 2.

Assuming getParameter(handle:) works as designed, is the following ShaderGraphMaterial extension an appropriate workaround, or can you recommend a better approach? Thank you.

```swift
public extension ShaderGraphMaterial {
    /// Reassigns the values of all named material parameters using the handle-based API.
    ///
    /// This works around an issue where, at least as of RealityKit 26.0 beta 2 and
    /// earlier, `getParameter(handle:)` will always return `nil` when used to read the
    /// initial value of a shader graph material parameter read using
    /// `ShaderGraphMaterial(named:from:in:)`, whereas `getParameter(name:)` will work
    /// as expected.
    mutating func copyNamedParametersToHandles() {
        for parameterName in self.parameterNames {
            if let value = self.getParameter(name: parameterName) {
                let handle = ShaderGraphMaterial.parameterHandle(name: parameterName)
                do {
                    try self.setParameter(handle: handle, value: value)
                } catch {
                    assertionFailure("Cannot set parameter value")
                }
            }
        }
    }
}
```
1 reply · 0 boosts · 361 views · Jun ’25
Why Large-Scale Model Scenes Cause Real Device Crashes
Is there any limitation in Vision Pro when loading scenes with large-scale models?

Test case:
Asset: composite USDA file containing 10 individual models (total triangle count: ~4.2M)
Simulator: loads and renders correctly
Real device: loads the asset successfully but fails during the rendering phase: the environment abruptly dims, then the system spontaneously reboots

How can we resolve this issue? Below are excerpted logs preceding the crash:

```
<<<< FigAudioSession(AV) >>>> audioSessionAVAudioSession_CopyMXSessionProperty signalled err=-19224 (kFigAudioSessionError_UnsupportedOperation) (getMXSessionProperty unsupported) at FigAudioSession_AVAudioSession.m:606
Attempted to add ornament: <MRUIPlatterOrnament: 0x10a658f00; _isInternal: YES; _displaceWindowChrome: NO; _canCaptureUI: NO; _isBeingRemoved: NO; contentAnchorPoint3D: "{0.5, 0.5, 0}"; position: <MRUIPlatterOrnamentRelativePosition: 0x105b68e70; anchorPoint: {0.5, 0.5, 1}>; rotation: "{{0, 0, 0}, 0}"; opacity: 1.000000; canFollowUser: YES; effectiveOffset: "{0, 0, 0}"; presentingViewController: 0x0; billboardingBehavior: 0x0; scalingBehavior: 0x0; relativeToParent: NO; nonHeritableDepthDisplacement: 0.000000; order: 0.000000; _window._determinedSize: {0, 0}; _window: (null)> to nil or non-supporting UIScene: <UIWindowScene: 0x10a8a0000; role: UISceneSessionRoleImmersiveSpaceApplication; persistentIdentifier: test.test:SFBSystemService-BA3A21A3-D1AB-42E2-8AF0-AE0AB83BE528; activationState: UISceneActivationStateUnattached>. No action taken.
Failed to set dependencies on asset 2823930584475958382 because NetworkAssetManager does not have an asset entity for that id.
apply fence tx failed (client=0x98490e18) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0xa86516e2) [0x10000003 (ipc/send) invalid destination port]
```
1 reply · 0 boosts · 213 views · Jul ’25
How to implement custom materials for visionOS?
I want to use RealityKit to create a custom material that can use my own shader and support mesh instancing (for rendering 3D Gaussian splatting), but I found that CustomMaterial is not supported on visionOS. Is there any other interface that can achieve my needs? Where can I find examples?
1 reply · 0 boosts · 85 views · Jul ’25
RealityKit not behaving as expected
This week, I developed a small multiplatform RealityKit project. I also created a demo scene in Reality Composer Pro. Afterward, I imported the local package into the project. Running the project on macOS works perfectly. However, when I tried to run it on my iPhone, I encountered a permission error indicating that it couldn’t read the package. This seems unusual to me because I assumed that the dependency is bundled into the binary file. In an attempt to resolve the issue, I pushed the RCP package to GitHub, hoping it would work. Fortunately, everything compiles successfully now, but the loading time is significantly long, and the animations don’t play on tap gestures. Could someone please help me identify the root cause of this problem?
1 reply · 0 boosts · 337 views · Jul ’25