RealityKit


Simulate and render 3D content for use in your augmented reality apps using RealityKit.


Posts under RealityKit subtopic


RealityKit, Attachments - not working
The simplest RealityView { content, attachments in ... } causes "Contextual closure expects 1 argument, but 2 were used in closure body". I have checked every example and I cannot understand why I get this error, regardless of the content. Note: I have added Attachment(id: "test") to the attachments closure and get "Attachment is not in scope". I have imported both RealityKit and SwiftUI.
Replies: 2 · Boosts: 0 · Views: 216 · Activity: 3w
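For reference, a minimal sketch of the two-parameter form for the question above, assuming visionOS (RealityView attachments and the Attachment type are not available on iOS); the view name is illustrative and the "test" id matches the post:

import SwiftUI
import RealityKit

struct AttachmentExampleView: View {
    var body: some View {
        RealityView { content, attachments in
            // The closure takes two arguments only when an attachments: builder is supplied.
            if let label = attachments.entity(for: "test") {
                content.add(label)
            }
        } attachments: {
            Attachment(id: "test") {
                Text("Hello")
            }
        }
    }
}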
Skybox for iOS/SwiftUI not working
I have tried every combination of suggestions to get a skybox to appear, using SwiftUI, RealityKit, and iOS in a non-immersive environment. Does anyone have code that works to display a skybox? When I use a do/catch block I get "environmentResource not found". I have checked the syntax, ensured the folder is referencing the target, used the same name for the folder as the file, and confirmed the file is a .hdr (I assume this is supported). I have also moved the file folder to the top level, with no change.
Replies: 2 · Boosts: 0 · Views: 481 · Activity: 3w
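One commonly used workaround for the question above is an inward-facing textured sphere rather than an EnvironmentResource. A hedged sketch, assuming the texture name is a placeholder and that the .hdr may need converting to a format TextureResource accepts:

import RealityKit

// Builds a large sphere whose texture faces inward, acting as a simple skybox.
func makeSkyboxEntity(textureNamed name: String) throws -> Entity {
    let texture = try TextureResource.load(named: name)
    var material = UnlitMaterial()
    material.color = .init(texture: .init(texture))
    let skybox = Entity()
    skybox.components.set(ModelComponent(mesh: .generateSphere(radius: 1000), materials: [material]))
    skybox.scale = SIMD3<Float>(-1, 1, 1) // flip so the inside of the sphere is visible
    return skybox
}

The returned entity would then be added to the RealityView content like any other entity.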
How to convert `DragGesture().onEnded`'s velocity CGSize to the SIMD3<Float> required in `PhysicsMotionComponent(linearVelocity, angularVelocity)`?
So if I drag an entity in RealityView, I have to disable the PhysicsBodyComponent to make sure nothing fights dragging the entity around. This makes sense. When I finish a drag, this closure gets executed:

.gesture(
    DragGesture()
        .targetedToAnyEntity()
        .onChanged { e in
            // ...
        }
        .onEnded { e in
            let velocity: CGSize = e.gestureValue.velocity
        }
)

If I now re-add PhysicsBodyComponent to the entity I just dragged and make it mode: .dynamic, it will lose all velocity and drop straight down through gravity. Instead the solution is to apply mode: .kinematic and also apply a PhysicsMotionComponent to the entity. This should retain velocity after letting go of the object. However, I need to instantiate it with PhysicsMotionComponent(linearVelocity: SIMD3<Float>, angularVelocity: SIMD3<Float>). How can I calculate the linearVelocity and angularVelocity when the e.gestureValue.velocity I get is just a CGSize? Is there another property of gestureValue I should be looking at?
Replies: 1 · Boosts: 0 · Views: 835 · Activity: Jan ’25
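One hedged approach to the question above is to ignore the 2D CGSize entirely and derive a world-space velocity from the entity's own position samples during the drag; the type and method names below are illustrative:

import RealityKit
import QuartzCore

// Samples the dragged entity's world-space position and reports a velocity
// that can seed PhysicsMotionComponent when the drag ends.
struct DragVelocityTracker {
    private var lastPosition: SIMD3<Float>?
    private var lastTime: CFTimeInterval = 0
    private(set) var linearVelocity: SIMD3<Float> = .zero

    // Call from .onChanged after moving the entity.
    mutating func sample(_ entity: Entity) {
        let now = CACurrentMediaTime()
        let position = entity.position(relativeTo: nil)
        if let previous = lastPosition, now > lastTime {
            linearVelocity = (position - previous) / Float(now - lastTime)
        }
        lastPosition = position
        lastTime = now
    }

    // Call from .onEnded alongside re-adding a .kinematic PhysicsBodyComponent.
    func motionComponent() -> PhysicsMotionComponent {
        PhysicsMotionComponent(linearVelocity: linearVelocity, angularVelocity: .zero)
    }
}

In .onEnded, setting entity.components.set(tracker.motionComponent()) should then carry the drag's momentum over.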
How can I simultaneously apply the drag gesture to multiple entities?
I wanted to drag EntityA while also dragging EntityB independently. I've tried to separate them by entity, but it only recognizes the latest drag gesture:

RealityView { content, attachments in
    ...
}
.gesture(
    DragGesture()
        .targetedToEntity(EntityA)
        .onChanged { value in ... }
)
.gesture(
    DragGesture()
        .targetedToEntity(EntityB)
        .onChanged { value in ... }
)

I also tried using simultaneously(with:), but that didn't work either; maybe I'm missing something:

.gesture(
    DragGesture()
        .targetedToEntity(EntityA)
        .onChanged { value in ... }
        .simultaneously(with:
            DragGesture()
                .targetedToEntity(EntityB)
                .onChanged { value in ... }
        )
)
Replies: 1 · Boosts: 1 · Views: 694 · Activity: Mar ’25
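A hedged sketch for the question above: attaching each entity-targeted gesture with .simultaneousGesture instead of .gesture, so the last modifier does not win; the view and entity names are placeholders, and whether two drags can run truly concurrently still depends on the platform's gesture handling:

import SwiftUI
import RealityKit

struct TwoEntityDragView: View {
    let entityA = Entity()
    let entityB = Entity()

    var body: some View {
        RealityView { content in
            // Each entity also needs CollisionComponent and InputTargetComponent
            // to receive targeted gestures.
            content.add(entityA)
            content.add(entityB)
        }
        .simultaneousGesture(
            DragGesture()
                .targetedToEntity(entityA)
                .onChanged { value in /* move entityA */ }
        )
        .simultaneousGesture(
            DragGesture()
                .targetedToEntity(entityB)
                .onChanged { value in /* move entityB */ }
        )
    }
}

An alternative is a single DragGesture().targetedToAnyEntity() that branches on value.entity inside .onChanged.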
RealityKit fails with EXC_BAD_ACCESS at CMClockGetAnchorTime in the simulator
Starting with iOS 18.0 beta 1, I've noticed that RealityKit frequently crashes in the simulator when an app launches and presents an ARView. I was able to create a small sample app with repro steps that demonstrates the issue, and I've submitted feedback: FB16144085 I've included a crash log with the feedback. If possible, I'd appreciate it if an Apple engineer could investigate and suggest a workaround. It's awkward to be restricted to the iOS 17 simulator, which does not exhibit this behavior. Please let me know if there's anything I can do to help. Thank you.
Replies: 1 · Boosts: 0 · Views: 651 · Activity: Apr ’25
Entity cross multiple portals at once?
If I have one portal on the ceiling and one on the floor, can a tall Entity cross multiple portals at once? Will the opposing portal directions cause it to fail? No matter what I try for the crossingMode and clippingMode of the PortalComponent I can only get it to fully work for one portal at a time. I have tried flipping the normals for the crossingMode and clippingMode planes. I have also tried creating a ceiling portal plane with inverted normals. It seems like whatever Entity is passing through a portal has one portal it wants to deal with at a time and that's it. My other option is to create portals using occlusion but I prefer the simplest way.
Replies: 1 · Boosts: 0 · Views: 498 · Activity: Jan ’25
Comparing colors of two ModelEntities
I want to compare the colors of two model entities (spheres). How can I do it? The method I'm currently trying to apply is as follows:

case let .color(controlColor) = controlMaterial.baseColor, controlColor == .green {
    // Flip target sphere colour
    if let targetMaterial = targetsphere.model?.materials.first as? SimpleMaterial,
       case let .color(targetColor) = targetMaterial.baseColor, targetColor == .blue {
        targetsphere.model?.materials = [SimpleMaterial(color: .green, isMetallic: false)] // Change to |1⟩
    } else {
        targetsphere.model?.materials = [SimpleMaterial(color: .blue, isMetallic: false)] // Change to |0⟩
    }
}

This method (baseColor) was deprecated in iOS 15.0 in favour of 'color', but I cannot compare the color values to each other. 👾
Replies: 1 · Boosts: 0 · Views: 652 · Activity: Jan ’25
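A hedged sketch of a different route for the question above: keep the logical state in a custom component and only ever write colours to the material, so nothing needs to be read back from SimpleMaterial. The component and function names are made up for illustration:

import RealityKit
import UIKit

// Tracks whether a sphere is "green" (|1⟩) without inspecting its material.
// Register once at startup with SphereStateComponent.registerComponent().
struct SphereStateComponent: Component {
    var isGreen: Bool
}

func toggleState(of sphere: ModelEntity) {
    var state = sphere.components[SphereStateComponent.self] ?? SphereStateComponent(isGreen: false)
    state.isGreen.toggle()
    sphere.components.set(state)
    let colour: UIColor = state.isGreen ? .green : .blue
    sphere.model?.materials = [SimpleMaterial(color: colour, isMetallic: false)]
}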
RealityKit character skeleton animation: weapons are not animating
Hello, I am experiencing an issue with a character animation using RealityKit. I have a file created in Blender that contains the rigged character, a sword, and a shield. The sword and the shield have bones connected to the character's hands so they can follow the character's animation. When I run the animation in Blender and preview the exported USDZ file on Mac, I can see the sword and shield attached to the hands and the animation is fine. But when I add the USDZ model in RealityKit and play the animation, only the character is animating; the sword and the shield are not moving at all.

This is the code I use to animate the character:

private func loadAnimations() {
    let unifiedAnimations = children[0].availableAnimations.first!.definition
    let animationResource = try! AnimationResource.generate(with: unifiedAnimations)
    self.children[0].playAnimation(
        animationResource.repeat()
    )
}

Here is a link with a demo Xcode project, the Blender file, and a video: https://www.dropbox.com/scl/fi/ypq2iwxc5f9dwzjggsvin/AppleTest.zip?rlkey=wiag3rg44urhjdh2wal8cdx2u&st=vbpf7x11&dl=0

STEPS TO REPRODUCE
I have created a demo project that displays the character on a horizontal surface, and the animation starts playing.
1. Run the app. The ARView will be set up, and a yellow square will appear in the middle of the screen.
2. When a horizontal surface is detected, the yellow square will change, indicating that the surface is found.
3. Tap on the screen to load the USDZ model and position it at the yellow square's position.
4. The animation will start playing, and you can see that the character is animating, but the sword and shield remain still.

Thanks
Replies: 1 · Boosts: 0 · Views: 541 · Activity: Jan ’25
USDZ File Crashes in QuickLook on iPad 9th Gen (iPadOS 18.3) – Urgent Help Needed
Hi everyone, I’m experiencing a critical issue with USDZ files created in Reality Composer on an iPad 9th Generation (iPadOS 18.3). The files work perfectly on iPads from the 10th Generation onwards and on iPad Pros. However, on older devices like the iPad 9th Generation and older iPhones, QuickLook (file preview) crashes when opening them. This is a major issue because these USDZ files are part of an exhibition where artworks are extended with AR elements via a web page. If some visitors cannot view the 3D content, it significantly impacts the experience. What’s puzzling is that two years ago, we exported USDZ files from Reality Composer, made them available via a website, and they worked flawlessly on all devices, including older iPads and iPhones. Now, with the latest iPadOS, they consistently crash on older devices. Has anyone encountered a similar issue? Are there known limitations with QuickLook on older devices, or is there a way to optimize the USDZ files to prevent crashes? Could this be related to changes in iPadOS or RealityKit? Any advice or workaround would be greatly appreciated! Thanks in advance!
Replies: 1 · Boosts: 0 · Views: 601 · Activity: Feb ’25
Animating a Reality Composer Pro shader's uniform input value
I'm trying to build a shader in Reality Composer Pro that updates from a start time. Initially I tried the following: the idea was that when startTime was 0, the output would be 0, but then I would set startTime from within code, it would be compared with the current GPU time, and the difference used to drive another part of the shader graph:

if let testEntity = root.findEntity(named: "Test"),
   var shaderGraphMaterial = testEntity.components[ModelComponent.self]?.materials.first as? ShaderGraphMaterial {
    let time = CFAbsoluteTimeGetCurrent()
    try! shaderGraphMaterial.setParameter(name: "StartTime", value: .float(Float(time)))
    testEntity.components[ModelComponent.self]?.materials[0] = shaderGraphMaterial
}

However, I haven't found a reference to the time the shader would be using. So now I am trying to write an EntityAction to achieve the same effect. Instead of comparing a start time to the GPU's time, I'm trying to animate one of the shader's uniform inputs. However, I'm not sure how to specify the bind target. Here's my attempt so far:

import RealityKit

struct ShaderAction: EntityAction {
    let startValue: Float
    let targetValue: Float

    var animatedValueType: (any AnimatableData.Type)? { Float.self }

    static func registerEntityAction() {
        ShaderAction.subscribe(to: .updated) { event in
            guard let animationState = event.animationState else { return }
            let value = simd_mix(event.action.startValue, event.action.targetValue, Float(animationState.normalizedTime))
            animationState.storeAnimatedValue(value)
        }
    }
}

extension Entity {
    func updateShader(from startValue: Float, to targetValue: Float, duration: Double) {
        let fadeAction = ShaderAction(startValue: startValue, targetValue: targetValue)
        if let shaderAnimation = try? AnimationResource.makeActionAnimation(for: fadeAction, duration: duration, bindTarget: .material(0).customValue) {
            playAnimation(shaderAnimation)
        }
    }
}

Currently when I run this I get an assertion failure: 'Index out of range (operator[]:line 797) index = 260, max = 8'. Furthermore, even if it didn't crash, I don't understand how to pass a binding to the custom shader value "startValue". Any clues on how to achieve this effect, even if it's a completely different way, would be appreciated.
Replies: 1 · Boosts: 0 · Views: 611 · Activity: Feb ’25
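A hedged sketch of one alternative that sidesteps the bind-target question above: drive the uniform from a custom System each frame via setParameter. The component and system names are illustrative; "StartTime" is the parameter name from the post:

import RealityKit

// Marks entities whose "StartTime" shader parameter should track elapsed time.
struct ShaderClockComponent: Component {
    var elapsed: Float = 0
}

struct ShaderClockSystem: System {
    static let query = EntityQuery(where: .has(ShaderClockComponent.self))

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            guard var clock = entity.components[ShaderClockComponent.self],
                  var material = entity.components[ModelComponent.self]?.materials.first as? ShaderGraphMaterial
            else { continue }
            clock.elapsed += Float(context.deltaTime)
            try? material.setParameter(name: "StartTime", value: .float(clock.elapsed))
            entity.components[ShaderClockComponent.self] = clock
            entity.components[ModelComponent.self]?.materials[0] = material
        }
    }
}

// Register once at app launch:
// ShaderClockComponent.registerComponent()
// ShaderClockSystem.registerSystem()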
ARMeshAnchor Data with RealityView
I want to use SwiftUI and RealityView to get AR scene understanding data (ARMeshAnchor) on iOS devices with LiDAR. The only way we can do that is by using ARSession (unless there is another way). However, in previous iOS 18 builds there was this function: https://developer.apple.com/documentation/realitykit/spatialtrackingsession/run(_:session:arconfiguration:), which worked with a SpatialTrackingSession and a custom ARSession together. This function has since been removed from the RealityKit framework in the latest iOS and Xcode, but is still in the documentation. I also wanted to get ARFaceAnchor data, which I still cannot get without ARSession; the closest I can get is by using:

let target = AnchoringComponent.Target.face
let anchoringComponent = AnchoringComponent(target, trackingMode: .predicted)
entity = Entity()
entity!.components.set(anchoringComponent)

But I still can't find a way to get the current frame (ARFrame) or the anchors ([ARAnchor]) in the view. Alternatively, if I use this function: https://developer.apple.com/documentation/realitykit/spatialtrackingsession/run(_:) and start the ARSession separately, the session (didUpdate and didAdd) only runs for a few frames before getting interrupted. And if I completely remove SpatialTrackingConfiguration and just run the ARSession, there still is a valid tracked entity for the AnchoringComponent.Target.face component if, in the configuration for the ARSession, I use ARWorldTrackingConfiguration with face tracking, and I still get updated facial data each frame. But the ARSession didUpdate and didAdd functions don't get called past the first few frames. Interestingly, if I switch the RealityViewCameraContent.RealityViewCamera to .virtual, I get ARMeshAnchor and ARFaceAnchor data, but no camera feed (as expected), with or without SpatialTrackingConfiguration. My overarching question is: what is the proper way to access ARMeshAnchors and other ARAnchors created by the system and track them live while also using SwiftUI? A GitHub repo with a sample project can be found here: https://github.com/bpate75/RealityViewTesting
Replies: 1 · Boosts: 1 · Views: 451 · Activity: Feb ’25
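For context on the discussion above, a hedged sketch of the remaining run(_:) overload used together with an AnchoringComponent. The view name is illustrative, and the exact capability set available on iOS is an assumption:

import SwiftUI
import RealityKit

struct FaceAnchorView: View {
    @State private var session = SpatialTrackingSession()

    var body: some View {
        RealityView { content in
            // Anchor data arrives only as entity transforms through this path;
            // ARFrame / [ARAnchor] objects are not surfaced.
            let faceAnchor = AnchorEntity(.face)
            content.add(faceAnchor)
        }
        .task {
            let configuration = SpatialTrackingSession.Configuration(tracking: [.plane])
            _ = await session.run(configuration)
        }
    }
}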
Alternative for CustomMaterial in visionOS
Following the post at https://developer.apple.com/documentation/realitykit/custommaterial, it's simple to use a shader for materials and get uniforms and params for each vertex. However, CustomMaterial is not available on visionOS. Is there an alternative to use in this case? I want to write a shader to fill the material myself. (I have shader experience from the web and am familiar with fragment shaders.)
Replies: 1 · Boosts: 0 · Views: 456 · Activity: Feb ’25
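The usual visionOS substitute is a ShaderGraphMaterial authored in Reality Composer Pro. A hedged sketch of loading one, where the material path, scene name, and bundle are placeholders from a generated RealityKitContent package:

import RealityKit
import RealityKitContent   // the Swift package a Reality Composer Pro project generates

func loadGraphMaterial() async throws -> ShaderGraphMaterial {
    // "/Root/MyMaterial" and "Scene.usda" are placeholder paths into the .rkassets bundle.
    try await ShaderGraphMaterial(named: "/Root/MyMaterial",
                                  from: "Scene.usda",
                                  in: realityKitContentBundle)
}

The material is then assigned through a ModelComponent like any other RealityKit material.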
EntityAction for MaterialBaseTint - incorrect colours
Hello, I'm writing an EntityAction that animates a material base tint between two different colours. However, the colour that is actually set differs in RGB values from the one requested. For example, trying to set an end target of R0.5, G0.5, B0.5 results in a value of R0.735357, G0.735357, B0.735357. I can also see during the animation cycle that the intermediate actual tint values are also incorrect versus those being set. My understanding is that the values of the material base colour are passed as a SIMD4. I therefore have a couple of helper extensions to convert a UIColor into this format and mix between two colours. Note, however, that I don't think the issue is with these functions; even if their outputs were wrong, the final value of the base tint doesn't match the value being set. I wondered if this was a colour space issue?

import simd
import RealityKit
import UIKit

typealias Float4 = SIMD4<Float>

extension Float4 {
    func mixedWith(_ value: Float4, by mix: Float) -> Float4 {
        Float4(
            simd_mix(x, value.x, mix),
            simd_mix(y, value.y, mix),
            simd_mix(z, value.z, mix),
            simd_mix(w, value.w, mix)
        )
    }
}

extension UIColor {
    var float4: Float4 {
        var r: CGFloat = 0.0
        var g: CGFloat = 0.0
        var b: CGFloat = 0.0
        var a: CGFloat = 0.0
        getRed(&r, green: &g, blue: &b, alpha: &a)
        return Float4(Float(r), Float(g), Float(b), Float(a))
    }
}

struct ColourAction: EntityAction {
    let startColour: SIMD4<Float>
    let targetColour: SIMD4<Float>

    var animatedValueType: (any AnimatableData.Type)? { SIMD4<Float>.self }

    init(startColour: UIColor, targetColour: UIColor) {
        self.startColour = startColour.float4
        self.targetColour = targetColour.float4
    }

    static func registerEntityAction() {
        ColourAction.subscribe(to: .updated) { event in
            guard let animationState = event.animationState else { return }
            let interpolatedColour = event.action.startColour.mixedWith(event.action.targetColour, by: Float(animationState.normalizedTime))
            animationState.storeAnimatedValue(interpolatedColour)
        }
    }
}

extension Entity {
    func updateColour(from currentColour: UIColor, to targetColour: UIColor, duration: Double, endAction: @escaping (Entity) -> Void = { _ in }) {
        let colourAction = ColourAction(startColour: currentColour, targetColour: targetColour, endedAction: endAction)
        if let colourAnimation = try? AnimationResource.makeActionAnimation(for: colourAction, duration: duration, bindTarget: .material(0).baseColorTint) {
            playAnimation(colourAnimation)
        }
    }
}

The EntityAction can only be applied to an entity with a ModelComponent (because of the material), so it can be called like so:

guard let modelComponent = entity.components[ModelComponent.self],
      let material = modelComponent.materials.first as? PhysicallyBasedMaterial else { return }
let currentColour = material.baseColor.tint
let targetColour = UIColor(_colorLiteralRed: 0.5, green: 0.5, blue: 0.5, alpha: 1.0)
entity.updateColour(from: currentColour, to: targetColour, duration: 2)
Replies: 1 · Boosts: 0 · Views: 571 · Activity: Feb ’25
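For what it's worth, the reported numbers above are consistent with an sRGB/linear mismatch: 1.055 × 0.5^(1/2.4) − 0.055 ≈ 0.7354, which is exactly the read-back value for a requested 0.5. A hedged sketch of converting the UIColor components from sRGB to linear before building the SIMD4, assuming the tint is stored in linear space:

import Foundation

// Converts one sRGB component to linear; applying this to r, g, b before building
// the Float4 keeps the requested and read-back tints in the same colour space.
func srgbToLinear(_ c: Float) -> Float {
    c <= 0.04045 ? c / 12.92 : Float(pow((Double(c) + 0.055) / 1.055, 2.4))
}

// Example: a requested 0.5 becomes ~0.214 linear, which reads back as ~0.5 in sRGB.
let linearComponents = SIMD3<Float>(srgbToLinear(0.5), srgbToLinear(0.5), srgbToLinear(0.5))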
RealityKit particleEmitter delay starting when toggling isEmitting
I have a scene built in Reality Composer Pro, in which I've added a ParticleEmitter with isEmitting set to false and Loop set to true. In my app, when I toggle isEmitting to true there can be a delay of a few seconds before the ParticleEmitter starts. However, if I programmatically add the emitter in code at that point, it starts immediately. To be clear, I'm seeing this on the visionOS simulator; I don't have access to a device at this time. Am I misunderstanding how to control the ParticleEmitter when I need precise control over when it starts?
Replies: 1 · Boosts: 0 · Views: 547 · Activity: Feb ’25
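A hedged workaround consistent with the observation above (adding the emitter in code starts immediately): copy the existing component, flip isEmitting, and re-set it rather than mutating it in place. The function name is illustrative:

import RealityKit

func startEmitting(_ entity: Entity) {
    guard var emitter = entity.components[ParticleEmitterComponent.self] else { return }
    emitter.isEmitting = true
    // Re-setting the component mimics "adding it in code", which the post reports starts instantly.
    entity.components.remove(ParticleEmitterComponent.self)
    entity.components.set(emitter)
}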
Gestures not working correctly when setting the fov orientation to .horizontal
Hi there, I've discovered an issue with gesture handling in RealityKit when setting the camera’s fieldOfViewOrientation to horizontal. For instance, if I render a simple box at the center of the view with a collision shape that exactly matches its dimensions, the actual hit area behaves as if it's smaller than the box. Additionally, when attempting to drag the box away from the center, the hit area appears misaligned—offset slightly towards the center. Since the default fieldOfViewOrientation is vertical and everything works as expected in that mode, it seems that the gesture system might be assuming a vertical FOV. Given that the API allows setting it to horizontal, perhaps gestures should function correctly regardless of the orientation? Thank you!
Replies: 1 · Boosts: 0 · Views: 469 · Activity: Mar ’25
visionOS: getting real geometry reflections in RealityKit?
As in the Environments, where we have real-time reflections of movies on a screen or reflections of the surrounding Mount Hood in the background: could I get a metallic surface showing accurate reflections of a box sitting on top of it? I don't mean using a probe or an HDR cubemap; I mean the same accurate reflections as the water at Mount Hood reflecting the movie I'm watching in another app.
Replies: 1 · Boosts: 0 · Views: 115 · Activity: Mar ’25
How to configure RealityKit entities for animations on a modular character?
I am currently using RealityKit (with a perspective camera) to render a character in my SwiftUI app. The character has customization such as clothing items and hair, and all objects are properly weighted to the rig. The way the model is set up in Blender is like so: groups of objects that will be swapped (e.g. Shoes -> Shoes objects) and an armature. I then export it to usdc with all objects active. This is the resulting entity hierarchy, viewed in Reality Composer Pro. My problem is that when I export with the Armature modifier applied to the objects, so that animations get exported, the ModelComponent gets flattened to the armature, and swapping entities is no longer as simple as removing the entity with the corresponding name. What's the best practice here? Should the animation be exported separately and then applied to the skeleton? If so, how is that achieved? I'm not really sure how to proceed here.
Replies: 1 · Boosts: 0 · Views: 81 · Activity: May ’25
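On the last question above: one hedged approach is to export the animation in its own USDZ and play it on the character entity, which generally only works when both files share the same skeleton (matching joint names and ordering). The asset names below are placeholders:

import RealityKit

func playSharedAnimation() async throws {
    let character = try await Entity(named: "Character")            // placeholder asset name
    let animationSource = try await Entity(named: "Character_Anim") // placeholder asset name

    // Pull the clip from the animation-only file and play it on the character,
    // relying on the two exports sharing the same rig.
    if let clip = animationSource.availableAnimations.first {
        character.playAnimation(clip.repeat())
    }
}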