Discuss spatial computing on Apple platforms and how to design and build an entirely new universe of apps and games for Apple Vision Pro.

Imitating a grip on an object
I'm experimenting with the hand-tracking systems in RealityKit on Vision Pro. I thought it would be interesting to attach a virtual object to a hand when the hand is gripping (it would be fun to attach a basic cylinder to mimic a wand from Harry Potter). I'm able to detect when the user is gripping, but I'm having trouble placing an object so that it appears to be held in the hand.

The simplest version of this uses an AnchorEntity pointing to the user's palm, which kind of works but quickly breaks the illusion when you rotate the wrist or hand. It seems I will have to roll my own anchor using the various joints of the user's hand. I thought calculating a median point between the thumb and little-finger tips would be a good start, but it's proven a little difficult because we need both rotation and position. I'm already out of my depth with RealityKit and matrices, and (thanks to ChatGPT) I have some code, but as soon as I apply the position manually (as opposed to using a hand anchor entity) it fails to render on the user's hand. It feels like this should already be something someone has looked into. Any ideas on what might be the issue here?

Note: HandTrackingSystem.handTracking is a HandTrackingProvider().

```swift
guard let anchors = HandTrackingSystem.handTracking.latestAnchors.leftHand else { return }

if let thumb = anchors.handSkeleton?.joint(.thumbTip),
   let little = anchors.handSkeleton?.joint(.littleFingerTip) {
    let thumbPos = simd_make_float3(thumb.anchorFromJointTransform.columns.3)
    let littlePos = simd_make_float3(little.anchorFromJointTransform.columns.3)
    let midPos = (thumbPos + littlePos) / 2
    let direction = normalize(littlePos - thumbPos)
    let rotation = simd_quatf(from: [0, 1, 0], to: direction)

    wandEntity.transform.translation = midPos
    wandEntity.transform.rotation = rotation
    content.add(wandEntity)
}
```
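One likely culprit, offered as a hedged sketch rather than a confirmed fix: anchorFromJointTransform is expressed in the hand anchor's coordinate space, so an entity added directly to RealityView content (world space) needs the joint positions converted through the anchor's originFromAnchorTransform first. The helper name updateWand is illustrative; wandEntity and the HandAnchor are assumed to exist as in the post.

```swift
import ARKit
import RealityKit
import simd

// Hedged sketch: convert thumb/little-finger joint positions from the hand
// anchor's space into world space before positioning the wand. `wandEntity`
// is assumed to already be added to the RealityView content root.
func updateWand(_ wandEntity: Entity, from hand: HandAnchor) {
    guard let skeleton = hand.handSkeleton else { return }
    let thumb = skeleton.joint(.thumbTip)
    let little = skeleton.joint(.littleFingerTip)

    // Joint positions relative to the hand anchor.
    let thumbLocal = simd_make_float3(thumb.anchorFromJointTransform.columns.3)
    let littleLocal = simd_make_float3(little.anchorFromJointTransform.columns.3)

    // Promote to world space via the anchor's origin transform.
    let originFromAnchor = hand.originFromAnchorTransform
    let thumbWorld = simd_make_float3(originFromAnchor * SIMD4<Float>(thumbLocal, 1))
    let littleWorld = simd_make_float3(originFromAnchor * SIMD4<Float>(littleLocal, 1))

    // Midpoint for position, thumb-to-little direction for orientation.
    wandEntity.position = (thumbWorld + littleWorld) / 2
    wandEntity.orientation = simd_quatf(from: [0, 1, 0],
                                        to: normalize(littleWorld - thumbWorld))
}
```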
1 reply · 0 boosts · 562 views · Jul ’25
Projecting a Cube with a Number in ARKit
I'm a novice in RealityKit and ARKit. I'm using ARKit in SwiftUI to show a cube with a number, as shown below.

```swift
import SwiftUI
import RealityKit
import ARKit

struct ContentView: View {
    var body: some View {
        return ARViewContainer()
    }
}

#Preview {
    ContentView()
}

struct ARViewContainer: UIViewRepresentable {
    typealias UIViewType = ARView

    func makeUIView(context: UIViewRepresentableContext<ARViewContainer>) -> ARView {
        let arView = ARView(frame: .zero, cameraMode: .ar, automaticallyConfigureSession: true)
        arView.enableTapGesture()
        return arView
    }

    func updateUIView(_ uiView: ARView, context: UIViewRepresentableContext<ARViewContainer>) {}
}

extension ARView {
    func enableTapGesture() {
        let tapGestureRecognizer = UITapGestureRecognizer(target: self, action: #selector(handleTap(recognizer:)))
        self.addGestureRecognizer(tapGestureRecognizer)
    }

    @objc func handleTap(recognizer: UITapGestureRecognizer) {
        let tapLocation = recognizer.location(in: self)
        guard self.ray(through: tapLocation) != nil else { return }

        let results = self.raycast(from: tapLocation, allowing: .estimatedPlane, alignment: .any)
        if let firstResult = results.first {
            let position = simd_make_float3(firstResult.worldTransform.columns.3)
            placeObject(at: position)
        }
    }

    func placeObject(at position: SIMD3<Float>) {
        let mesh = MeshResource.generateBox(size: 0.3)
        let material = SimpleMaterial(color: UIColor.systemRed, roughness: 0.3, isMetallic: true)
        let modelEntity = ModelEntity(mesh: mesh, materials: [material])

        var unlitMaterial = UnlitMaterial()
        if let textureResource = generateTextResource(text: "1", textColor: UIColor.white) {
            unlitMaterial.color = .init(tint: .white, texture: .init(textureResource))
            modelEntity.model?.materials = [unlitMaterial]
            modelEntity.name = UUID().uuidString
            modelEntity.transform.scale = [0.3, 0.1, 0.3]
            modelEntity.generateCollisionShapes(recursive: true)

            let anchorEntity = AnchorEntity(world: position)
            anchorEntity.addChild(modelEntity)
            self.scene.addAnchor(anchorEntity)
        }
    }

    func generateTextResource(text: String, textColor: UIColor) -> TextureResource? {
        // `image(withAttributes:size:)` is a custom String extension (not shown here).
        if let image = text.image(withAttributes: [NSAttributedString.Key.foregroundColor: textColor],
                                  size: CGSize(width: 18, height: 18)),
           let cgImage = image.cgImage {
            return try? TextureResource(image: cgImage, options: .init(semantic: nil))
        }
        return nil
    }
}
```

I tap the floor and get a cube with "1" on it. The background color of the cube is black, I guess from the texture. Where does this color come from, and how can I change it to, say, red? Thanks.
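A hedged guess at the black background, with a sketch under assumptions: the text image is drawn without any background fill, so the texture's empty pixels sample as black under UnlitMaterial. Rendering the string over an explicit background color before creating the TextureResource is one way to control it. The post's custom String extension is replaced here with UIGraphicsImageRenderer, and the function name is illustrative.

```swift
import UIKit
import RealityKit

// Hedged sketch: draw the number over an explicit background color (red here)
// so the texture has no transparent/unset pixels to render as black.
func generateTextTexture(text: String,
                         textColor: UIColor,
                         background: UIColor,
                         size: CGSize = CGSize(width: 256, height: 256)) -> TextureResource? {
    let renderer = UIGraphicsImageRenderer(size: size)
    let image = renderer.image { context in
        // Fill the whole texture with the background color first.
        background.setFill()
        context.fill(CGRect(origin: .zero, size: size))

        // Then draw the string centered on top.
        let attributes: [NSAttributedString.Key: Any] = [
            .font: UIFont.systemFont(ofSize: size.height * 0.6),
            .foregroundColor: textColor
        ]
        let textSize = text.size(withAttributes: attributes)
        let origin = CGPoint(x: (size.width - textSize.width) / 2,
                             y: (size.height - textSize.height) / 2)
        text.draw(at: origin, withAttributes: attributes)
    }
    guard let cgImage = image.cgImage else { return nil }
    return try? TextureResource(image: cgImage, options: .init(semantic: .color))
}
```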
4 replies · 0 boosts · 151 views · Jul ’25
Vision Pro - Throw object by hand
Hello all, I'm desperate to find a solution and I need your help, please. I've created a simple cube in visionOS. I can grab it by hand (close my hand on it) and move it pretty much wherever I want. But I would like to throw it, like a basketball: not push it, but hold it in my hand and then throw it away from me with a velocity and direction matching my hand's movement (fingers opening to release it). Please point me in the right direction. Cheers and thanks, Mathis
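A minimal sketch of one common approach, not a confirmed solution: sample the held entity's position each frame, estimate a release velocity from the last ~100 ms of samples, then switch the entity to a dynamic physics body seeded with that velocity. Names such as VelocityTracker and heldEntity are illustrative.

```swift
import Foundation
import RealityKit
import simd

// Hedged sketch: estimate the hand's release velocity from recent position samples.
struct VelocityTracker {
    private var samples: [(time: TimeInterval, position: SIMD3<Float>)] = []

    // Call once per frame while the cube is held.
    mutating func record(position: SIMD3<Float>, at time: TimeInterval) {
        samples.append((time, position))
        samples.removeAll { time - $0.time > 0.1 }  // keep ~100 ms of history
    }

    // Average velocity across the retained window.
    var velocity: SIMD3<Float> {
        guard let first = samples.first, let last = samples.last,
              last.time > first.time else { return .zero }
        return (last.position - first.position) / Float(last.time - first.time)
    }
}

// On release (fingers open), switch the cube from hand-driven to dynamic so
// gravity and collisions apply, seeded with the estimated velocity.
// (Assumption: the motion component is read as the initial velocity when
// the body becomes dynamic.)
func release(_ heldEntity: ModelEntity, with velocity: SIMD3<Float>) {
    heldEntity.components.set(PhysicsBodyComponent(massProperties: .default,
                                                   material: .default,
                                                   mode: .dynamic))
    heldEntity.components.set(PhysicsMotionComponent(linearVelocity: velocity))
}
```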
9 replies · 0 boosts · 1.6k views · Feb ’25
RealityKit entity.write(to:) generates fatal protection error
My app for framing and arranging pictures from Photos on visionOS lets users write the arrangements they create to .reality files using RealityKit's entity.write(to:), which they then display to customers on their websites. This works perfectly on visionOS 2, but fails with a fatal protection error on visionOS 26 beta 1 and beta 2 when write(to:) attempts to write to its internal cache:

```
2025-06-29 14:03:04.688 Failed to write reality file Error Domain=RERealityFileWriterErrorDomain Code=10 "Could not create parent folders for file path /var/mobile/Containers/Data/Application/81E1DDC4-331F-425D-919B-3AB87390479A/Library/Caches/com.GeorgePurvis.Photography.FrameItVision/RealityFileBundleZippingTmp_A049685F-C9B2-479B-890D-CF43D13B60E9/41453BC9-26CB-46C5-ADBE-C0A50253EC27." UserInfo={NSLocalizedDescription=Could not create parent folders for file path /var/mobile/Containers/Data/Application/81E1DDC4-331F-425D-919B-3AB87390479A/Library/Caches/com.GeorgePurvis.Photography.FrameItVision/RealityFileBundleZippingTmp_A049685F-C9B2-479B-890D-CF43D13B60E9/41453BC9-26CB-46C5-ADBE-C0A50253EC27.}
```

Has anyone else encountered this problem? Do you have a workaround? Have you filed a feedback?

ChatGPT's analysis of the error and my code reports why there is no workaround:

• entity.write(to:) is a black box; you cannot override where it builds its staging bundle.
• It always tries to create those random folders itself.
• You cannot supply a parent or working directory to RealityFileWriter.
• So if the system fails to create that folder, you cannot patch it.

This is why you see a fatal error with no recovery.

See also feedbacks FB18494954, FB18036627, and FB18063766.
10 replies · 0 boosts · 406 views · Jul ’25
How to Disable Default Background Audio in RealityKit’s ObjectCaptureSession?
I am using RealityKit's ObjectCaptureSession API to capture objects, presenting the process with ObjectCaptureView. During the object capture session, default background audio plays automatically. I noticed the same audio behavior in Apple's official Composer app, which seems to use the same API. I'd like to disable this audio in my app, but I have not been able to find any API or configuration option to do so. Is there an official method or workaround to disable the default audio in the ObjectCaptureSession API? Any guidance would be appreciated. Thank you!
1 reply · 0 boosts · 693 views · Jan ’25
Person Occlusion on the Vision Pro
I am currently creating an app where two people share an instance of an immersive space so that they can point to certain things in it. Right now, other people are hidden behind the immersive space, and even with people awareness enabled for everything, they are still too difficult to see. I've found documentation (https://developer.apple.com/documentation/arkit/occluding-virtual-content-with-people) that describes what I want to do, but it is only listed as working on iOS and iPadOS. Is there anything similar that will work on visionOS?
2 replies · 0 boosts · 132 views · Mar ’25
A question about adding grounding shadows on Vision Pro
I want to add a grounding shadow to my Entity in a RealityView on visionOS. However, it seems the shadow can only appear on another Entity, so I'm using plane detection in ARKit and adding a transparent plane to receive the shadow:

```swift
let planeEntity = ModelEntity(
    mesh: .generatePlane(width: anchor.geometry.extent.width,
                         height: anchor.geometry.extent.height),
    materials: [material]
)
planeEntity.components.set(OpacityComponent(opacity: 0.0))
```

But sometimes a border appears around my Entity on the plane. I don't know why this happens, and I want to remove the border.
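A hedged alternative worth trying: RealityKit on visionOS offers GroundingShadowComponent, which lets an entity cast a grounding shadow onto real-world surfaces directly and may remove the need for the transparent plane (and its border artifact). The function and myEntity are placeholders for the poster's setup.

```swift
import RealityKit

// Hedged sketch: enable RealityKit's built-in grounding shadow on the entity
// instead of receiving shadows on a transparent detected plane.
func enableGroundingShadow(on myEntity: Entity) {
    myEntity.components.set(GroundingShadowComponent(castsShadow: true))
}
```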
5 replies · 0 boosts · 411 views · Mar ’25
CameraFrameProvider distortion correction
Hi, I'm trying to correct the lens distortion in frames provided by the Enterprise API's camera frame provider. The frames seem to carry only intrinsics/extrinsics info, not a distortion lookup table. Is there some setting or function to correct the distortion (I can't seem to find anything like this)? Or is there a way to use AVCameraCalibrationData together with the provider?
2 replies · 0 boosts · 407 views · Mar ’25
CompositorServices or RealityKit
I have been concentrating on developing visionOS applications. While I am quite familiar with RealityKit, CompositorServices has also captured my attention, though I have not yet learned it. Could you please clarify whether it is essential for me to learn CompositorServices? I would also appreciate insights into the respective advantages of RealityKit and CompositorServices.
2 replies · 0 boosts · 768 views · Mar ’25
Background Assets in visionOS
Hi, I'm working on a visionOS app and would like to integrate Background Assets to download large files after the app is installed. I'm wondering what happens if the user takes off the headset while a background asset is being downloaded: would it continue downloading, or would the download be stopped or paused? I was looking for a way to download large assets while the user is not wearing the Vision Pro. Is there any other alternative? Thanks in advance.
1 reply · 0 boosts · 126 views · Jun ’25
Keyboard Tracking
Hi! I'm currently experimenting on Apple Vision Pro with hand and head anchors. Is there a way to get an anchor linked to the Apple Magic Keyboard (since detection is already done to display inputs above the keyboard)? Thanks in advance, have a good day!
1 reply · 0 boosts · 646 views · Jul ’25
Creating spatial video with one camera
Hello everyone, I would like to create my own spatial video on my Apple Vision Pro. According to all the documentation from Apple, this requires two camera angles to produce the spatial perception. I have purchased the Enterprise license with main camera access for this purpose. However, this only gives me access to the left main camera of the headset. Is there a way to access the right camera as well? Or is the single camera image enough to create a spatial video, for example by splitting the image? I am open to any help and ideas. My goal is to create the video with the cameras on the headset, not externally.
1 reply · 0 boosts · 295 views · Jun ’25
How to trigger other effects using hoverEffect?
I'm facing an issue while using CustomHoverEffect. In my view there is a long title, which gets truncated; when the user hovers over it, the title should scroll. I have already implemented the scrolling effect, but I am unsure how to trigger the scroll on hover. How should I approach this?
2 replies · 0 boosts · 374 views · Feb ’25
How to play blend shape (morph) animations exported from Blender in Vision Pro apps using RealityKit
I am exporting a .usdc file from Blender that already has some morph animations. The animations play well in Blender, but after export I cannot play them in RealityKit or Reality Composer Pro: Entity.availableAnimations is an empty array, and none of the child objects in the entity hierarchy has an animation library component. Maybe I am exporting it wrong; I've tried multiple combinations of export settings, but none seem to work. Here are my export settings in Blender: [screenshot in original post]. The original file I purchased is an FBX file that has the animation, but when I bring it into Reality Converter directly, the animations don't seem to play either.
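A small diagnostic sketch, not from the post: walk the entity hierarchy and report any animations RealityKit actually imported, then play the first one found. If nothing turns up, the morph animation most likely was never written into the .usdc at export time (shape-key animation support varies by exporter version) rather than failing at playback.

```swift
import RealityKit

// Hedged diagnostic: search the whole hierarchy for imported animations.
// If this prints nothing, the exporter probably didn't write the morph
// animation into the USD at all.
func playFirstAnimation(in root: Entity) {
    var stack: [Entity] = [root]
    while let entity = stack.popLast() {
        if !entity.availableAnimations.isEmpty {
            print("\(entity.name): \(entity.availableAnimations.count) animation(s)")
            entity.playAnimation(entity.availableAnimations[0].repeat())
            return
        }
        stack.append(contentsOf: entity.children)
    }
    print("No animations found anywhere in the hierarchy.")
}
```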
2 replies · 0 boosts · 177 views · Jun ’25