Hi,
Since iOS 26 introduced the new Games app, I’ve noticed a problem when using a Nintendo Switch Pro Controller in wired USB-C mode, and also with third-party controllers that emulate it (like the GameSir X5 Lite).
In the Games app interface, only the L/R buttons respond, but the D-Pad and analog sticks don’t work at all. Once inside actual games, the controller works fine — the issue only affects the Games app UI.
What I’ve tested so far:
Xbox / PlayStation controllers → work fine in both wired and Bluetooth, including inside the Games app.
Switch Pro Controller (Bluetooth) → works fine, including in the Games app.
Switch Pro Controller (wired) → same issue as the X5 Lite, D-Pad and sticks don’t work in the Games app.
This makes it hard to use the new Games app launcher with these controllers, even though they work perfectly once a game is launched.
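In case anyone wants to check whether the inputs are visible at the GameController framework level (which would point at the Games app UI rather than the controller profile), here's a quick diagnostic sketch, nothing more:

import GameController

// Diagnostic sketch: log D-Pad and left-stick values for any controller that connects.
func startControllerLogging() {
    NotificationCenter.default.addObserver(
        forName: .GCControllerDidConnect, object: nil, queue: .main
    ) { note in
        guard let controller = note.object as? GCController,
              let gamepad = controller.extendedGamepad else { return }
        print("Connected: \(controller.vendorName ?? "Unknown")")
        gamepad.dpad.valueChangedHandler = { _, x, y in
            print("D-Pad x=\(x) y=\(y)")
        }
        gamepad.leftThumbstick.valueChangedHandler = { _, x, y in
            print("Left stick x=\(x) y=\(y)")
        }
    }
}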
My question: is this an iOS bug (Apple needs to add proper support for wired Switch Pro controllers in the Games app), or something that Nintendo / GameSir would need to address?
Thanks in advance to anyone who can confirm this or provide more info.
Hi, following the recent deprecation of SceneKit, I'm trying to move a couple of my SceneKit projects to RealityKit.
One thing I can't seem to find is how to change the content scale factor when using a RealityView in SwiftUI. It was really easy to do in SceneKit with just an SCNView property, and it seems to also be possible when using ARView, but I can't find a way to do it with a RealityView. Maybe it's a SwiftUI limitation?
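In the meantime, the only workaround I can think of is dropping down to ARView through a UIViewRepresentable, where contentScaleFactor is available as a plain UIView property. A sketch (iOS-only, non-AR):

import SwiftUI
import RealityKit

// Sketch: fall back to ARView via UIViewRepresentable to control the scale factor.
struct ScaledRealityView: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let view = ARView(frame: .zero, cameraMode: .nonAR, automaticallyConfigureSession: false)
        view.contentScaleFactor = 1.0 // e.g. render at 1x instead of the screen's native scale
        return view
    }
    func updateUIView(_ uiView: ARView, context: Context) {}
}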
Topic: Graphics & Games
SubTopic: RealityKit
Breaking Through PolySpatial's ~8k Object Limit – Seeking Alternative Approaches for Large-Scale Digital Twins
Confirmed: PolySpatial's Mirroring Doubles the MeshFilter Count – Hard Limit at ~8k Active Objects (15.9k Total)
Project Context & Research Goals
I’m developing an industrial digital twin application for Apple Vision Pro using Unity’s PolySpatial framework (RealityKit rendering in Unbounded_Volume mode). The scene contains complex factory environments with:
Production line equipment (many fragmented grid objects that need to be merged)
Dynamic product racks (state-switchable assets)
Animated worker avatars
To optimize performance, I’m systematically testing visionOS’s rendering capacity limits. Through controlled stress tests, I’ve identified a critical threshold:
Key Finding
When the total MeshFilter count reaches 15,970 (system baseline + 7,985 user-created objects × 2 due to PolySpatial cloning), the application crashes consistently. This suggests:
PolySpatial’s mirroring mechanism effectively doubles GameObject overhead
An apparent hard limit exists around ~8k active mesh objects in practice
Objectives for This Discussion
Verify if others have encountered similar limits with PolySpatial/RealityKit
Understand whether this is a:
Memory constraint (per-app allocation)
Render pipeline limit (Metal draw calls)
Unity-specific PolySpatial behavior
Explore optimization strategies beyond brute-force object reduction
Why This Matters
Industrial metaverse applications require rendering thousands of interactive objects. Confirming these limits will help our team:
Design safer content guidelines
Prioritize GPU instancing/LOD investments
Potentially contribute back to PolySpatial’s optimization
I’d appreciate insights from engineers who’ve:
Pushed similar large-scale scenes in visionOS
Worked around PolySpatial’s cloning overhead
Discovered alternative capacity limits (vertices/draw calls)
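To make the discussion concrete, here is the kind of resource sharing I would expect to help on the RealityKit side. This is a sketch in plain RealityKit, not PolySpatial; whether Unity's layer preserves this kind of sharing is exactly what I'm unsure about:

import RealityKit

// Sketch: share one MeshResource and material across many entities instead of
// creating per-object meshes, which is the usual first step before true instancing.
func makeSharedBoxes(count: Int) -> Entity {
    let root = Entity()
    let mesh = MeshResource.generateBox(size: 0.05)              // created once
    let material = SimpleMaterial(color: .gray, isMetallic: false)
    for i in 0..<count {
        let entity = ModelEntity(mesh: mesh, materials: [material]) // reuses the same resources
        entity.position = [Float(i % 100) * 0.06, 0, Float(i / 100) * 0.06]
        root.addChild(entity)
    }
    return root
}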
In the CanyonCrosser example project, some RealityKit systems are implemented as classes while others are structs. What’s the reason for using different types?
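For context, both forms compile; here's a minimal sketch (my own, not from CanyonCrosser). My guess is that the class form is needed when the system keeps mutable state, since update(context:) is nonmutating for struct systems, but I'd like confirmation:

import RealityKit

// A value-type system: fine when update() needs no stored mutable state.
struct SpinSystem: System {
    init(scene: Scene) {}
    func update(context: SceneUpdateContext) { /* ... */ }
}

// A reference-type system: useful when the system holds state that must
// persist and mutate across updates (update() is nonmutating on structs).
final class ScoreSystem: System {
    var totalScore = 0
    init(scene: Scene) {}
    func update(context: SceneUpdateContext) { totalScore += 1 }
}

Either form is registered the same way, e.g. SpinSystem.registerSystem().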
Issues building Unity plug-in project: Cannot locate native library Apple.Core/Apple.GameKit for iOS
I'm having issues getting a well-built package from the Apple Unity Plug-in project.
When building my game project in Unity, the following error is printed to the console:
[Apple.Core.AppleNativeLibraryUtility] Cannot locate a Debug or Release Apple.Core native library for iOS.
Please ensure that the build invocation (build.py, xcodebuild, or Xcode) compiled cleanly and that the build was configured to support Debug on iOS.
As far as I can tell the build did compile cleanly, but I might be missing something.
If anyone can see what I'm doing wrong or has any insight it would be greatly appreciated.
Setup is the following:
macOS Tahoe 26 Beta
Xcode-beta Version 26.0 beta 3 (17A5276g)
Unity Plug-in branch: 2025-beta1
Unity game project version: 2022.3.60f
M1 Macbook Pro
The built packages have been imported into the game project through the Unity Package Manager using the tarball option pointing to the built packages from the Unity Plug-in project.
The Unity Plug-in project has been built using the build.py file with the following:
python3 build.py -m iOS iPhoneSimulator -p Core GameKit CoreHaptics GameController -k all
The output is available in the attached file.
build-output.txt
Here's an image of the NativeLibraries~ folder inside the built Apple.Core package.
I've been playing with the new GameSave API and cannot get it to work.
I followed the 3-step instructions from the Developer video. Step 2, "Next, login to your Apple developer account and include this entitlement in the provisioning profile for your game." seems to be unnecessary, as Xcode set this for you when you do step 1 "First add the iCloud entitlement to your game."
Running the app on my device and tapping "Load" starts the sync, then fails with the error "Couldn’t communicate with a helper application." I have no idea how to troubleshoot this. Every other time I've used CloudKit it has Just Worked™.
Halp‽
Here is my example app:
import Foundation
import SwiftUI
import GameSave

@main struct GameSaveTestApp: App {
    var body: some Scene {
        WindowGroup {
            GameView()
        }
    }
}

struct GameView: View {
    @State private var loader = GameLoader()

    var body: some View {
        List {
            Button("Load") { loader.load() }
            Button("Finish sync") { Task { try? await loader.finish() } }
        }
    }
}

@Observable class GameLoader {
    var directory: GameSaveSyncedDirectory?

    func stateChanged() {
        let newState = withObservationTracking {
            directory?.state
        } onChange: {
            Task { @MainActor [weak self] in self?.stateChanged() }
        }
        print("State changed to \(newState?.description ?? "nil")")
        switch newState {
        case .error(let error):
            print("ERROR: \(error.localizedDescription)")
        default: _ = 0 // NOOP
        }
    }

    func load() {
        print("Opening gamesave directory")
        directory = GameSaveSyncedDirectory.openDirectory()
        stateChanged()
    }

    func finish() async throws {
        print("finishing syncing")
        await directory?.finishSyncing()
    }
}
Topic: Graphics & Games
SubTopic: General
I have a visionOS app that I'm adding iOS support to, and I'd like to keep using RealityView.
I know there are the following modifiers to add some navigation:
.realityViewCameraControls(.orbit)
.realityViewCameraControls(.dolly)
.realityViewCameraControls(.pan)
But how can I add more than one? For example, I would like to orbit with one finger, pan with two fingers, and dolly by pinching. Is this possible, and if so, can someone share some sample code on how to achieve that?
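Here's roughly the direction I've been exploring as a fallback: driving a PerspectiveCamera entity with my own SwiftUI gestures instead of the built-in controls. Just a sketch, and the gesture math is simplified:

import SwiftUI
import RealityKit

// Sketch: custom camera gestures as a fallback if the built-in controls can't be combined.
struct CustomCameraView: View {
    @State private var camera = PerspectiveCamera()

    var body: some View {
        RealityView { content in
            content.camera = .virtual
            camera.position = [0, 0, 2]
            content.add(camera)
            // ... add scene content here ...
        }
        .gesture(
            DragGesture().onChanged { value in
                // One-finger drag: orbit around the origin by adjusting yaw.
                let yaw = Float(value.translation.width) * 0.01
                let distance = length(camera.position)
                camera.look(at: .zero,
                            from: [sin(yaw) * distance, 0, cos(yaw) * distance],
                            relativeTo: nil)
            }
        )
        .gesture(
            MagnifyGesture().onChanged { value in
                // Pinch: dolly by scaling the camera's distance from the origin.
                let distance = 2 / Float(value.magnification)
                camera.position = normalize(camera.position) * distance
            }
        )
    }
}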
Thanks,
Guillermo
I'm updating our app to support Metal 4, but the Metal 4 types don't seem to be recognized when targeting the simulator. Is it known whether Metal 4 will be supported there in the near future, or am I setting up the app wrong?
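In the meantime I've been guarding the Metal 4 path so the simulator build still compiles. A sketch, assuming the Metal 4 types are simply unavailable in the simulator SDK:

import Metal

// Sketch: keep the Metal 4 path out of simulator builds until the types exist there.
func makeQueue(device: MTLDevice) -> MTLCommandQueue? {
    #if targetEnvironment(simulator)
    return device.makeCommandQueue() // classic Metal path only
    #else
    // Metal 4 setup would go here when building for device.
    return device.makeCommandQueue()
    #endif
}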
Hello, I am trying to capture a screen recording (output.mp4) using ScreenCaptureKit, along with the mouse positions during the recording (mouse.json). The recording and the mouse positions (tracked from mouse-movement events only) need to be perfectly synced in order to add effects in post editing.
I started off by calling await stream?.startCapture() and starting my mouse-tracking function right after:
try await captureEngine.startCapture(configuration: config, filter: filter, recordingOutput: recordingOutput)
let captureStartTime = Date()
mouseTracker?.startTracking(with: captureStartTime)
But every time I tested, there was a clear inconsistency between the recorded video and the recorded mouse positions.
All I want is to know when the recording actually starts, so that I can start the mouse capture at that same moment. I tried using the delegates for this, but I haven't been able to set them up properly.
import Foundation
import AVFAudio
import ScreenCaptureKit
import OSLog
import Combine

class CaptureEngine: NSObject, @unchecked Sendable {
    private let logger = Logger()
    private(set) var stream: SCStream?
    private var streamOutput: CaptureEngineStreamOutput?
    private var recordingOutput: SCRecordingOutput?
    private let videoSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.VideoSampleBufferQueue")
    private let audioSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.AudioSampleBufferQueue")
    private let micSampleBufferQueue = DispatchQueue(label: "com.francestudio.phia.MicSampleBufferQueue")

    func startCapture(configuration: SCStreamConfiguration, filter: SCContentFilter, recordingOutput: SCRecordingOutput) async throws {
        // Create the stream output delegate.
        let streamOutput = CaptureEngineStreamOutput()
        self.streamOutput = streamOutput
        do {
            stream = SCStream(filter: filter, configuration: configuration, delegate: streamOutput)
            try stream?.addStreamOutput(streamOutput, type: .screen, sampleHandlerQueue: videoSampleBufferQueue)
            try stream?.addStreamOutput(streamOutput, type: .audio, sampleHandlerQueue: audioSampleBufferQueue)
            try stream?.addStreamOutput(streamOutput, type: .microphone, sampleHandlerQueue: micSampleBufferQueue)
            self.recordingOutput = recordingOutput
            recordingOutput.delegate = self // This is the line that fails to compile (see the error below).
            try stream?.addRecordingOutput(recordingOutput)
            try await stream?.startCapture()
        } catch {
            logger.error("Failed to start capture: \(error.localizedDescription)")
            throw error
        }
    }

    func stopCapture() async throws {
        do {
            try await stream?.stopCapture()
        } catch {
            logger.error("Failed to stop capture: \(error.localizedDescription)")
            throw error
        }
    }

    func update(configuration: SCStreamConfiguration, filter: SCContentFilter) async {
        do {
            try await stream?.updateConfiguration(configuration)
            try await stream?.updateContentFilter(filter)
        } catch {
            logger.error("Failed to update the stream session: \(String(describing: error))")
        }
    }

    func stopRecordingOutputForStream(_ recordingOutput: SCRecordingOutput) throws {
        try self.stream?.removeRecordingOutput(recordingOutput)
    }
}

// MARK: - SCRecordingOutputDelegate
extension CaptureEngine: SCRecordingOutputDelegate {
    func recordingOutputDidStartRecording(_ recordingOutput: SCRecordingOutput) {
        let startTime = Date()
        logger.info("Recording output did start recording \(startTime)")
    }

    func recordingOutputDidFinishRecording(_ recordingOutput: SCRecordingOutput) {
        logger.info("Recording output did finish recording")
    }

    func recordingOutput(_ recordingOutput: SCRecordingOutput, didFailWithError error: any Error) {
        logger.error("Recording output failed with error: \(error.localizedDescription)")
    }
}

private class CaptureEngineStreamOutput: NSObject, SCStreamOutput, SCStreamDelegate {
    private let logger = Logger()

    override init() {
        super.init()
    }

    func stream(_ stream: SCStream, didOutputSampleBuffer sampleBuffer: CMSampleBuffer, of outputType: SCStreamOutputType) {
        guard sampleBuffer.isValid else { return }
        switch outputType {
        case .screen:
            break
        case .audio:
            break
        case .microphone:
            break
        @unknown default:
            logger.error("Encountered unknown stream output type:")
        }
    }

    func stream(_ stream: SCStream, didStopWithError error: Error) {
        logger.error("Stream stopped with error: \(error.localizedDescription)")
    }
}
I am getting the error
Value of type 'SCRecordingOutput' has no member 'delegate'
even though I am targeting macOS 15+ (macOS 26, actually) and macOS only.
What is the best way to achieve the desired result? Is there another or better way to do it?
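If it helps, the delegate appears to be supplied through the initializer rather than a settable property, so something like this should compile instead (a sketch; the output path is just an example). Starting the mouse tracker from recordingOutputDidStartRecording(_:) should then remove most of the offset, since that callback fires when the file actually starts recording:

// Sketch: SCRecordingOutput takes its delegate in the initializer.
let recordingConfiguration = SCRecordingOutputConfiguration()
recordingConfiguration.outputURL = URL(fileURLWithPath: "/tmp/output.mp4") // example path
let recordingOutput = SCRecordingOutput(configuration: recordingConfiguration, delegate: self)
try stream?.addRecordingOutput(recordingOutput)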
This started happening after updating to Xcode 14.3:
[default] CGSWindowShmemCreateWithPort failed on port 0
Hello -
Upon enabling debug mode in the GameKit configuration for my iOS app, I do not see any devices (my iPhone or simulators) in the device list within the "Manage Game Progress" debug tool; I only see my Mac. Is there any trick to this, or a way to add devices to the list?
Thank you
Just wondering if anyone knows what it will take to hit greater than 60 Hz when targeting iPhone. If I set the preferredFramesPerSecond of an MTKView to 120, it works on the iPad, but on iPhone it never goes above 60 Hz, even with a simple hello-triangle sample app... is this a limitation of targeting iPhone?
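In case it's useful: my understanding is that iPhones with ProMotion also require the CADisableMinimumFrameDurationOnPhone key set to YES in Info.plist before any view can exceed 60 Hz; preferredFramesPerSecond alone isn't enough, and non-ProMotion iPhones are capped at 60 Hz regardless. The view-side setup would stay as simple as:

import MetalKit

// Request 120 Hz; on iPhone this only takes effect on ProMotion hardware
// and with CADisableMinimumFrameDurationOnPhone = YES in Info.plist.
let view = MTKView(frame: .zero, device: MTLCreateSystemDefaultDevice())
view.preferredFramesPerSecond = 120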
Topic: Graphics & Games
SubTopic: Metal
Dear Apple, there is still no support for WebGPU via WebView. Do you have a roadmap for when you plan to support it? That would be very helpful for releasing our new game.
Topic: Graphics & Games
SubTopic: General
I can't create any breakpoints in Xcode after upgrading to macOS 15.4.
macOS: Version 15.4 (24E248)
visionOS Simulator: 2.3
Xcode: Version 16.2 (16C5032a)
My app works fine without any breakpoints, but if I create one, Xcode shows me this:
Couldn't find the Objective-C runtime library in loaded images.
Message from debugger: The LLDB RPC server has crashed. You may need to manually terminate your process. The crash log is located in ~/Library/Logs/DiagnosticReports and has a prefix 'lldb-rpc-server'. Please file a bug and attach the most recent crash log.
Hi,
I've just moved my SpriteKit-based game from UIView to SwiftUI + SpriteView and I'm getting this message:
Adding 'GCControllerView' as a subview of UIHostingController.view is not supported and may result in a broken view hierarchy. Add your view above UIHostingController.view in a common superview or insert it into your SwiftUI content in a UIViewRepresentable instead.
Here's how I'm doing this
struct ContentView: View {
    @State var alreadyStarted = false
    let initialScene = GKScene(fileNamed: "StartScene")!.rootNode as! SKScene

    var body: some View {
        ZStack {
            SpriteView(scene: initialScene, transition: .crossFade(withDuration: 1), isPaused: false, preferredFramesPerSecond: 60)
                .edgesIgnoringSafeArea(.all)
                .onAppear {
                    if !self.alreadyStarted {
                        self.alreadyStarted.toggle()
                        initialScene.scaleMode = .aspectFit
                    }
                }
            VirtualControllerView()
                .onAppear {
                    let virtualController = BTTSUtilities.shared.makeVirtualController()
                    BTTSSharedData.shared.virtualGameController = virtualController
                    BTTSSharedData.shared.virtualGameController?.connect()
                }
                .onDisappear {
                    BTTSSharedData.shared.virtualGameController?.disconnect()
                }
        }
    }
}

struct VirtualControllerView: UIViewRepresentable {
    func makeUIView(context: Context) -> UIView {
        let result = PassthroughView()
        return result
    }

    func updateUIView(_ uiView: UIView, context: Context) {
    }
}

class PassthroughView: UIView {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        for subview in subviews.reversed() {
            let convertedPoint = convert(point, to: subview)
            if let hitView = subview.hitTest(convertedPoint, with: event) {
                return hitView
            }
        }
        return nil
    }
}
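For completeness, BTTSUtilities.shared.makeVirtualController() is my own helper; simplified, it does roughly this with GCVirtualController from the GameController framework:

import GameController

// Simplified sketch of the makeVirtualController() helper.
func makeVirtualController() -> GCVirtualController {
    let configuration = GCVirtualController.Configuration()
    configuration.elements = [GCInputLeftThumbstick, GCInputButtonA, GCInputButtonB]
    return GCVirtualController(configuration: configuration)
}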
I am rewriting an unfinished SceneKit project in RealityKit (non-AR). As far as I can see, RealityKit is missing basic fog functionality?
Fog was simple and easy to implement in SceneKit (fogStartDistance / fogEndDistance / fogDensityExponent / fogColor). Are there any plans to implement something like this in RealityKit?
Are there any simple workarounds?
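For reference, the SceneKit setup I'm migrating from is just a few scene properties:

import SceneKit

// The SceneKit fog I'm trying to reproduce in RealityKit.
let scene = SCNScene()
scene.fogStartDistance = 10
scene.fogEndDistance = 50
scene.fogDensityExponent = 2   // quadratic falloff
scene.fogColor = UIColor.gray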
Topic: Graphics & Games
SubTopic: RealityKit
I am currently working on a game that involves earning achievements, which I am using the Apple Unity Plug-Ins to display. I have found that occasionally, after opening the Game Center dashboard, the last achievement earned is not displayed until the game is closed and reopened. I am using GKAccessPoint.Shared.Trigger to display the achievements screen, which occasionally seems to open a cached version of the dashboard. It seems to happen consistently when earning multiple achievements within one minute, but not always. Does anybody have any experience with something like this?
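For reference, the plug-in call maps to the native API below; I haven't verified whether the stale dashboard also reproduces when triggering it directly from Swift (sketch):

import GameKit

// Native equivalent of the plug-in's GKAccessPoint.Shared.Trigger call.
GKAccessPoint.shared.trigger(state: .achievements) {
    // Completion handler; a place to log when the trigger request finishes.
}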
Added achievements to my approved app. Added them for the next release version, which I am running in simulator. When I look at the Achievements page, I can see that there are 17 Achievements available (correct), but they all show as hidden, despite checking the "No" box in App Store Connect.
Topic: Graphics & Games
SubTopic: GameKit
Has anyone come across the issue that setting GKLocalPlayer.local.authenticateHandler breaks a RealityView's world tracking on iOS / iPadOS 18 beta 5?
I'm in the process of upgrading my app to make use of the much appreciated RealityView unification, using RealityView not only on visionOS but now also on iOS and iPadOS. In my RealityView, I enable world tracking on iOS like this:
content.camera = .worldTracking
However, device position and orientation were ignored (the camera remained static) and there was no camera pass-through. Then I discovered that the issue disappears when I remove the following line:
GKLocalPlayer.local.authenticateHandler = { viewController, error in
    // ... some more code ...
}
So I filed FB14731139 and hope that it will be resolved before the release of iOS / iPadOS 18.
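For completeness, a minimal repro looks like this (a sketch assembled from the pieces above):

import SwiftUI
import RealityKit
import GameKit

// Minimal repro sketch: world tracking stops working once the
// authenticate handler is installed.
struct TrackingView: View {
    var body: some View {
        RealityView { content in
            content.camera = .worldTracking
        }
        .onAppear {
            GKLocalPlayer.local.authenticateHandler = { viewController, error in
                // Presenting the view controller is omitted for brevity.
            }
        }
    }
}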
I use Unity 2020.3.48f1 to develop a game. To implement Apple services integration, I use the Apple Unity plug-ins (https://github.com/apple/unityplugins). With the latest version of the plug-ins, I get an error in the Unity project after importing the plug-in: it says "Not allowed platform VisionOS". When I tried an older version of the plug-ins, I got a runtime error when calling "var fetchItemsResponse = await GKLocalPlayer.Local.FetchItems();" (line 42 of my script): EXC_BAD_ACCESS (code=257, address=0x0000...). I tried different commits from the official repository and even custom branches of the Apple Unity plug-ins, like https://github.com/muZZkat/unityplugins/tree/muzzkat/fix-fetch-items, but it did not help.
Here is the whole script that tries to use the Apple Unity plug-ins:
using System.Threading.Tasks;
using UnityEngine;
using System.Collections;
using System;
using Apple.GameKit;
using UnityEngine.UI;

public class TheScript : MonoBehaviour
{
    [SerializeField]
    InputField otp;

    string Signature;
    string TeamPlayerID;
    string Salt;
    string PublicKeyUrl;
    string Timestamp;

    void Start()
    {
        StartCoroutine(Call());
    }

    private IEnumerator Call()
    {
        yield return new WaitForSeconds(5);
        Login();
    }

    public async Task Login()
    {
        otp.text += $"Logging in... ";
        if (!Apple.GameKit.GKLocalPlayer.Local.IsAuthenticated)
        {
            try
            {
                var player = await GKLocalPlayer.Authenticate();
                var localPlayer = GKLocalPlayer.Local;
                TeamPlayerID = localPlayer.TeamPlayerId;

                // Crashes here with EXC_BAD_ACCESS (code=257).
                var fetchItemsResponse = await GKLocalPlayer.Local.FetchItems();

                Signature = Convert.ToBase64String(fetchItemsResponse.GetSignature());
                PublicKeyUrl = fetchItemsResponse.PublicKeyUrl;
                otp.text += $"Team Player ID: {TeamPlayerID} ";
                otp.text += $"PublicKeyUrl: {PublicKeyUrl} ";
            }
            catch (Exception e)
            {
                otp.text += $"Error: " + e.Message;
            }
        }
        else
        {
            Debug.Log("AppleGameCenter player already logged in.");
        }
    }

    async Task SignInWithAppleGameCenterAsync(string signature, string teamPlayerId, string publicKeyURL, string salt, ulong timestamp)
    {
    }
}