Build, test, and submit your app using Xcode, Apple's integrated development environment.

Xcode Documentation

Posts under Xcode subtopic

Post · Replies · Boosts · Views · Activity

A Summary of the WWDC25 Group Lab - Developer Tools
At WWDC25 we launched a new type of Lab event for the developer community: Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers, and a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for Developer Tools.

Will my project codebase be used for training when I use Xcode's intelligent assistant powered by cloud-based models?

When using ChatGPT without logging in, your data will not be used to improve any models. If you log in to a ChatGPT account, this is governed by your ChatGPT account settings, which allow you to opt out (it defaults to on). When using Xcode with accounts for other model providers, check the policies of your provider. And finally, at no point will any portion of your codebase be used to train or improve any Apple models.

We'd love to make our SwiftUI Previews (and soon, Playgrounds) as snappy as possible. Is there any way to skip certain build steps, such as running linters? It seems the build environment is exactly the same as a debug build, but maybe there's a trick.

Starting with Xcode 16, SwiftUI previews use the exact same build artifacts as the regular build. The new Playgrounds support in Xcode 26 uses these build artifacts too. Shell script build phases are the most common source of extra build time, so as a first step, try turning off all shell script build phases (like linters) to see whether that's the issue. If those build phases add significant time to your build, consider moving some of them into asynchronous steps, such as running linters before committing instead of on every build. If you do need a shell script build phase to run during your build, make sure to explicitly define its input and output files, as that is one of the biggest levers for improving build performance.

Are we able to provide additional context for the models, like coding standards, documentation for third-party dependencies, or documentation on your own codebase that explains things like architecture?

In general, Xcode will automatically search for the right context based on the question and the evolving answer, as the model can interact with your project multiple times while it develops an answer. This automatically picks up the coding style of the code it sees, and can include files that contain architecture comments, etc. Beyond automatic context, you can manually attach other documents, even if they aren't in your project. For example, you could make a file with rules and ideas and attach it, and it will influence the response. We are very aware of other kinds of automatic context like rule files, though Xcode does not support these at this time.

Once ChatGPT is enabled for Coding Intelligence in Xcode 26 and I sign in to my existing ChatGPT account, will the ChatGPT Coding Intelligence model in Xcode know about chat conversations about Xcode development I had previously in the ChatGPT Mac app?

Xcode does not use information from other conversations, and conversations started in Xcode are not accessible in the web UI or the ChatGPT app.

Is there a plan to make SwiftUI views easier to locate and understand in the view hierarchy, like UIKit views?

SwiftUI uses a declarative paradigm to define your user interface. That allows you to specify what you want, with the system translating that into an efficient representation at runtime. Unlike traditional AppKit and UIKit, seeing the runtime representation of SwiftUI views isn't sufficient to understand why a view isn't doing what you want. This year, we introduced a SwiftUI instrument that shows why things are happening, like view re-rendering.

Is it possible to use the AI chat with ChatGPT Enterprise? My company doesn't allow us to use the general ChatGPT, only the enterprise version they have set up that prevents data from being leaked.

Yes, Xcode 26 supports logging in to any existing ChatGPT account, including enterprise accounts. If that does not meet your needs, you can also set up a local server that implements the popular chat completions REST API to talk to your enterprise account however you need.

Now that Icon Composer is here, how does it complement or replace existing vector design tools such as Sketch for icon design?

Icon Composer complements your existing vector design tools. You should continue to create your shapes, gradients, and layers in another tool like Sketch, and compose the exported SVG layers in Icon Composer. Once you bring your layers into Icon Composer, you can then use it to influence the translucency, blur, and specular highlights for your icon.

What's one feature or improvement in the new Xcode that you personally think developers will love, but might not immediately discover? Maybe something tucked away or quietly powerful that's flown under the radar so far?

One feature we're particularly excited about is the new power profiler for iOS, which gives you further insight into the energy consumption of your app beyond what was possible with the Energy instrument previously. You can learn more about how to use this instrument, and how it can help you greatly reduce your app's battery usage, in the documentation as well as in the session "Profile and optimize power usage in your app". There were also improvements in accessibility this year with Voice Control, where you can naturally speak your Swift code to Xcode and it understands the Swift syntax as you speak. To see it in action, take a look at the demonstration in "What's new in Xcode 26".

We have a software advisory council that is very sensitive to having our private information go to the cloud in any form. What information do you have to help me guide Xcode and Apple Intelligence through the acceptance process?

One thing you can do is configure a proxy for your enterprise that implements the popular Chat Completions API endpoint protocol. When using a model provider via URL, you can point Xcode at your proxy endpoint, inspect the network traffic for anything that you do not want sent outside of your enterprise, and then forward the traffic through the proxy to your chosen model provider.

Is there a list of recommended LLMs to use with Xcode via Intelligence/Local? I've tried Gemma3-12B, but… I hope there are better options?

Apple doesn't have a published list of recommended local models. This is a fast-moving space, and a recommendation would become out of date very quickly as new models are released. We encourage you to try out the local model support in Xcode 26 with models that you find meet your needs, and let us and the community know!

(continued below)
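Editor's note: two answers above mention "the popular chat completions REST API", i.e. the OpenAI-style endpoint that many providers and proxies imitate. As a rough, hedged illustration only (the field names below follow that common convention and are not an Apple specification), the request and response shapes a local server or enterprise proxy would need to handle look roughly like this:

```swift
import Foundation

// A minimal sketch of the OpenAI-style chat completions schema that a local
// server or enterprise proxy (POST /v1/chat/completions) typically accepts.
// Field names follow the common convention; this is not an Apple-defined API.
struct ChatMessage: Codable {
    let role: String       // "system", "user", or "assistant"
    let content: String
}

struct ChatCompletionRequest: Codable {
    let model: String      // the model name your proxy forwards to
    let messages: [ChatMessage]
}

struct ChatCompletionResponse: Codable {
    struct Choice: Codable {
        let message: ChatMessage
    }
    let choices: [Choice]  // the assistant's reply is choices[0].message.content
}
```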
1 reply · 0 boosts · 857 views · Jul ’25
Swift 6 concurrency error of passing sending closure
I am getting this error in a couple of places in my code with Task closures after setting Swift 6 as the language version in Xcode:

Passing closure as a 'sending' parameter risks causing data races between code in the current task and concurrent execution of the closure

Below is a minimal reproducible sample:

import Foundation

final class Recorder {
    var writer = Writer()

    func startRecording() {
        Task {
            await writer.startRecording()
            print("started recording")
        }
    }

    func stopRecording() {
        Task {
            await writer.stopRecording()
            print("stopped recording")
        }
    }
}

actor Writer {
    var isRecording = false

    func startRecording() { isRecording = true }
    func stopRecording() { isRecording = false }
}

While making the Recorder class an actor would fix the problem, I fear I would have to turn too many classes into actors, and in my scenario there could be performance implications, since real-time audio and video frames are being processed. Further, I don't see any race condition in the code above. Does the error refer to possible race conditions that could arise if I were to call other functions on self in the future, or is the current code itself unsafe?
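Editor's note: one common way this particular diagnostic is resolved in code shaped like the sample (offered only as a sketch of the capture-list technique, not as a definitive answer) is to keep the non-Sendable Recorder out of the Task closure and capture only the actor, which is Sendable:

```swift
final class Recorder {
    var writer = Writer()

    func startRecording() {
        // Capturing `writer` explicitly means the closure no longer captures
        // non-Sendable `self`; every capture is now a Sendable actor
        // reference, so the closure can be passed as a `sending` parameter.
        Task { [writer] in
            await writer.startRecording()
            print("started recording")
        }
    }
}
```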
0 replies · 2 boosts · 1.2k views · Jan ’25
Fetch Changes not working Xcode 16.2
I have cloned a private repo from GitHub and for the most part it seems to be working fine, e.g. commit, push, and pull all work. However, Fetch Changes doesn't seem to work at all. It pops up a brief message as if it is working, but the status of my local branch under Source Control Navigator -> Repositories does not get updated. Even after a push, the status still tells me I am ahead on the local branch. If I add a new file on the GitHub website, Fetch Changes will not see it and the repository status remains unchanged. All of these changes are immediately visible in Fork (a third-party Git GUI). If I restart Xcode the status is updated correctly, but it does not keep up with further changes. Is there anything I can do other than avoid using Xcode for GitHub tasks?

Update: This might just be a problem with the Source Control Navigator views not updating. I just found that if I change branch, the Repositories section will not reflect the branch change until I exit and re-enter that view, e.g. click on "Changes" and back into "Repositories" to see the update. That workaround does not affect the out-of-date repo status, though (how many commits I am ahead or behind). Seems very buggy...
0 replies · 2 boosts · 334 views · Jan ’25
devicectl copy directory recursively
I can use devicectl to copy single files from a device fine using:

xcrun devicectl --verbose device copy from --user mobile --domain-identifier <BUNDLE_ID> --domain-type appDataContainer --device <DEVICE_ID> --source Documents/Screenshots/0001.png --destination Screeshots/0001.png

but when there are many files this can get pretty slow. Is there a way of recursively copying an entire directory? I've tried this:

xcrun devicectl --verbose device copy from --user mobile --domain-identifier <BUNDLE_ID> --domain-type appDataContainer --device <DEVICE_ID> --source Documents/Screenshots --destination Screeshots

but nothing is transferred and it times out eventually with the error:

ERROR: The specified file could not be transferred. (com.apple.dt.CoreDeviceError error 7000 (0x1B58))
ERROR: The operation couldn’t be completed. The file service client failed to read data from the network socket because we timed out when waiting for data to become available. (NSPOSIXErrorDomain error 60 (0x3C))
NSLocalizedFailureReason = The file service client failed to read data from the network socket because we timed out when waiting for data to become available.

Xcode itself can do it with the "Download Container..." option, but I'd like to be able to automate it.
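Editor's note: until a recursive copy option turns up, one possible workaround is to script the single-file form that already works. A minimal sketch, assuming you already know the list of remote file paths you want; the bundle ID, device ID, and file list below are hypothetical placeholders:

```swift
import Foundation

// Hypothetical identifiers; substitute your own values.
let bundleID = "com.example.myapp"
let deviceID = "00008110-0000000000000000"

// Assumed to be known in advance (e.g. exported or listed separately).
let files = ["Documents/Screenshots/0001.png", "Documents/Screenshots/0002.png"]

for path in files {
    let destination = (path as NSString).lastPathComponent
    let process = Process()
    process.executableURL = URL(fileURLWithPath: "/usr/bin/xcrun")
    // Reuses the exact single-file invocation from the post, once per file.
    process.arguments = [
        "devicectl", "device", "copy", "from",
        "--user", "mobile",
        "--domain-identifier", bundleID,
        "--domain-type", "appDataContainer",
        "--device", deviceID,
        "--source", path,
        "--destination", destination
    ]
    do {
        try process.run()
        process.waitUntilExit()
    } catch {
        print("copy failed for \(path): \(error)")
    }
}
```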
0 replies · 1 boost · 218 views · Mar ’25
XCFrameworks deploying .a / include to output target directory
Hello Friends,

We have a strange bug where Xcode is copying the static binaries from within an .xcframework into CONFIGURATION_BUILD_DIR. An example of our xcframework's structure is:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>AvailableLibraries</key>
    <array>
        <dict>
            <key>BinaryPath</key>
            <string>libglfw3.a</string>
            <key>HeadersPath</key>
            <string>Headers</string>
            <key>LibraryIdentifier</key>
            <string>macos-arm64_x86_64</string>
            <key>LibraryPath</key>
            <string>libglfw3.a</string>
            <key>SupportedArchitectures</key>
            <array>
                <string>arm64</string>
                <string>x86_64</string>
            </array>
            <key>SupportedPlatform</key>
            <string>macos</string>
        </dict>
    </array>
    <key>CFBundlePackageType</key>
    <string>XFWK</string>
    <key>XCFrameworkFormatVersion</key>
    <string>1.0</string>
</dict>
</plist>

This is for the open source creative coding toolkit https://github.com/openframeworks/openFrameworks. Can you see any issues in the above? One idea is to generate an xcarchive instead of shipping the .a in the xcframework, however that does not explain the include directory being copied there as well, so I think this might be an Xcode issue.
0 replies · 0 boosts · 171 views · Mar ’25
Xcode 16.2 forgets my Git credentials constantly
Like the title says, I'll commit some changes no problem with my Git name and email displayed properly, then work for a while longer, and when I use the Integrate menu to stage and commit I find the Git name/email are empty. My credentials are properly entered in Settings. The only fix I've found is quitting and restarting. Any less frustrating option?
0 replies · 0 boosts · 173 views · Feb ’25
How to Export OBJ with Texture (JPG + MTL) from ARKit LiDAR Scan in iOS?
I am using ARKit with RealityKit to scan objects using LiDAR on iOS. I can generate an OBJ file from ARMeshAnchors, but I am missing the texture export (JPG + MTL).

What I have so far:
- Successfully capturing the mesh using ARMeshAnchor.
- Converting the mesh into an MDLAsset and exporting .obj.
- I need help generating the .jpg texture and linking it to the .mtl file.

private func exportScannedObject() {
    guard let camera = arView.session.currentFrame?.camera else { return }

    func convertToAsset(meshAnchors: [ARMeshAnchor]) -> MDLAsset? {
        guard let device = MTLCreateSystemDefaultDevice() else { return nil }

        let asset = MDLAsset()

        for anchor in meshAnchors {
            let mdlMesh = anchor.geometry.toMDLMesh(device: device, camera: camera, modelMatrix: anchor.transform)

            // Apply a gray material to the mesh
            let material = MDLMaterial(name: "GrayMaterial", scatteringFunction: MDLScatteringFunction())
            material.setProperty(MDLMaterialProperty(name: "baseColor", semantic: .baseColor, float3: SIMD3(0.5, 0.5, 0.5))) // Gray color

            if let submeshes = mdlMesh.submeshes as? [MDLSubmesh] {
                for submesh in submeshes {
                    submesh.material = material
                }
            }

            asset.add(mdlMesh)
        }

        return asset
    }

    func export(asset: MDLAsset) throws -> URL {
        let directory = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
        let url = directory.appendingPathComponent("scaned.obj")

        if MDLAsset.canExportFileExtension("obj") {
            do {
                try asset.export(to: url)
                return url
            } catch let error {
                fatalError(error.localizedDescription)
            }
        } else {
            fatalError("Can't export USD")
        }
    }

    if let meshAnchors = arView.session.currentFrame?.anchors.compactMap({ $0 as? ARMeshAnchor }),
       let asset = convertToAsset(meshAnchors: meshAnchors) {
        do {
            let url = try export(asset: asset)
            showScanPreview(url)
        } catch {
            print("export error")
        }
    }
}

extension ARMeshGeometry {
    func vertex(at index: UInt32) -> SIMD3<Float> {
        assert(vertices.format == MTLVertexFormat.float3, "Expected three floats (twelve bytes) per vertex.")
        let vertexPointer = vertices.buffer.contents().advanced(by: vertices.offset + (vertices.stride * Int(index)))
        let vertex = vertexPointer.assumingMemoryBound(to: SIMD3<Float>.self).pointee
        return vertex
    }

    // help from StackOverflow:
    // https://stackoverflow.com/questions/61063571/arkit-3-5-how-to-export-obj-from-new-ipad-pro-with-lidar
    func toMDLMesh(device: MTLDevice, camera: ARCamera, modelMatrix: simd_float4x4) -> MDLMesh {
        func convertVertexLocalToWorld() {
            let verticesPointer = vertices.buffer.contents()

            for vertexIndex in 0..<vertices.count {
                let vertex = self.vertex(at: UInt32(vertexIndex))
                var vertexLocalTransform = matrix_identity_float4x4
                vertexLocalTransform.columns.3 = SIMD4<Float>(x: vertex.x, y: vertex.y, z: vertex.z, w: 1)
                let vertexWorldPosition = (modelMatrix * vertexLocalTransform).columns.3

                let vertexOffset = vertices.offset + vertices.stride * vertexIndex
                let componentStride = vertices.stride / 3
                verticesPointer.storeBytes(of: vertexWorldPosition.x, toByteOffset: vertexOffset, as: Float.self)
                verticesPointer.storeBytes(of: vertexWorldPosition.y, toByteOffset: vertexOffset + componentStride, as: Float.self)
                verticesPointer.storeBytes(of: vertexWorldPosition.z, toByteOffset: vertexOffset + (2 * componentStride), as: Float.self)
            }
        }
        convertVertexLocalToWorld()

        let allocator = MTKMeshBufferAllocator(device: device)
        let data = Data(bytes: vertices.buffer.contents(), count: vertices.stride * vertices.count)
        let vertexBuffer = allocator.newBuffer(with: data, type: .vertex)
        let indexData = Data(bytes: faces.buffer.contents(), count: faces.bytesPerIndex * faces.count * faces.indexCountPerPrimitive)
        let indexBuffer = allocator.newBuffer(with: indexData, type: .index)
        let submesh = MDLSubmesh(indexBuffer: indexBuffer,
                                 indexCount: faces.count * faces.indexCountPerPrimitive,
                                 indexType: .uInt32,
                                 geometryType: .triangles,
                                 material: nil)
        let vertexDescriptor = MDLVertexDescriptor()
        vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition, format: .float3, offset: 0, bufferIndex: 0)
        vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: vertices.stride)
        let mesh = MDLMesh(vertexBuffer: vertexBuffer,
                           vertexCount: vertices.count,
                           descriptor: vertexDescriptor,
                           submeshes: [submesh])
        return mesh
    }
}

What I need help with:
- How do I generate the JPG texture from the AR scene?
- How do I save an MTL file linking the OBJ model to the texture?
- How can I correctly apply the texture when viewing the OBJ in an external 3D viewer?

I appreciate any guidance, including sample code or resources! If you have a complete working solution, I'd love to discuss further via private channels.
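Editor's note: a partial, hedged sketch of the texture side (my own illustration, not a complete solution). Saving the current camera frame as a JPEG and writing a minimal .mtl that points at it is the easy part; what the sketch does not do is compute per-vertex UV coordinates and add mtllib/usemtl references plus vt entries to the exported OBJ, which is the genuinely hard part of texturing a scanned mesh. The helper name and file names are hypothetical:

```swift
import ARKit
import CoreImage

// Hypothetical helper: writes scan.jpg and scan.mtl next to the exported OBJ.
// Assumes the OBJ will reference the material via "mtllib scan.mtl" / "usemtl scanned".
func writeTextureAndMaterial(for frame: ARFrame, in directory: URL) throws {
    // 1. Save the captured camera image as a JPEG.
    let ciImage = CIImage(cvPixelBuffer: frame.capturedImage)
    let context = CIContext()
    guard let colorSpace = CGColorSpace(name: CGColorSpace.sRGB),
          let jpegData = context.jpegRepresentation(of: ciImage, colorSpace: colorSpace) else {
        throw NSError(domain: "TextureExport", code: 1)
    }
    try jpegData.write(to: directory.appendingPathComponent("scan.jpg"))

    // 2. Write a minimal MTL that points the material at that JPEG.
    let mtl = """
    newmtl scanned
    Ka 1.0 1.0 1.0
    Kd 1.0 1.0 1.0
    map_Kd scan.jpg
    """
    try mtl.write(to: directory.appendingPathComponent("scan.mtl"), atomically: true, encoding: .utf8)
}
```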
0 replies · 0 boosts · 455 views · Feb ’25
C++ Binary Size Increase in Xcode 16 Compared to Xcode 15
I've recently upgraded my project from Xcode 15 to Xcode 16. Without changing any build settings or compiler flags, I noticed that the final executable size has increased significantly when building the same C++ code.

Investigation: To investigate further, I compared the Link Map outputs from both versions of Xcode. One key difference I found: the symbol size for std::sort increased from 4,336 bytes (Xcode 15) to 6,084 bytes (Xcode 16), a ~40% increase. This seems to be part of a broader trend where other standard library symbols are also taking up more space in the binary when built with Xcode 16.

Questions:
Has Apple Clang or libc++ in Xcode 16 changed the implementation of STL algorithms like std::sort?
Are there changes in inlining, template instantiation, or debug info that could explain the increase?
Is this expected behavior? If so, is there any guidance on minimizing the binary size regression?
0 replies · 0 boosts · 133 views · Apr ’25
Unable to apply stashed changes
I am trying to perform an unstash operation. I right-click on the stash to restore and select "Apply Stashed Changes". It then launches a comparison window to allow me to cherry-pick the changes I want to include. However, the "Apply Changes" button along the bottom is grayed out and remains so no matter what I do: switching stashes, exiting and restarting Xcode. Thank goodness I made an emergency project backup right before I did the stash, just for a case like this. Bug? Feature?
0 replies · 1 boost · 50 views · Apr ’25
Option to download the SDK separately
Hey guys, I think forcing people to download the Simulator alongside the SDK hugely increases the download size, and it's not a good experience for developers. Make the simulator download optional; you could keep it ticked by default under Settings / Components / iOS XX.X Support for beginner developers, but advanced developers should have the option to untick it and reduce the huge download size. If the simulator isn't what makes the download so huge, then we have taken a wrong turn somewhere.
0 replies · 1 boost · 235 views · Feb ’25
Preview broken
Hello, I am doing the SwiftUI tutorial with Xcode 16.2. For the watch app, the simulator works fine but the preview keeps saying "This app cannot run on the selected target device.", even though the app itself runs… For the macOS part, the simulator works fine but the preview is totally broken with "ContentView.swift" not found in any targets, while the file is correctly added to the target… Please fix this or make the tutorial clearer.
0 replies · 0 boosts · 156 views · Feb ’25
Xcode 16.1 hangs after adding a StoreKit configuration
I don't know Xcode very well, and had been using it sporadically to test a project exported from Unity. I recently added a StoreKit configuration to the project, and since then Xcode gives me about 10 seconds after launching before the beach ball appears, and it never leaves. Xcode says it is "Indexing", but I don't know if that's an obvious culprit. I can still build my project via the fastlane tool on the command line, and the newly added StoreKit configuration works just fine. But I cannot use Xcode interactively any more, unless I re-export the project from Unity. The only difference between it working and hanging is the StoreKit configuration. How do I understand what's going on, and (ideally) fix it? Activity Monitor gives a bunch of backtraces that don't have an obvious network or I/O culprit in them.
0 replies · 0 boosts · 75 views · Apr ’25
Issue using RealmSwift in a framework versus using it in an app
If I install RealmSwift into an iOS app using SPM, I have no problems. However, if I install it into an iOS framework, then when building I get this error:

error: missing required modules: 'Realm.Private', 'Realm', 'Realm.Swift'

I have discovered that if I change the import statement from import RealmSwift to private import RealmSwift, the build error goes away (but that isn't a feasible workaround, as I would like to publicly export classes stored in Realm from the framework). There's no point in posting issues with Realm on their support board, as it's being deprecated and it's tumbleweeds on their forum. But I would be very interested in hearing an explanation or speculation from the Apple experts on:

Why does importing the same framework via SPM into a framework Xcode project result in different behavior than importing it into an app?
Why would changing import to private import make the build error go away?

TIA
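Editor's note on the second question: one relevant piece of background (speculation, not a confirmed explanation of the error) is Swift's access-level imports from SE-0409. A default/public import makes RealmSwift part of the framework's public interface, so whatever builds against the framework must also be able to resolve Realm's underlying modules, while a private import keeps Realm as an implementation detail. A minimal sketch of the distinction, using a hypothetical model type rather than the poster's code:

```swift
// Inside the framework target (sketch of SE-0409 access-level imports).

// Default / public import: RealmSwift is part of this framework's public
// interface, so Realm types may appear in public declarations, and clients
// of the framework also need to be able to resolve Realm's modules.
import RealmSwift

public class Note: Object {
    @Persisted public var title: String = ""   // public API exposes a Realm-backed type
}

// Alternative: `private import RealmSwift` keeps Realm hidden from clients
// entirely, but then no public declaration in the framework may mention a
// Realm type, which is exactly the trade-off described above.
```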
0 replies · 0 boosts · 91 views · Mar ’25