When are we going to get a real analog clock option on the lock screen? If an Apple Watch can do it, surely it's not that difficult. For someone who uses a clock to tell what time it isn't, as opposed to what time it is, I'm constantly having to convert a digital readout, when the image of hands on a dial is so much easier. Surely this isn't because someone has forgotten how to read a clock.
Explore the art and science of app design. Discuss user interface (UI) design principles, user experience (UX) best practices, and share design resources and inspiration.
Hi everyone,
I’m testing our SwiftUI app on both Xcode simulator and a real iPhone. On the simulator, everything looks clean and aligned. But when I run it on an actual iPhone (same build, iOS 18), the layout looks broken—fonts overlap, spacing is off, and elements are misaligned.
Both screenshots are from the exact same screen and time. First is simulator, second is iPhone.
Any idea why this difference happens? Is there something I should check in terms of rendering or layout settings?
Thanks in advance!
[iPhone 11]
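If it helps with debugging: one thing worth checking is whether the device and the simulator are resolving different environment values (Dynamic Type size, bold text, display scale), since those often explain layout differences between the two. A small, purely illustrative debug view for logging them (the view name is arbitrary):

import SwiftUI

// Temporary debug view: print the environment values that most often differ
// between the Simulator and a physical device.
struct LayoutDebugView: View {
    @Environment(\.dynamicTypeSize) private var dynamicTypeSize
    @Environment(\.legibilityWeight) private var legibilityWeight
    @Environment(\.displayScale) private var displayScale

    var body: some View {
        Color.clear
            .onAppear {
                print("dynamicTypeSize: \(dynamicTypeSize)")
                print("legibilityWeight: \(String(describing: legibilityWeight))")
                print("displayScale: \(displayScale)")
            }
    }
}

If the device reports a larger Dynamic Type size or bold text while the simulator uses the defaults, fixed frames and tight spacing can overlap exactly as described.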
Hi,
Does the iPad Swift Playgrounds app behave exactly the same way as Playgrounds on a MacBook?
I am developing my app on a 2020 MacBook Air M1 using Swift Playgrounds. However, since testing is going to be done in Swift Playgrounds on an iPad, I'm worried about whether my playground will work, since it relies heavily on the screen size, etc.
My app runs perfectly in Playgrounds on the MacBook, but doesn't work in the iPad simulator in Xcode.
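In case the screen-size dependence is what breaks on iPad: one pattern that tends to behave consistently across Mac Playgrounds, iPad Playgrounds, and the Xcode simulator is to read the available size with GeometryReader instead of hard-coding a screen size. A minimal sketch (the view and the 0.8 factor are just illustrative):

import SwiftUI

// Sketch: size the layout from the container the view is actually given,
// instead of assuming a particular device's screen dimensions.
struct AdaptiveView: View {
    var body: some View {
        GeometryReader { proxy in
            let side = min(proxy.size.width, proxy.size.height) * 0.8
            Circle()
                .frame(width: side, height: side)
                .frame(maxWidth: .infinity, maxHeight: .infinity)
        }
    }
}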
Hi everyone 👋
I’ve been using the Apple Pencil Pro with my iPad Pro M4 and absolutely love it — the squeeze gesture, rotation, and haptics are amazing for creative work. But I’ve run into a little roadblock…
Right now, only one Pencil Pro can be paired at a time. So while one is charging, you can’t use another as a backup without unpairing and re-pairing, which interrupts the workflow.
I’d really love to see one of two things:
The ability to use one Pencil while another charges
or
An official external charger (or support for third-party ones)
Personally, I’d happily buy both a second Pencil and a charger if this became possible. I’ve even chatted with other creatives who feel the same — it would make a huge difference for long projects or working on the go.
Just wanted to share this idea and see if anyone else here would like this too. Thanks for reading!
Hi,
It would be super if Apple created a Figma plugin to convert designs to SwiftUI. No one can do it like Apple, and it would be SUPER time saving!
--
Kind Regards
The problem is the same in all of my applications. To reproduce it in iOS 26: in Settings > Display & Brightness, enable Dark Mode, and in Accessibility > Display & Text Size, turn on Increase Contrast and Bold Text. With these settings, all controls are surrounded by a thin white line. When the app displays a keyboard, the thin white line does not appear correctly around the keyboard, as shown in the attached capture: it is present on top and partially on the bottom, but not on the sides.
I've made the code in Xcode for Apple Watch with two Swift views (ContentView.swift and InterfaceController.swift). The code for the sound and haptic feedback is in InterfaceController.swift, but the sound does not play along with the haptic feedback on the Apple Watch after building in Xcode.
The app is done, but no sound plays with the haptic feedback when the Apple Watch Digital Crown is rotated; the haptic fires when the crown is rotated, but the sound does not.
Code:
import WatchKit
import AVFoundation

class InterfaceController: WKInterfaceController {
    // ... your UI elements

    func playSelectionHapticAndSound() {
        // Play a haptic feedback pattern
        WKInterfaceDevice.current().play(.success)

        // Load and play a selection sound effect
        guard let soundURL = Bundle.main.url(forResource: "spin", withExtension: "wav") else { return }
        do {
            let player = try AVAudioPlayer(contentsOf: soundURL)
            player.play()
        } catch {
            print("Error playing sound: \(error)")
        }
    }
}
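A likely culprit, assuming the guard isn't failing: the AVAudioPlayer above is a local constant, so it can be deallocated as soon as the method returns, before the sound actually plays, while the haptic still fires. A minimal sketch of keeping a strong reference (the audioPlayer property name is just illustrative):

import WatchKit
import AVFoundation

class InterfaceController: WKInterfaceController {
    // Stored property: keeps the player alive until playback finishes.
    private var audioPlayer: AVAudioPlayer?

    func playSelectionHapticAndSound() {
        // Haptic feedback works as before
        WKInterfaceDevice.current().play(.success)

        // Load the sound and retain the player before playing it
        guard let soundURL = Bundle.main.url(forResource: "spin", withExtension: "wav") else { return }
        do {
            let player = try AVAudioPlayer(contentsOf: soundURL)
            audioPlayer = player   // without this, the player is released before it plays
            player.play()
        } catch {
            print("Error playing sound: \(error)")
        }
    }
}

If there is still no sound, it may also be worth checking whether audio is routing to paired Bluetooth headphones rather than the watch speaker.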
Hi,
I was trying to use the macOS-Sequoia-Production-Templates in Sketch format, and when I export the document template's icon as a PNG, it always includes a grey, non-transparent background that I am unable to delete. In contrast, exporting a PNG app icon from another template has a transparent background and exports fine.
Is something wrong with the document icon production template? How can I export the document icon PNG with a transparent background?
Thanks
Hello everyone, I would like to know how the review of the applications I created is progressing. How can I check on it? Also, does anyone know exactly how long Apple takes to send the verification code for my applications?
I've been playing around with an iPad Pro M5 13" as part of my goal to implement some music-related SPH particle simulation effects on it, and this involves utilizing tap events from the incredible-looking new screen the device has.
More information here; everything is implemented somewhat over-reactively, but the ideas remain (with an almost zero-cost copy fragment shader):
`https://youtu.be/ci-GSgQ0wlM`
The attached image shows the tap-effects implementation taken just a little bit further than in the video.
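For anyone curious how the tap events can feed into a simulation like this, here is a minimal SwiftUI sketch; SimulationBridge and addDisturbance(at:) are hypothetical placeholders for whatever actually forwards positions to the Metal/SPH side:

import SwiftUI

// Hypothetical bridge into the particle simulation.
final class SimulationBridge: ObservableObject {
    func addDisturbance(at point: CGPoint) {
        // Forward the tap position to the simulation (e.g. via a Metal buffer)
        print("Disturbance at \(point)")
    }
}

// Transparent layer placed above the rendering view to collect tap positions.
struct TapLayer: View {
    @StateObject private var bridge = SimulationBridge()

    var body: some View {
        Color.clear
            .contentShape(Rectangle())
            .gesture(
                SpatialTapGesture()
                    .onEnded { value in
                        // value.location is in the view's local coordinate space
                        bridge.addDisturbance(at: value.location)
                    }
            )
    }
}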
Hello All,
I used to own an app named LOLIIPOP, and I am in the process of transferring it to my new Apple account.
I am having two problems:
How do I transfer the source code and binary to my new Apple account?
My developers only have an old version of the code, so I need to send them the LAST version they uploaded to the App Store.
How do I do that as well?
Any help would be appreciated!
Thanks,
Mr. LM
I'm trying to create custom SF Symbols and am getting this error message when I validate the template. It doesn't matter whether I use Export Template or Export Symbol. It also doesn't matter whether I make any changes or not: as long as the file is opened in Adobe Illustrator or Inkscape and then saved, I get this error message when validating.
I accidentally updated my iPhone to iOS 18 and I dislike it. The emoji display is too large and the photo gallery is kind of messy. I hope Apple can fix this by bringing back the old emoji display and gallery settings.
I have been battling the new Icon Composer app for two days trying to build an app icon, but I cannot get it to import any files. I have used the Apple-provided App Icon Template and exported my layers to .svg, yet when the file picker opens, everything is disabled. I can't find help for this anywhere. I am on Sequoia on my Mac and not sure how to design this app icon without access to the composer.
I am creating a note-generation app, but I don't see the text in the button when I apply the MeshGradient. If I remove the MeshGradient, the text is there. I need some help finding the issue. I am new to app development and still learning. Below is the code:
import SwiftUI

struct ContentView: View {
    @State private var inputText: String = ""
    @State private var isLoading: Bool = false

    var body: some View {
        ZStack {
            Color(.systemGray6).edgesIgnoringSafeArea(.all)

            VStack(spacing: 20) {
                Text("Generate Notes")
                    .font(.title)
                    .fontWeight(.bold)
                    .frame(maxWidth: .infinity, alignment: .leading)

                Text("Transform your thoughts into well-structured notes using artificial intelligence.")
                    .font(.subheadline)
                    .foregroundStyle(.secondary)
                    .frame(maxWidth: .infinity, alignment: .leading)

                TextEditor(text: $inputText)
                    .frame(height: 200)
                    .padding()
                    .background(RoundedRectangle(cornerRadius: 16)
                        .fill(Color(.systemGray6)))

                Button(action: {}) {
                    HStack {
                        if isLoading {
                            ProgressView()
                                .tint(.white)
                        } else {
                            Image(systemName: "sparkles")
                        }
                        Text(isLoading ? "Generating..." : "Generate Notes")
                    }
                    .padding()
                    .frame(maxWidth: .infinity)
                    .background(
                        MeshGradient(width: 3, height: 3, points: [
                            .init(0, 0), .init(0.5, 0), .init(1, 0),
                            .init(0, 0.5), .init(0.5, 0.5), .init(1, 0.5),
                            .init(0, 1), .init(0.5, 1), .init(1, 1)
                        ], colors: [
                            .blue, .purple, .indigo,
                            .orange, .white, .blue,
                            .yellow, .green, .mint
                        ])
                    )
                    .mask(
                        RoundedRectangle(cornerRadius: 16)
                            .stroke(lineWidth: 16)
                            .blur(radius: 8)
                    )
                    .overlay(
                        RoundedRectangle(cornerRadius: 16)
                            .stroke(.white, lineWidth: 1)
                            .blur(radius: 1)
                            .blendMode(.overlay)
                    )
                    .background(.black)
                    .foregroundColor(.white)
                    .cornerRadius(16)
                    .background(
                        RoundedRectangle(cornerRadius: 16)
                            .stroke(.black.opacity(0.5), lineWidth: 1)
                    )
                    .shadow(color: .black.opacity(0.15), radius: 20, x: 0, y: 20)
                    .shadow(color: .black.opacity(0.1), radius: 15, x: 0, y: 15)
                }
                .disabled(isLoading || inputText.isEmpty)

                Spacer()
            }
            .padding(32)
            .background(Color(.systemBackground))
            .cornerRadius(44)
            .shadow(color: .black.opacity(0.1), radius: 20, x: 0, y: 10)
        }
    }
}

#Preview {
    ContentView()
}
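In case it helps others hitting the same thing: a likely cause is that .mask is applied to the whole label (the HStack plus its gradient background), so the text itself gets masked down to the blurred border ring and disappears. A minimal sketch of one possible fix, assuming that diagnosis is right, is to mask only the MeshGradient inside the background (MeshGradient requires iOS 18):

import SwiftUI

// Sketch of the same button with the mask moved onto the gradient itself.
struct GlowBorderButton: View {
    var body: some View {
        Button(action: {}) {
            Text("Generate Notes")
                .padding()
                .frame(maxWidth: .infinity)
                .background {
                    MeshGradient(width: 3, height: 3, points: [
                        .init(0, 0), .init(0.5, 0), .init(1, 0),
                        .init(0, 0.5), .init(0.5, 0.5), .init(1, 0.5),
                        .init(0, 1), .init(0.5, 1), .init(1, 1)
                    ], colors: [
                        .blue, .purple, .indigo,
                        .orange, .white, .blue,
                        .yellow, .green, .mint
                    ])
                    // The mask now clips only the gradient into the blurred
                    // border ring; the Text above is no longer masked away.
                    .mask {
                        RoundedRectangle(cornerRadius: 16)
                            .stroke(lineWidth: 16)
                            .blur(radius: 8)
                    }
                }
                .background(.black)
                .foregroundColor(.white)
                .cornerRadius(16)
        }
    }
}

The overlay, shadows, and outer stroke from the original code can stay as they are; only the placement of the mask changes.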
When I create a tab group for the sidebar on iPad, the title and disclosure triangle act like a single control. Every time I tap the section title, the disclosure triangle for that section activates and hides or exposes that section's children and actions.
I want the section title to behave like Photos, where tapping a section title just displays its view controller, and the disclosure triangle is a separate control that must be tapped to hide and show children and actions.
I did not see any delegate methods that would let me control this behavior. Is this supported?
🔍 Context
The built-in screenshot editor in iOS and iPadOS (Markup tool) only allows users to crop images using rectangular frames. While this is sufficient for basic editing, it lacks flexibility for those who wish to tailor the screenshot to the aesthetics of iOS itself — which relies heavily on rounded shapes and smooth UI elements.
⸻
🚫 Current Limitation
• After taking a screenshot and opening it in Markup, users can only crop in rectangular or square formats.
• No option is available to apply rounded corners to the crop.
• As a result, many users are forced to use third-party apps just to achieve a basic rounded-edge crop, which feels unnecessary for such a common need.
⸻
✅ Proposed Solution
Add a rounded corner cropping feature to the screenshot editor.
This could be implemented as:
• A toggle to activate “Rounded Crop”.
• A radius slider (or predefined corner radius presets).
• Optional: an export option to save the result with transparent background, useful for designers and mockups.
⸻
🎯 Why it matters
• Aligns better with the iOS design language (cards, notifications, widgets, etc.).
• Saves time for users who currently have to rely on external editing apps.
• Greatly improves the presentation of screenshots for social media, UI/UX mockups, blogs, and professional use cases.
• Useful across many professions: developers, designers, content creators, educators, marketers.
⸻
📷 Visual Example
Here’s a mockup to illustrate the proposed feature:
(Add your image here)
⸻
💡 Bonus Suggestion
Allow exporting with a transparent background when cropping screenshots — especially useful for rounded crops or mockups placed on colored backgrounds.
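As a rough illustration of what such a feature could do under the hood (not a claim about how Apple would implement it), here is a minimal Swift sketch that clips a UIImage to rounded corners while keeping a transparent background; the function name and the radius value are arbitrary:

import UIKit

/// Returns a copy of the image clipped to rounded corners, preserving transparency.
func roundedCrop(of image: UIImage, cornerRadius: CGFloat) -> UIImage {
    let format = UIGraphicsImageRendererFormat.default()
    format.opaque = false                 // keep an alpha channel in the output
    format.scale = image.scale

    let renderer = UIGraphicsImageRenderer(size: image.size, format: format)
    return renderer.image { _ in
        let rect = CGRect(origin: .zero, size: image.size)
        UIBezierPath(roundedRect: rect, cornerRadius: cornerRadius).addClip()
        image.draw(in: rect)
    }
}

// Usage: export as PNG so the transparent corners survive.
// let rounded = roundedCrop(of: screenshot, cornerRadius: 40)
// let pngData = rounded.pngData()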
I am developing an app that requires calling the iPhone's Face ID module to scan users' facial data. Where can I find Apple's design resources and guidelines for Face ID? The Face ID resources available in Figma are incomplete, and I need more support.
For example, the scenario in iPhone Settings: the UI for scanning the user's face to collect data, specifically the circular design on the "How to Set Up Face ID" screen.
In the video ”Create Icons with Icon Composer”, the presenter mentions that Apple has created a layer-to-SVG script for Illustrator that‘s available for download:
Once the artwork is in a good place, next we want to export the layers as SVGs. For every tool, this can look a bit different. For those using Illustrator, we've created a layer to SVG script that will automate this for you, which you can download. Exporting out the canvas size ensures everything drops right into position in Icon Composer.
Here‘s the link to the mention:
https://developer.apple.com/videos/play/wwdc2025/361/?time=377
I can’t find any place to get this script, and my designer is very interested in using it to import our Illustrator icon into Icon Composer.
Can someone point me to it?