Build for Previews builds all my targets, including the test targets. Is there a way to configure which targets are relevant? I do not see an option in the scheme editor, and disabling Find Implicit Dependencies has no effect either.
Explore the various UI frameworks available for building app interfaces. Discuss the use cases for different frameworks, share best practices, and get help with specific framework-related questions.
I am using SwiftUI to create an app, and I have figured out how to present a scene for my preferences window. However, I have yet to find a way to modify the "About "My App"" scene. I am not even sure how to ask the question on other forums, because I keep getting information about application menus.
I would like to find information on accessing/changing other menu entries in the menu bar (in SwiftUI), and most specifically I would like to find out how to present a custom window (or at least custom information) when the user selects "About "My App"".
I guess I don't need a solution but a pointer to documentation that will help me in my quest.
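For what it's worth, one relevant pointer: the application menu's About item can be replaced through SwiftUI's Commands API using CommandGroup(replacing: .appInfo). Below is a minimal sketch assuming the standard SwiftUI app lifecycle on macOS; the app name, view, and panel options are placeholders, and the button could just as well open a fully custom window instead of the standard about panel.

```swift
import SwiftUI
import AppKit

@main
struct MyApp: App {                      // placeholder app type
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .commands {
            // Replace the standard "About My App" menu item.
            CommandGroup(replacing: .appInfo) {
                Button("About My App") {
                    // Show the standard panel with custom content,
                    // or present your own window here instead.
                    NSApplication.shared.orderFrontStandardAboutPanel(options: [
                        .credits: NSAttributedString(string: "Custom credits text"),
                        .applicationVersion: "1.2.3"
                    ])
                }
            }
        }
    }
}

struct ContentView: View {               // placeholder content
    var body: some View { Text("Hello") }
}
```

The documentation to search for is the Commands protocol and CommandGroupPlacement.appInfo.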
I have an App that builds for iOS, iPadOS, macOS and Apple TV, which was last released to all the App Stores in April. Preferences/settings are handled by the App itself except for the Apple TV variant, where I use a Settings bundle. This worked fine until tvOS 15.0, where it appears that tvOS is not updating the value of the App’s settings from NSUserDefaults when the Settings App opens. I have been working on this problem off and on for the last week and am at wits end.
I’ve searched WWDC videos looking for a clue; there must be some simple change I cannot see. I’ve made clean projects for iOS and tvOS, and using identical Obj-C code and Settings plist entries, the iOS version works perfectly while the tvOS version fails in the simulator and on the device. I am not trying to synchronize Settings across devices, just to persist them across restarts on a single device.
My code stores data correctly in NSUserDefaults; it simply seems that the tvOS Settings app is not reading values from there for display, nor writing changes that the user makes in Settings back to user defaults. None of the types in the test projects work: TextField, Switch, Title.
The test code is so simple I hesitate to include it, but the code and the NSUserDefaults key identifiers do match. This code will preset my App’s version number for Settings to display in iOS 15 but not tvOS 15. It used to work in tvOS 14:
<dict>
    <key>DefaultValue</key>
    <string>DefaultVersionValue</string>
    <key>Type</key>
    <string>PSTitleValueSpecifier</string>
    <key>Title</key>
    <string>Version</string>
    <key>Key</key>
    <string>VersionKey</string>
</dict>
```
NSUserDefaults *ud = [NSUserDefaults standardUserDefaults];
[ud registerDefaults:@{
    @"TextFieldKey" : @"TextFieldValue",
    @"VersionKey" : @"VersionValue"
}];
[ud setObject:@"3.14" forKey:@"VersionKey"];
```
Any idea? Many thanks.
I'm trying to understand how to use .focusedSceneValue on macOS.
Given a very basic app that displays Things and has a menu for editing them. When I run the app, nothing is selected at first and the menu is disabled. When I select e.g. the Thing Alfa in the sidebar, the menu becomes enabled as expected. When I select another Thing, the menu is also updated as expected.
However, if I switch focus to another application, e.g. the Finder, and then switch back to my app, the menu is now disabled, even though a Thing is selected in the sidebar.
If I open another window within my app and select e.g. Gamma in the sidebar of that window the menu is updated as expected. But, when switching back to the first window the menu is disabled, although a Thing is selected.
What am I doing wrong? Xcode 13.1 and macOS Monterey 12.0.1.
See the code below (the code can also be found here: https://github.com/danwaltin/FocusedSceneValueTest)
struct Thing: Identifiable, Hashable {
    let id: Int
    let name: String

    static func things() -> [Thing] {
        return [
            Thing(id: 1, name: "Alfa"),
            Thing(id: 2, name: "Beta"),
            Thing(id: 3, name: "Gamma")
        ]
    }
}

@main
struct FocusedSceneValueTestApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .commands {
            ThingCommands()
        }
    }
}

struct ContentView: View {
    var body: some View {
        NavigationView {
            List(Thing.things()) { thing in
                NavigationLink(
                    destination: DetailView(thing: thing),
                    label: { Text(thing.name) }
                )
            }
            Text("Nothing selected")
        }
    }
}

struct DetailView: View {
    let thing: Thing

    var body: some View {
        Text(thing.name)
            .focusedSceneValue(\.selectedThing, thing)
            .navigationTitle(thing.name)
    }
}

struct ThingCommands: Commands {
    @FocusedValue(\.selectedThing) private var thing

    var body: some Commands {
        CommandMenu("Things") {
            Button("Edit \(thingName)") {
                print("*** Editing \(thingName)")
            }
            .disabled(thing == nil)
            .keyboardShortcut("e")
        }
    }

    private var thingName: String {
        guard let thing = thing else {
            return ""
        }
        return thing.name
    }
}

struct SelectedThingKey: FocusedValueKey {
    typealias Value = Thing
}

extension FocusedValues {
    var selectedThing: Thing? {
        get { self[SelectedThingKey.self] }
        set { self[SelectedThingKey.self] = newValue }
    }
}
First off, I believe I do know what the option 'Preserve Vector Data' does. But just in case, let me recap: when using PDFs or SVGs in my asset catalog, Xcode by default creates assets at fixed sizes from these files at build time. When I tick the 'Preserve Vector Data' box, the asset is scaled at runtime using the vector data, allowing for smooth scaling and crisp images at any scale.
But the question I'm asking myself now is: what exactly is the drawback, most likely performance-wise, of simply activating this option for each and every SVG or PDF asset I use in my project?
I would be very happy if someone could elaborate on this or direct me to some more in-depth documentation on Vector Assets :)
Hi,
I have an existing AppKit-based Mac app that I have been working on for a few years. For a new feature, I wanted to have the app opened by a different app, so I set up the URL scheme under CFBundleURLTypes in my Info.plist and adopted this delegate callback:
- (void)application:(NSApplication *)application openURLs:(nonnull NSArray<NSURL *> *)urls
Now when I invoke the URL from the 2nd app, it opens my app correctly, BUT this delegate method isn't called. What's interesting is that if I make a totally new app with a URL scheme and adopt this delegate method, it gets called without a problem!
So what about my original project could be responsible for this openURLs: method not being called? I've been searching for a solution for a couple of days without any luck. The macOS app's target has a Deployment Target of 10.15 and I'm running this on macOS 12.0 with Xcode 13.
I'm using UIDocumentPickerViewController to import documents into my app from OneDrive, and I want to show the OneDrive folder every time I use UIDocumentPickerViewController instead of the last folder I opened. Is that possible? Can I use pickerController.directoryURL? And how do I get the folder URL of OneDrive?
class ViewController: UIViewController, DocumentDelegate {
    var picker: DocumentPicker?

    override func viewDidLoad() {
        super.viewDidLoad()
        picker = DocumentPicker(presentationController: self, delegate: self)
    }

    @IBAction func create_picker(_ sender: Any) {
        picker?.presentDocumentPicker()
    }

    func didPickImage(image: UIImage?) {}
}

protocol DocumentDelegate: AnyObject {
    func didPickImage(image: UIImage?)
}

class DocumentPicker: NSObject {
    private var pickerController: UIDocumentPickerViewController?
    private weak var presentationController: UIViewController?
    private weak var delegate: DocumentDelegate?

    init(presentationController: UIViewController,
         delegate: DocumentDelegate) {
        super.init()
        self.presentationController = presentationController
        self.delegate = delegate
    }

    func presentDocumentPicker() {
        pickerController = UIDocumentPickerViewController(forOpeningContentTypes: [.image])
        if let pickerController = pickerController {
            pickerController.delegate = self
            pickerController.allowsMultipleSelection = false
            presentationController?.present(pickerController, animated: true)
        }
    }
}

extension DocumentPicker: UIDocumentPickerDelegate {
    func documentPicker(_ controller: UIDocumentPickerViewController, didPickDocumentsAt urls: [URL]) {
        guard let url = urls.first else { return }
        print(url)
    }
}
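As far as I know there is no public API that returns "the OneDrive folder" directly; directoryURL can only be pointed at a location you already have a URL for. A hedged sketch of one approach: remember the folder of the last document the user picked by storing a bookmark, and feed it back into directoryURL the next time the picker is presented. Whether the picker honors it for a third-party file provider is not guaranteed, and the key name and helper functions below are hypothetical.

```swift
import UIKit

// Hypothetical key, for illustration only.
private let lastFolderBookmarkKey = "lastPickedFolderBookmark"

/// Call from documentPicker(_:didPickDocumentsAt:) to remember the folder
/// that contained the picked document.
func rememberFolder(of pickedURL: URL) {
    let folderURL = pickedURL.deletingLastPathComponent()
    if let bookmark = try? folderURL.bookmarkData() {
        UserDefaults.standard.set(bookmark, forKey: lastFolderBookmarkKey)
    }
}

/// Resolve the remembered folder, if any.
func restoredFolderURL() -> URL? {
    guard let bookmark = UserDefaults.standard.data(forKey: lastFolderBookmarkKey) else {
        return nil
    }
    var isStale = false
    return try? URL(resolvingBookmarkData: bookmark, bookmarkDataIsStale: &isStale)
}

// Before presenting:
// pickerController.directoryURL = restoredFolderURL()
```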
Hi,
I'm currently developing a SwiftUI-based app with Core Data and CloudKit sync via NSPersistentCloudKitContainer.
I found different solutions for toggling CloudKit sync of Core Data at runtime. The basic idea of these solutions is the following:
1. Instantiate a new NSPersistentCloudKitContainer.
2. Set storeDescription.cloudKitContainerOptions = nil.
3. Load the persistent store.
Some solutions recommend restarting the app manually to avoid exactly my problem.
Issues
So far so good. But how can I distribute the new viewContext through my app at runtime? In the main App I distribute the viewContext at startup via @Environment(\.managedObjectContext), and it does not seem to be updated automatically after a reinitialization of the NSPersistentCloudKitContainer.
var body: some Scene {
    WindowGroup {
        ContentView()
            .environment(\.managedObjectContext, persistence.container.viewContext)
    }
}
After deactivating the CloudKit sync I receive the following error when I try to add a new entity.
[error] warning: Multiple NSEntityDescriptions claim the NSManagedObject subclass 'TestEntity' so +entity is unable to disambiguate.
Any ideas?
Regards
Sven
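For what it's worth, one way to make the environment pick up a replaced container is to let the view that injects it observe the persistence object, so its body (and with it the .environment modifier) is re-evaluated whenever the container is swapped. The sketch below is only an illustration of that idea: Persistence is a stand-in for your own controller, the model name is a placeholder, and it does not address the "Multiple NSEntityDescriptions" warning, which typically appears when two containers have loaded the same model.

```swift
import SwiftUI
import CoreData

// Sketch only: publishing the container means any observing view is
// re-evaluated when it is replaced.
final class Persistence: ObservableObject {
    @Published var container: NSPersistentContainer

    init(cloudSyncEnabled: Bool) {
        container = Self.makeContainer(cloudSyncEnabled: cloudSyncEnabled)
    }

    func setCloudSync(enabled: Bool) {
        container = Self.makeContainer(cloudSyncEnabled: enabled)
    }

    private static func makeContainer(cloudSyncEnabled: Bool) -> NSPersistentContainer {
        let container = NSPersistentCloudKitContainer(name: "Model") // placeholder model name
        if !cloudSyncEnabled {
            container.persistentStoreDescriptions.first?.cloudKitContainerOptions = nil
        }
        container.loadPersistentStores { _, error in
            if let error = error {
                fatalError("Unresolved Core Data error: \(error)")
            }
        }
        return container
    }
}

@main
struct MyApp: App {
    @StateObject private var persistence = Persistence(cloudSyncEnabled: true)

    var body: some Scene {
        WindowGroup {
            ContentView()
                // Re-evaluated whenever `container` changes, so the new
                // viewContext is injected automatically.
                .environment(\.managedObjectContext, persistence.container.viewContext)
                .environmentObject(persistence)
        }
    }
}

struct ContentView: View {               // placeholder content
    var body: some View { Text("Content") }
}
```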
I'm working on an app targeting iOS 15+ using SwiftUI.
The app has several Views that load data from an API in their onAppear() method. While the loading operation is in progress, these views show a loading overlay via .fullScreenCover().
While most of the time this works as expected, I've discovered that if the API operation completes before the overlay's .onAppear() has fired, the overlay gets stuck on screen, i.e. does not dismiss. This bug occurs both in the simulator and on device.
This is a simplified version of my implementation:
struct MyDataView: View {
    @EnvironmentObject var store: Store

    var Content: some View {
        // ...
    }

    @ViewBuilder
    var body: some View {
        let showLoadingOverlay = Binding(
            get: {
                store.state.loading
            },
            set: { _ in }
        )

        Content
            .onAppear {
                store.dispatch(LoadData)
            }
            .fullScreenCover(isPresented: showLoadingOverlay) {
                LoadingOverlay()
            }
    }
}
Log messages tell me that my store is updating correctly, i.e. the booleans all operate as expected. Adding log output to the binding's getter always prints the correct value. Adding a breakpoint to the binding's getter makes the problem disappear.
I've found that the chronology of events that lead to this bug is:
MyDataView.onAppear()
LoadData
Binding: true
Overlay starts animating in
LoadData finishes
Binding: false
Overlay fires its onAppear
That is, whenever loading finishes before the fullScreenCover's onAppear is fired, the overlay gets stuck on screen. As long as loading takes at least as long as the overlay takes to appear, the bug does not occur.
It appears to be a race condition between the .fullScreenCover appearing and the binding changing to false.
I've found that the bug can be avoided if loading is triggered in the overlay's .onAppear(). However, I would like to avoid this workaround because the overlay is not supposed to carry out data loading tasks.
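One more possibility, sketched below but untested against this exact race: since the symptom is that the cover's onAppear fires only after loading has already finished, the overlay itself can check the store once it finally appears and dismiss explicitly. The body of LoadingOverlay here is a placeholder; Store is the same environment object as above, and only the onAppear check matters.

```swift
import SwiftUI

struct LoadingOverlay: View {
    @EnvironmentObject var store: Store
    @Environment(\.dismiss) private var dismiss   // iOS 15+

    var body: some View {
        ProgressView("Loading…")                  // placeholder content
            .onAppear {
                // If loading finished before the cover finished appearing,
                // the binding's change to false was missed; dismiss now
                // that the cover is actually on screen.
                if !store.state.loading {
                    dismiss()
                }
            }
    }
}
```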
Hi,
I want to activate/deactivate CloudKit sync at app runtime in a user settings view. Basically this works fine. Every time I toggle between the NSPersistentContainer and the NSPersistentCloudKitContainer, I increment the persistence.persistenceContainerReloaded attribute and the whole view hierarchy is reloaded. Thus all changes are passed through the whole app.
During the reload phase I have to load a new persistent store via container.loadPersistentStores(...). Unfortunately, I cannot remove the old persistent store before loading the new one; the app crashes immediately, because the store and viewContext are still in use. Therefore, I just create a new one and trigger the reload. Afterwards every view is using the new viewContext. But somewhere in the background there is still the old persistent store with CloudKit sync active, and it pushes every local change to the cloud; changes from other devices are no longer received.
Does anyone have an idea how to correctly unload a persistent store (replacing NSPersistentCloudKitContainer with NSPersistentContainer) in a SwiftUI-based app?
@main
struct TargetShooterApp: App {
    @StateObject var persistence: Persistence = Persistence.shared

    var body: some Scene {
        WindowGroup {
            ContentView()
                .environment(\.managedObjectContext, persistence.container.viewContext)
                .id(persistence.persistenceContainerReloaded)
        }
    }
}
Hey,
I know you can write @AppStorage("username") var username: String = "Anonymous" to access a value stored in UserDefaults, and you can also overwrite it by changing the value of username.
I was wondering if there is any workaround to use @AppStorage with arrays.
I haven't found anything, but I have a lot of situations where I would use it.
Thanks!
Max
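A workaround that is often used: @AppStorage accepts any RawRepresentable value whose RawValue is String, so giving Array a JSON-backed RawRepresentable conformance (for Codable elements) lets it be stored directly. A sketch, with an example key name:

```swift
import SwiftUI

// Persist arrays of Codable elements as a JSON string.
extension Array: RawRepresentable where Element: Codable {
    public init?(rawValue: String) {
        guard let data = rawValue.data(using: .utf8),
              let decoded = try? JSONDecoder().decode([Element].self, from: data)
        else { return nil }
        self = decoded
    }

    public var rawValue: String {
        guard let data = try? JSONEncoder().encode(self),
              let json = String(data: data, encoding: .utf8)
        else { return "[]" }
        return json
    }
}

struct NamesView: View {
    // Stored in UserDefaults under "usernames" as a JSON string.
    @AppStorage("usernames") private var usernames: [String] = []

    var body: some View {
        List(usernames, id: \.self) { Text($0) }
    }
}
```

The same pattern extends to dictionaries or other Codable types.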
I am hitting major roadblocks in migrating one of my Objective-C Cocoa applications away from -[NSView (un)lockFocus] and -[NSBitmapImageRep initWithFocusedViewRect:].
In a transcript of a WWDC 2018 presentation I read:
With our changes to layer backing, there's a few patterns I want to call out that aren't going to work in macOS 10.14 anymore. If you're using NSView lockFocus and unlockFocus, or trying to access the window's graphics contents directly, there's a better way of doing that. You should just subclass NSView and implement draw rect. ...
Of course, we have all implemented -[NSView drawRect:] for decades now. The big question is: how can we do incremental (additional, event-driven) drawing in our views without redrawing the whole view hierarchy? This is the use case for -(un)lockFocus, especially when drawing the base view is computationally expensive. Who would have thought that people use -(un)lockFocus for regular drawing of the NSView hierarchy.
I tried to get away with CALayer, only to find out after two days of experimenting with it that a sublayer can only be drawn if the (expensive) main layer has been drawn before -> a dead-end road.
Now I am going to implement a context-dependent -[NSView drawRect:]. Based on a respective instance variable, either the (expensive) base presentation of the view or the simple additions are drawn. Is that what Apple meant by "… just subclass NSView and implement draw rect"?
From the point of view of object-oriented programming, using switch() in methods to change the behaviour of the object is ugly, to say the least. Any better options?
Ugly or not, in any case, I don’t want to redraw the whole view hierarchy only for moving a crosshairs in a diagram.
My actual use case is:
This application draws, into a custom diagram NSView, electrochemical measurement curves that may consist of a few thousand up to millions of data points. The diagram view provides a facility for moving crosshairs and other pointing aids over the displayed curves, by dragging/rolling with the mouse or the trackpad, or by moving them point by point with the cursor keys.
Diagram generation is computationally expensive, and it must not occur merely because the crosshairs are moved to the next data point.
So for navigating the crosshairs (and other pointing aids), a respective method locks the focus of said view, restores the background from a cache, caches the background below the new position of the crosshairs using -[NSBitmapImageRep initWithFocusedViewRect:], draws the crosshairs and finally unlocks the focus.
All this does not work anymore since 10.14.
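One pattern that avoids both -(un)lockFocus and a full redraw is to split the drawing across two stacked views: the expensive curves stay in the diagram view, and the crosshairs live in a transparent overlay view on top, so moving them only invalidates the cheap overlay. A sketch in Swift (class and property names are made up, not taken from the app described above):

```swift
import AppKit

// Transparent overlay that sits above the expensive diagram view.
// Moving the crosshairs only redraws this view, never the curves below.
final class CrosshairsOverlayView: NSView {
    var crosshairsPosition: NSPoint = .zero {
        didSet { needsDisplay = true }   // cheap: only the overlay redraws
    }

    override var isOpaque: Bool { false }

    override func draw(_ dirtyRect: NSRect) {
        guard let context = NSGraphicsContext.current?.cgContext else { return }
        context.setStrokeColor(NSColor.systemRed.cgColor)
        context.setLineWidth(1)
        context.move(to: CGPoint(x: crosshairsPosition.x, y: bounds.minY))
        context.addLine(to: CGPoint(x: crosshairsPosition.x, y: bounds.maxY))
        context.move(to: CGPoint(x: bounds.minX, y: crosshairsPosition.y))
        context.addLine(to: CGPoint(x: bounds.maxX, y: crosshairsPosition.y))
        context.strokePath()
    }
}

// Setup, e.g. in the diagram view's controller:
// let overlay = CrosshairsOverlayView(frame: diagramView.bounds)
// overlay.autoresizingMask = [.width, .height]
// diagramView.addSubview(overlay)
// Later, on mouse or key events:
// overlay.crosshairsPosition = newPoint
```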
Don’t over-engineer! There is no suggested architecture for SwiftUI: just MVC without the C.
With SwiftUI you get extra (or wrong) work and complexity for no benefit. Don’t fight the system.
When I have a TextField inside a ScrollView and tap on it, the keyboard is shown as expected. But the TextField is moved up just enough to show the input area; I want it moved up enough that it is visible in its entirety, otherwise it looks cropped. I couldn't find a way to change this behaviour.
struct ContentView: View {
    @State var text: String = ""

    var body: some View {
        ScrollView {
            VStack(spacing: 10) {
                ForEach(1...12, id: \.self) {
                    Text("\($0)…")
                        .frame(height: 50)
                }
                TextField("Label..", text: self.$text)
                    .padding(10)
                    .background(.white)
                    .cornerRadius(10)
                    .overlay(
                        RoundedRectangle(cornerRadius: 10)
                            .stroke(.blue, lineWidth: 1)
                    )
            }
            .padding()
            .background(.red)
        }
    }
}
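A possible workaround, sketched under the assumption of an iOS 15 deployment target: give the field an id, track focus with @FocusState, and scroll it fully into view via ScrollViewReader once the keyboard comes up. The 0.3-second delay is a heuristic for the keyboard animation, not an exact value.

```swift
import SwiftUI

struct KeyboardAwareContentView: View {
    @State private var text = ""
    @FocusState private var textFieldFocused: Bool   // iOS 15+

    var body: some View {
        ScrollViewReader { proxy in
            ScrollView {
                VStack(spacing: 10) {
                    ForEach(1...12, id: \.self) {
                        Text("\($0)…")
                            .frame(height: 50)
                    }
                    TextField("Label..", text: $text)
                        .padding(10)
                        .focused($textFieldFocused)
                        .id("textField")
                }
                .padding()
            }
            .onChange(of: textFieldFocused) { focused in
                guard focused else { return }
                // Give the keyboard a moment to appear, then bring the whole
                // field (including its padding and border) into view.
                DispatchQueue.main.asyncAfter(deadline: .now() + 0.3) {
                    withAnimation {
                        proxy.scrollTo("textField", anchor: .bottom)
                    }
                }
            }
        }
    }
}
```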
I have a simple SwiftUI application with CoreData and two views. One view displays all "Place" objects. You can create new places and you can show the details for the place.
Inside the second view you can add "PlaceItem"s to a place.
The problem is that, once a new "PlaceItem" is added to the viewContext, the @FetchRequest seems to forget about its additional predicate, which I set in onAppear. Then every place item is shown inside the details view. Once I update the predicate manually (the refresh button), only the items from the selected place are visible again.
Any idea how this can be fixed? Here's the code for my two views:
struct PlaceView: View {
    @FetchRequest(sortDescriptors: []) private var places: FetchedResults<Place>
    @Environment(\.managedObjectContext) private var viewContext

    var body: some View {
        NavigationView {
            List(places) { place in
                NavigationLink {
                    PlaceItemsView(place: place)
                } label: {
                    Text(place.name ?? "")
                }
            }
        }
        .toolbar {
            ToolbarItem(placement: .primaryAction) {
                Button {
                    let place = Place(context: viewContext)
                    place.name = NSUUID().uuidString
                    try! viewContext.save()
                } label: {
                    Label("Add", systemImage: "plus")
                }
            }
        }
        .navigationTitle("Places")
    }
}

struct PlaceItemsView: View {
    @ObservedObject var place: Place
    @FetchRequest(sortDescriptors: []) private var items: FetchedResults<PlaceItem>
    @Environment(\.managedObjectContext) private var viewContext

    func updatePredicate() {
        items.nsPredicate = NSPredicate(format: "place == %@", place)
    }

    var body: some View {
        NavigationView {
            List(items) { item in
                Text(item.name ?? "")
            }
        }
        .onAppear(perform: updatePredicate)
        .toolbar {
            ToolbarItem(placement: .primaryAction) {
                Button {
                    let item = PlaceItem(context: viewContext)
                    item.place = place
                    item.name = NSUUID().uuidString
                    try! viewContext.save()
                } label: {
                    Label("Add", systemImage: "plus")
                }
            }
            ToolbarItem(placement: .navigationBarLeading) {
                Button(action: updatePredicate) {
                    Label("Refresh", systemImage: "arrow.clockwise")
                }
            }
        }
        .navigationTitle(place.name ?? "")
    }
}

struct ContentView: View {
    @Environment(\.managedObjectContext) private var viewContext

    var body: some View {
        NavigationView {
            PlaceView()
        }
    }
}
Thanks!
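A commonly suggested fix for the predicate being lost: build the fetch request with its predicate in the view's initializer instead of assigning nsPredicate in onAppear, so the predicate is part of the request itself and survives context saves. A sketch of PlaceItemsView reduced to that change (toolbar omitted; Place and PlaceItem are the model classes from the code above, and the sort descriptor is just an example):

```swift
import SwiftUI
import CoreData

struct PlaceItemsView: View {
    @ObservedObject var place: Place
    @FetchRequest private var items: FetchedResults<PlaceItem>

    init(place: Place) {
        self.place = place
        // The predicate lives in the request, so it is not forgotten when
        // the view refreshes after a save.
        _items = FetchRequest(
            sortDescriptors: [NSSortDescriptor(keyPath: \PlaceItem.name, ascending: true)],
            predicate: NSPredicate(format: "place == %@", place)
        )
    }

    var body: some View {
        List(items) { item in
            Text(item.name ?? "")
        }
        .navigationTitle(place.name ?? "")
    }
}
```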
Hi, I'm embedding the QLPreviewController in a UIViewControllerRepresentable. When I view .usdz models I don't see the AR/Object selector at the top, nor the sharing button. I have tried presenting modally with a .sheet modifier and had the same result. What do I need to do to get the controls? Thanks, code attached.
Code
Spiff
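Since the attached code isn't visible here, a minimal wrapper is sketched below for reference. One thing that may be worth trying (a guess, not a confirmed fix) is giving the preview controller a navigation controller to live in, since the share and AR/Object controls are hosted in its bars when QuickLook isn't doing its own full-screen presentation.

```swift
import SwiftUI
import QuickLook

// Minimal QLPreviewController wrapper; `url` should point at a local .usdz file.
struct QuickLookPreview: UIViewControllerRepresentable {
    let url: URL

    func makeUIViewController(context: Context) -> UINavigationController {
        let controller = QLPreviewController()
        controller.dataSource = context.coordinator
        // Embedding in a navigation controller may give the preview the
        // bars it needs to show its share and AR/Object buttons.
        return UINavigationController(rootViewController: controller)
    }

    func updateUIViewController(_ uiViewController: UINavigationController, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator(url: url) }

    final class Coordinator: NSObject, QLPreviewControllerDataSource {
        let url: URL
        init(url: URL) { self.url = url }

        func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

        func previewController(_ controller: QLPreviewController,
                               previewItemAt index: Int) -> QLPreviewItem {
            url as NSURL   // NSURL conforms to QLPreviewItem
        }
    }
}
```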
Hi, I'm trying to make a weather menu bar app, and I want to have it so that the icon of the app in the menu changes with the actual weather, but the icon isn't showing up. There is still a space in the menu bar where I can click and open the app, it's just that the icon has disappeared. Any ideas to fix it?
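Hard to say without seeing the code, but assuming an AppKit NSStatusItem, the two usual causes of a clickable-but-blank item are a nil button image (for example a mistyped SF Symbol name) or an image that isn't marked as a template and disappears against the menu bar. A hypothetical sketch:

```swift
import AppKit

final class AppDelegate: NSObject, NSApplicationDelegate {
    // Keep a strong reference; otherwise the item is removed entirely.
    private var statusItem: NSStatusItem?

    func applicationDidFinishLaunching(_ notification: Notification) {
        statusItem = NSStatusBar.system.statusItem(withLength: NSStatusItem.squareLength)
        updateIcon(symbolName: "cloud.sun")   // example symbol name
    }

    /// Swap the menu bar icon when the weather changes.
    func updateIcon(symbolName: String) {
        // NSImage(systemSymbolName:) returns nil for a misspelled name,
        // which shows up as exactly the blank-but-clickable gap described.
        let image = NSImage(systemSymbolName: symbolName,
                            accessibilityDescription: "Weather")
        image?.isTemplate = true   // adapt to light and dark menu bars
        statusItem?.button?.image = image
    }
}
```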
Hi,
In a Mac Catalyst app, I need to allow the user to insert a passcode using a UITextField.
The field is used to enter a one-time passcode and I want to keep the content hidden. For this reason I set the isSecureTextEntry property to true.
passcodeTextField.isSecureTextEntry = true
By doing this, a button to allow the user to pick a password from the keychain is displayed:
This option should not appear in my case, because the password is a one-time password that changes every time. For that reason I set the textContentType to oneTimeCode.
passcodeTextField.textContentType = .oneTimeCode
This actually removes the password button, but it introduces something weird. If the user types something and then deletes everything, a big empty box appears under the field:
I have no idea what this box is and why it appears.
Does anyone know why it appears and how I can remove it?
Thank you
Hi,
I've implemented the FamilyActivityPicker, and I noticed that it is the same picker we get when we go to Settings > Screen Time > App Limits > Add Limit.
When you tap on a given row, it presents all apps in that category. If you press the checkmark on a category, the category row is updated to show the checkmark as selected with an "All" label on the right side, and the checkmarks on the app rows in that category are also marked as selected.
This behavior is not consistent when I implement the FamilyActivityPicker: if I go through the same process, the app rows are not shown as selected.
Any suggestions on how to make this work? I'm attaching screenshots to illustrate my point.
Settings App
My FamilyActivityPicker Implementation
I have an NSRulerView with a vertical orientation. It works fine from macOS 10.13 to 11.x.
In macOS Monterey (12.2.1 here), the ruler view is not receiving drawHashMarksAndLabelsInRect: messages when the associated NSTextView is scrolled vertically.
When the parent NSScrollView is resized, the ruler view is correctly refreshed on all macOS versions.
[Q] Is it a known bug in macOS Monterey?
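I can't say whether it is a known bug, but one workaround that may tide you over, sketched below, is to watch the clip view's bounds changes and mark the ruler as needing display yourself. Keep the returned observer token alive for as long as the ruler exists and remove it afterwards.

```swift
import AppKit

// Possible workaround: refresh the vertical ruler manually whenever the
// clip view scrolls, since drawHashMarksAndLabels(in:) is no longer
// triggered automatically on Monterey in this situation.
func installRulerRefreshWorkaround(for scrollView: NSScrollView) -> NSObjectProtocol {
    scrollView.contentView.postsBoundsChangedNotifications = true
    return NotificationCenter.default.addObserver(
        forName: NSView.boundsDidChangeNotification,
        object: scrollView.contentView,
        queue: .main
    ) { [weak scrollView] _ in
        scrollView?.verticalRulerView?.needsDisplay = true
    }
}
```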