Delve into the world of built-in app and system services available to developers. Discuss leveraging these services to enhance your app's functionality and user experience.

Posts under General subtopic

Post · Replies · Boosts · Views · Activity

How to reset system window private picker alert with Screen Capture Kit
Hi, I would like to reset the system window private picker alert used with ScreenCaptureKit. I can reset the ScreenCapture permission with tccutil reset ScreenCapture, but that does not reset the system window private picker alert. I also tried deleting the application's container directory, which did not help. The picker alert keeps using the old approval I gave and never prompts again. How can I start with fresh ScreenCaptureKit settings for an app under test? Thanks
0
0
69
Jun ’25
Live Caller ID Lookup Implementation
Hello, I'm working on a Live Caller ID Lookup implementation in my own pet project. As I understand it, I need to create an app and an extension for that app. I also created a test PIR service and configured serviceURL, tokenIssuerURL and userTierToken. In my app I implemented the following code:

Task {
    if LiveCallerIDLookupManager.shared.status(forExtensionWithIdentifier: "some-extension") == .disabled {
        // Show an alert.
        print("LiveCallerIDLookupManager is disabled")
    }
    do {
        // Open Settings.
        try await LiveCallerIDLookupManager.shared.openSettings()
    } catch { }
}

It does open the Call settings, but I don't understand what I should do next.
0
0
69
Jun ’25
How to provide a driving destination to CarPlay, like Calendar
If I have, say, a doctor appointment in the Calendar app and I'm leaving to go to it, the address appears in Apple Maps on CarPlay. Forgive me if I'm getting the details wrong, but I believe that if I bring up the Map it is available to tap on, so I can quickly navigate there. I think it may also show up on one of the CarPlay screens that shows several panels. The point is, I really like this feature and want to offer it in my app. In my iOS app, the user can order food from a restaurant and pick it up. I'm not ready to make this a "quick service" app, but I want to give the user an easy way to get to the pickup location. Since they just ordered food, they'll need to leave fairly soon to go there. The Calendar app can offer a location because of scheduling; I'd like to do the same. (A sketch of one possible hand-off to Maps follows this entry.)
0
0
83
Jun ’25
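This isn't the Calendar-style CarPlay suggestion the poster describes, but a minimal sketch of one documented way to hand a destination to Apple Maps from an iOS app, assuming MapKit; the coordinate and destination name below are hypothetical:

import MapKit
import CoreLocation

// Hypothetical pickup location for the order.
let pickupCoordinate = CLLocationCoordinate2D(latitude: 37.3349, longitude: -122.0090)
let placemark = MKPlacemark(coordinate: pickupCoordinate)
let mapItem = MKMapItem(placemark: placemark)
mapItem.name = "Restaurant pickup" // Shown as the destination name in Maps.

// Ask Apple Maps to open with driving directions to the destination.
mapItem.openInMaps(launchOptions: [
    MKLaunchOptionsDirectionsModeKey: MKLaunchOptionsDirectionsModeDriving
])

When the phone is connected to CarPlay and the user starts these directions, Maps should reflect them on the car display as well, though that is the normal Maps hand-off rather than the proactive suggestion the post asks about.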
Presenter Overlay Not Showing When Recording a Single Window or Region with ScreenCaptureKit
Hi, I'm using ScreenCaptureKit on macOS 14+ to record a single window. I've noticed that the Presenter Overlay only appears when capturing the entire screen, but it does not appear when recording a specific window or a region. Is there a way to enable the Presenter Overlay while recording a single window or a defined region, similar to how it works with full-screen capture? Any guidance or clarification would be greatly appreciated. Thanks in advance!
0
0
101
May ’25
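For reference, a minimal sketch of the kind of single-window capture setup the post above describes, assuming ScreenCaptureKit on macOS 14 or later; the window choice and configuration values are illustrative only:

import ScreenCaptureKit

struct NoWindowError: Error {}

func startWindowCapture() async throws -> SCStream {
    // Enumerate shareable content and pick some on-screen window (illustrative).
    let content = try await SCShareableContent.excludingDesktopWindows(false, onScreenWindowsOnly: true)
    guard let window = content.windows.first else { throw NoWindowError() }

    // A filter for a single, desktop-independent window.
    let filter = SCContentFilter(desktopIndependentWindow: window)

    let configuration = SCStreamConfiguration()
    configuration.width = Int(window.frame.width)
    configuration.height = Int(window.frame.height)

    let stream = SCStream(filter: filter, configuration: configuration, delegate: nil)
    try await stream.startCapture()
    return stream
}

With a window filter like this, the poster reports that Presenter Overlay does not appear, unlike when a full-display filter is used.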
Filtered dataFrame displays original dataFrame values
I'm importing a CSV file of 299 rows and creating a subset by filtering on one column value (24 rows). I want to chart just the filtered values. However, when I print one column I get values from the original DataFrame. Any suggestions? Thanks, David (A possible workaround is sketched after this entry.)

The code:

import SwiftUI
import Charts
import TabularData

struct DataPoint: Identifiable {
    var id = UUID() // This makes it conform to Identifiable
    var date: Date
    var value: Double
}

struct ContentView: View {
    @State private var dataPoints: [DataPoint] = []

    var body: some View {
        Text("Hello")
        Chart {
            ForEach(dataPoints) { dataPoint in
                PointMark(
                    x: .value("Date", dataPoint.date),
                    y: .value("Value", dataPoint.value)
                )
            }
        }
        .frame(height: 300)
        .padding()
        .onAppear(perform: loadData)
    }

    func loadData() {
        print("In Loading Data")
        // Load the CSV file
        if let url = Bundle.main.url(forResource: "observations", withExtension: "csv") {
            do {
                let options = CSVReadingOptions(hasHeaderRow: true, delimiter: ",")
                var data0 = try DataFrame(contentsOfCSVFile: url, options: options)
                let formattingOptions = FormattingOptions(
                    maximumLineWidth: 200,
                    maximumCellWidth: 15,
                    maximumRowCount: 30
                )
                // print(data0.description(options: formattingOptions))
                print("Number of Columns: \(data0.columns.count)")
                let columnsSet = ["plant_id", "date", "plot", "plantNumber", "plantCount"]
                data0 = try DataFrame(contentsOfCSVFile: url, columns: columnsSet, options: options)
                print("Number of Columns (after columnsSet): \(data0.columns.count)")
                print("Printing data0")
                print(data0.description(options: formattingOptions))
                let data = data0.filter { $0["plant_id"] as? Int == 15 }
                print("Printing data")
                print(data.description(options: formattingOptions))
                print(" Number of Rows \(data.rows.count)")
                for i in 0 ... data.rows.count {
                    // print("\(i): \(data["plantCount"][i]!)")
                    if let plantCount = data["plantCount"][i] as? Int {
                        print("\(i): \(plantCount)")
                    } else {
                        print("\(i): Value not found or invalid type")
                    }
                }
                //
                // var newDataPoints: [DataPoint] = []
                // Here I plan to add the filtered data to DataPoint
                // DispatchQueue.main.async { dataPoints = newDataPoints }
            } catch {
                print("Error reading CSV file: \(error)")
            }
        } else {
            print("Didn't load csv file")
        }
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView()
    }
}

Here is the new dataFrame and print output:

Printing data
┏━━━━━┳━━━━━━━━━━┳━━━━━━━━━━━━┳━━━━━━━┳━━━━━━━━━━━━━┳━━━━━━━━━━━━┓
┃     ┃ plant_id ┃ date       ┃ plot  ┃ plantNumber ┃ plantCount ┃
┃     ┃ <Int>    ┃ <String>   ┃ <Int> ┃ <Int>       ┃ <Int>      ┃
┡━━━━━╇━━━━━━━━━━╇━━━━━━━━━━━━╇━━━━━━━╇━━━━━━━━━━━━━╇━━━━━━━━━━━━┩
│ 0   │ 15       │ 2023-09-07 │ 1     │ 5           │ 5          │
│ 32  │ 15       │ 2023-09-07 │ 2     │ 10          │ 10         │
│ 38  │ 15       │ 2023-09-07 │ 2     │ 20          │ 20         │
│ 66  │ 15       │ 2023-09-07 │ 4     │ 25          │ 25         │
│ 77  │ 15       │ 2023-09-07 │ 5     │ 5           │ 5          │
│ 99  │ 15       │ 2023-09-14 │ 7     │ 45          │ 45         │
│ 142 │ 15       │ 2024-05-30 │ 1     │ 20          │ 20         │
│ 162 │ 15       │ 2024-05-30 │ 4     │ 5           │ 5          │
│ 169 │ 15       │ 2024-05-30 │ 5     │ 10          │ 10         │
│ 175 │ 15       │ 2024-05-30 │ 7     │ 10          │ 10         │
│ 188 │ 15       │ 2024-07-11 │ 1     │ 20          │ 40         │
│ 199 │ 15       │ 2024-07-11 │ 2     │ 5           │ 5          │
│ 215 │ 15       │ 2024-07-11 │ 5     │ 20          │ 30         │
│ 220 │ 15       │ 2024-07-11 │ 7     │ 30          │ 40         │
│ 236 │ 15       │ 2024-09-06 │ 1     │ 20          │ 60         │
│ 238 │ 15       │ 2024-09-06 │ 2     │ 30          │ 35         │
│ 248 │ 15       │ 2024-09-06 │ 5     │ 5           │ 35         │
│ 254 │ 15       │ 2024-09-06 │ 7     │ 50          │ 90         │
│ 267 │ 15       │ 2025-05-04 │ 1     │ 10          │ 70         │
│ 273 │ 15       │ 2025-05-04 │ 2     │ 10          │ 45         │
│ 282 │ 15       │ 2025-05-04 │ 5     │ 10          │ 45         │
│ 287 │ 15       │ 2025-05-04 │ 7     │ 30          │ 120        │
│ 292 │ 15       │ 2025-05-04 │ 8     │ 10          │ 0          │
│ 297 │ 15       │ 2925-05-04 │ 3     │ 10          │ 0          │
└─────┴──────────┴────────────┴───────┴─────────────┴────────────┘
24 rows, 5 columns

Number of Rows 24
0: 5
1: 80
2: 1
3: 1
4: 1
5: 3
6: 3
7: 1
8: 6
9: 1
10: 1
11: 1
12: 1
13: 10
14: 50
15: 1
16: 2
17: 1
18: 3
19: 8
20: 5
21: 3
22: 7
23: 2
24: 1
2
0
63
May ’25
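The loop in the post above subscripts columns of the filtered result by positions 0…24, while the printout shows that the filtered rows keep their original indices (0, 32, 38, …). A minimal sketch of one possible workaround, assuming TabularData's filter returns a DataFrame.Slice and reusing the post's `data` variable, is to iterate the slice's rows instead of using positional subscripts:

import TabularData

// `data` is the filtered result from the post above; its rows keep the
// original frame's indices (0, 32, 38, ...), so positional subscripts
// on a column would read from the wrong places.
for row in data.rows {
    // Reading by column name from each row avoids positional indexing entirely.
    if let plantCount = row["plantCount", Int.self],
       let plot = row["plot", Int.self] {
        print("plot \(plot): \(plantCount)")
    }
}

// Alternatively (assuming the slice-to-DataFrame initializer is available),
// materializing the slice re-bases the row indices to start at 0:
// let rebased = DataFrame(data)

Note also that `for i in 0 ... data.rows.count` iterates one past the end (25 values are printed for 24 rows); a half-open range would avoid that regardless of the indexing question.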
ShieldConfigurationExtension & SwiftData
Hi, I am developing a Screen Time app and I am having issues with the ShieldConfigurationExtension (ShieldConfigurationDataSource). I know this extension is sandboxed, but I should be able to read data from the main app. I am using SwiftData as my database, but I am unable to initialize it in the extension; the error indicates insufficient file permissions. I have an App Group set up and I am able to share data using UserDefaults, but that is just inconvenient. Is there any way I could open the SwiftData store in read-only mode so that I could display some info to the user on the shield? (A read-only configuration is sketched after this entry.)

SwiftData init:

private func setupContainer() throws {
    let schema = Schema([
        DogEntity.self,
        HouseEntity.self
    ])
    // Use app group container if available
    let config: ModelConfiguration
    if let containerURL = FileManager.default.containerURL(
        forSecurityApplicationGroupIdentifier: "group.\(Bundle.app.bundleIdentifier ?? "")"
    ) {
        config = ModelConfiguration(schema: schema, url: containerURL.appendingPathComponent("default.sqlite"))
    } else {
        config = ModelConfiguration(schema: schema)
    }
    self.container = try ModelContainer(for: schema, configurations: [config])
}

Error in the extension:

fault: Attempt to add read-only file at path file:///private/var/mobile/Containers/Shared/AppGroup/51431199-5919-4AE6-940C-6FE3C53EEB46/default.sqlite read/write. Adding it read-only instead. This will be a hard error in the future; you must specify the NSReadOnlyPersistentStoreOption.
error: (3) access permission denied
error: Encountered exception error during prepareSQL for SQL string 'SELECT TBL_NAME FROM SQLITE_MASTER WHERE TBL_NAME = 'Z_METADATA'' : access permission denied with userInfo { NSFilePath = "/private/var/mobile/Containers/Shared/AppGroup/51431199-5919-4AE6-940C-6FE3C53EEB46/default.sqlite"; NSSQLiteErrorDomain = 3; } while checking table name from store: <NSSQLiteConnection: 0x154100300>
error: Store failed to load. <NSPersistentStoreDescription: 0x15402d590> (type: SQLite, url: file:///private/var/mobile/Containers/Shared/AppGroup/51431199-5919-4AE6-940C-6FE3C53EEB46/default.sqlite) with error = Error Domain=NSCocoaErrorDomain Code=256 "The file “default.sqlite” couldn’t be opened." UserInfo={NSFilePath=/private/var/mobile/Containers/Shared/AppGroup/51431199-5919-4AE6-940C-6FE3C53EEB46/default.sqlite, NSSQLiteErrorDomain=3} with userInfo { NSFilePath = "/private/var/mobile/Containers/Shared/AppGroup/51431199-5919-4AE6-940C-6FE3C53EEB46/default.sqlite"; NSSQLiteErrorDomain = 3; }

Any help appreciated 🙂
0
0
78
May ’25
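A minimal sketch of the read-only idea asked about above, assuming SwiftData's ModelConfiguration accepts an allowsSave flag together with a store URL; whether this also avoids the sandbox file-permission error inside a shield extension is not something the post confirms. The entity type and group identifier are placeholders:

import Foundation
import SwiftData

// Hypothetical entity standing in for the poster's DogEntity/HouseEntity.
@Model final class Item {
    var name: String
    init(name: String) { self.name = name }
}

func makeReadOnlyContainer() throws -> ModelContainer {
    let schema = Schema([Item.self])
    guard let groupURL = FileManager.default.containerURL(
        forSecurityApplicationGroupIdentifier: "group.com.example.app" // placeholder
    ) else { throw CocoaError(.fileNoSuchFile) }

    // allowsSave: false asks SwiftData to open the store without write access.
    let configuration = ModelConfiguration(
        schema: schema,
        url: groupURL.appendingPathComponent("default.sqlite"),
        allowsSave: false
    )
    return try ModelContainer(for: schema, configurations: [configuration])
}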
Live Lookup flow stuck at /issue/token-key-for-user-token endpoint
Hi Apple engineering team, I'm trying to integrate the new Live Caller ID Lookup (PIR) on iOS using your pir-service-example code as well as a custom mock server in Vapor, but the extension never advances past the /issue/token-key-for-user-token step. I've tried both:

1. Official example
Cloned https://github.com/apple/pir-service-example and ran PIRService locally. Confirmed that:
GET /.well-known/private-token-issuer-directory → 200
GET /issue/token-key-for-user-token → 200 (DER bytes, correct SPKI)
No POST /issue ever fires.

2. Mock server (Vapor)
Implemented all five endpoints (/config, /.well-known/private-token-issuer-directory, /issue/token-key-for-user-token, /issue, /queries). Verified with curl and openssl asn1parse that:
GET /.well-known/private-token-issuer-directory
Content-Type: application/private-token-issuer-directory
{ "issuer-request-uri": "https://…/issue", "token-keys": […] }
GET /issue/token-key-for-user-token
Content-Type: application/octet-stream
<DER bytes>
Added Cache-Control: public, max-age=3600 on the directory and SPKI. Stubbed POST /issue to always return { "token": "" }. Still no POST /issue request from the extension.

Reproduction steps:
1. Install and enable a Live Lookup extension pointing to my server.
2. Trigger an incoming call on device.
3. Watch the server logs — only the two GETs appear, never /issue or /queries.

Expected behavior: after fetching the SPKI DER, the framework should issue a POST /issue call (Privacy Pass flow) and then POST /queries.

Observed behavior: stuck in an infinite loop of
GET /.well-known/private-token-issuer-directory
GET /issue/token-key-for-user-token
(repeat…)
with no progression to the /issue or /queries endpoints.

What I've tried:
Verified that the JSON kebab-case keys and headers exactly match the examples.
Confirmed the SPKI DER is valid via openssl asn1parse.
Added Cache-Control headers.
Tested on a real device, with a localhost URL, and with an ngrok public URL.
Mocked a valid-looking token response.

Could you advise what additional requirement or format detail I'm missing that prevents advancing past /issue/token-key-for-user-token? The main files are LiveLookupExtension.swift, routes.swift and service-config.json. Thanks in advance!
0
0
59
May ’25
Pushkit/Callkit with unlocked SIM before first unlock
We have a problem in a scenario where the SIM PIN lock is disabled, so after the phone reboots it has an Internet connection but the device itself is still locked (before first unlock). When someone calls into the VoIP app, the app is not launched as a result (which seems reasonable, because it wouldn't be able to access its keychain items, etc.), but the OS still seems to enforce the rule that the app must report each incoming push as a new call. When we then unlock the device, no further PushKit pushes arrive (they are dropped on the floor, as visible in the console), but the three initial pushes that were sent during the locked phase are delivered right after the app launches. (The reporting rule mentioned here is sketched after this entry.)
4
0
134
May ’25
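For context, a minimal sketch of the rule referenced above: an app that receives VoIP pushes is expected to report an incoming call to CallKit from the PushKit delegate before invoking the completion handler. Provider configuration and payload parsing are simplified, and the caller handle is a placeholder:

import PushKit
import CallKit

final class VoIPPushHandler: NSObject, PKPushRegistryDelegate {
    // A minimal CallKit provider; a real app would configure CXProviderConfiguration further.
    private let provider = CXProvider(configuration: CXProviderConfiguration())

    func pushRegistry(_ registry: PKPushRegistry,
                      didUpdate pushCredentials: PKPushCredentials,
                      for type: PKPushType) {
        // Send pushCredentials.token to the VoIP server (omitted).
    }

    func pushRegistry(_ registry: PKPushRegistry,
                      didReceiveIncomingPushWith payload: PKPushPayload,
                      for type: PKPushType,
                      completion: @escaping () -> Void) {
        // Report the call to CallKit before calling the completion handler.
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic, value: "caller") // placeholder caller id
        provider.reportNewIncomingCall(with: UUID(), update: update) { _ in
            completion()
        }
    }
}

The question in the post above is what happens to this contract when the push arrives before first unlock, so the sketch only illustrates the normal path.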
0xBAADCA11 Occurs when VoIP Call Incoming (APNs Push Notification)
I am developing a VoIP phone application (our phone app) using APNs VoIP push. I have a question regarding a behavior I discovered while testing this application. When performing the following operations on an iPhone SE (3rd generation) with an sXGP-NW SIM inserted, 0xBAADCA11 occurs upon receiving an APNs VoIP push. Do you have any information regarding this issue? 0xBAADCA11 occurs in operation 8. However, since there were no problems in operation 4 (the app works when Wi-Fi is off), I don't think the issue is in our phone app.

Configuration of system components:
[VoIP telephone] --call to iPhone (phone app)--> [our VoIP PBX server] --VoIP push request--> [Apple APNs server] --VoIP push--> [our phone app on iPhone SE3 (with sXGP SIM)]

Operations (the issue is 100% reproducible with the following steps):
1. iPhone SE3: power on (with the sXGP SIM inserted).
2. iPhone SE3: Wi-Fi off, connect to the Internet via the SIM.
3. VoIP telephone: call our phone app.
4. iPhone SE3: receives the VoIP push and the phone app launches. The call is answered successfully and communication is possible (the VoIP push notification from APNs arrives via the sXGP SIM).
5. iPhone SE3: Wi-Fi is turned on, connect to the Internet via Wi-Fi.
6. iPhone SE3: kill our phone app from the app switcher.
7. VoIP telephone: call our phone app.
8. iPhone SE3: iOS does not call the push notification delegate (didReceiveIncomingPushWithPayload). As a result our phone app cannot detect the incoming call; however, an .ips log with 0xBAADCA11 is written. In other words, iOS received the VoIP push, but our phone app did not report the call to CallKit, so it was terminated by iOS.
13
0
229
May ’25
Need to check Call Status without using CallKit
My app's requirement is to check whether the user is on a call while performing a transaction. If the user is on a call, we need to show a caution alert. For this requirement we used CallKit to detect the call status and it works fine, but Apple recently rejected the application because CallKit is banned in China. Could you please suggest any solution for checking only the call status?
3
0
82
May ’25
How can I access app usage data from other apps in iOS?
Hi, I’m developing an iOS application and want to explore if there are any official methods available to monitor or retrieve information about the usage patterns of other apps installed on a user’s device — such as launch time, duration of use, or app switching behavior. I understand Apple enforces strict privacy policies. My question is: Are there any APIs or frameworks (public or private) that allow reading app usage data from other apps? Can Screen Time or DeviceActivityReport frameworks be leveraged for such use? Would an app like this be eligible for App Store approval, or would it require special entitlements? My intent is not to violate privacy, but to explore if Apple allows any of this under Screen Time APIs or family usage scenarios. Any insights or guidance would be appreciated! Thanks, [Your Name]
1
0
58
May ’25
Silly question: getting a user's email address(es)
For login purposes, we may want to automatically check whether an email address is set up in certain databases. It looks like the preferred way to do this is via ABAddressBook.shared().me() and then reading the relevant key from its properties? This, however, is treated as accessing the whole address book and brings up a confirmation dialogue. However, as I thought about it, that might not really be the route we want -- perhaps we'd want to go through Active Directory instead? Am I making any sense, or just being incoherent? 😄 (A Contacts-framework sketch of the "me" card lookup follows this entry.)
8
0
90
May ’25
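A minimal sketch of the "me"-card lookup discussed above using the newer Contacts framework, assuming this is a macOS app (the same contacts-access consent prompt the poster mentions still applies):

import Contacts

func myEmailAddresses() throws -> [String] {
    let store = CNContactStore()
    // Fetch the unified "me" contact with just the email-address key (macOS only).
    let keys = [CNContactEmailAddressesKey as CNKeyDescriptor]
    let me = try store.unifiedMeContactWithKeys(toFetch: keys)
    return me.emailAddresses.map { $0.value as String }
}

Whether a directory service such as Active Directory is the better source, as the poster wonders, is a separate question this sketch does not address.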
Live Caller ID Lookup: OHTTP Gateway Rejection
Hello. After submitting the onboarding form for the Live Caller ID Lookup feature, we received a rejection response saying that our OHTTP gateway doesn't support HTTP/2. We have run the provided command openssl s_client -alpn h2 -connect against our domain several times, from different machines and environments, and our results consistently confirm that HTTP/2 is indeed supported by our OHTTP gateway. The output clearly shows ALPN protocol: h2, indicating successful HTTP/2 negotiation. Here is the relevant chunk of the command-line output:

No client certificate CA names sent
Peer signing digest: SHA256
Peer signature type: RSA-PSS
Server Temp Key: X25519, 253 bits
---
SSL handshake has read 4393 bytes and written 406 bytes
Verification: OK
---
New, TLSv1.3, Cipher is TLS_AES_128_GCM_SHA256
Server public key is 2048 bit
This TLS version forbids renegotiation.
Compression: NONE
Expansion: NONE
ALPN protocol: h2
Early data was not sent
Verify return code: 0 (ok)
---
DONE

We have also tried several third-party services to check HTTP/2 support, and they also confirmed that HTTP/2 is supported. Could you provide additional details on the specific criteria or test conditions that led to the non-approval? I'm happy to provide any further diagnostic information or engage in a more detailed technical discussion.
1
7
115
May ’25
Screen Time API: How to map bundleIdentifier to ApplicationToken for DeviceActivityMonitor when FamilyActivitySelection.Application.bundleIdentifier is nil?
I'm using FamilyActivityPicker to get consent for app/category management, which returns a FamilyActivitySelection object. I serialize this FamilyActivitySelection object (just applicationTokens and categoryTokens) and pass it to my DeviceActivityMonitor extension via App Group UserDefaults. I am using the JSON encoder/decoder rather than the property-list one (though both seem to exhibit the same behavior). After inspecting the FamilyActivitySelection object immediately after it's returned by FamilyActivityPicker in the main app, the application.bundleIdentifier property is consistently nil for every Application object within selection.applications. Similarly, category.localizedDisplayName is nil for ActivityCategory objects. This happens whether "Select All Apps" is used or apps/categories are selected individually. I understand that this is the intended behavior due to Apple's user privacy policies. I read in another post that my app can be provided with bundle identifiers and app names within Shield Configuration extensions and Device Activity Report extensions; I'm not sure which ones, or how exactly to do this. I am aware that I can use the Label(applicationToken) SwiftUI view to display the app name/icon, but this doesn't give programmatic access to the bundleIdentifier string. My app will not log or export these bundleIdentifiers outside of its sandbox. My goal is to create mappings from the FamilyActivitySelection to publicly accessible bundleIdentifiers. Any guidance, examples, or clarification on the intended workflow for this scenario would be greatly appreciated! (The serialization step described above is sketched after this entry.)
0
0
98
May ’25
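A minimal sketch of the selection hand-off the post above describes, relying on FamilyActivitySelection's Codable conformance and an App Group UserDefaults suite; the suite name and defaults key are placeholders:

import FamilyControls
import Foundation

let suiteName = "group.com.example.screentime"   // placeholder App Group identifier
let selectionKey = "familyActivitySelection"     // placeholder defaults key

// In the main app: persist the picker's selection for the extension to read.
func store(_ selection: FamilyActivitySelection) throws {
    let data = try JSONEncoder().encode(selection)
    UserDefaults(suiteName: suiteName)?.set(data, forKey: selectionKey)
}

// In the DeviceActivityMonitor extension: read it back.
func loadSelection() throws -> FamilyActivitySelection? {
    guard let data = UserDefaults(suiteName: suiteName)?.data(forKey: selectionKey) else { return nil }
    return try JSONDecoder().decode(FamilyActivitySelection.self, from: data)
}

This only moves the opaque tokens between processes; it does not recover bundle identifiers, which is the part of the question left open above.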
Problems encountered during Core Spotlight development
In iOS 17 and iOS 18, Core Spotlight only matches my app's content when searching by the displayName; it cannot find the content by keywords. Moreover, when matching by displayName, it requires typing four consecutive characters before a match appears. These issues did not occur in iOS 16. What is the reason for this? Here is my code:

func addItemToIndex(_ item: QSpotlightItem) {
    let attributeSet = CSSearchableItemAttributeSet(contentType: .item)
    attributeSet.title = item.title
    attributeSet.displayName = item.title
    attributeSet.contentDescription = item.contentDescription
    attributeSet.keywords = item.keywords
    attributeSet.thumbnailData = item.thumbnailImage
    attributeSet.contactKeywords = item.keywords
    attributeSet.supportsNavigation = true
    let searchableItem = CSSearchableItem(uniqueIdentifier: item.id,
                                          domainIdentifier: "xxx",
                                          attributeSet: attributeSet)
    searchableItem.expirationDate = .distantFuture
    CSSearchableIndex.default().indexSearchableItems([searchableItem]) { error in
        if let error = error {
        } else {
        }
    }
}
9
3
183
May ’25
Phonetic vs Pronunciation contacts fields and Siri
The Contacts app has fields for Phonetic and Pronunciation. My app adds phonetic data to the phonetic field to help Siri better understand contacts stored in Greek, Cyrillic, or Georgian. However, using the phonetic field messes up the sorting order of contacts. For example, Greek Β (beta) is represented with a phonetic sound of V, resulting in a completely incorrect sorting order. The pronunciation field doesn't seem to affect the sorting order, but I'm not sure what it does or should do. My questions are: Do we understand the difference between phonetic and pronunciation, and how Siri actively uses them? If the phonetic field is the correct one to use, how can we raise a feature request with Apple to add an option to sort contacts by the phonetic fields or not? Here's a test you can try (a sketch for creating such a contact in code follows this entry):
1. Create a new contact with the following details: first name: test, last name: test, phonetic first name: Billy, phonetic last name: Idol.
2. Ask Siri to show the contact Billy Idol. It returns the "test test" contact.
3. Switch from the phonetic to the pronunciation fields. Now Siri won't find Billy Idol.
5
0
83
May ’25
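A minimal sketch of creating the test contact described above programmatically with the Contacts framework, assuming contacts access has already been granted:

import Contacts

func addPhoneticTestContact() throws {
    let contact = CNMutableContact()
    contact.givenName = "test"
    contact.familyName = "test"
    // The phonetic fields the post is experimenting with.
    contact.phoneticGivenName = "Billy"
    contact.phoneticFamilyName = "Idol"

    let request = CNSaveRequest()
    request.add(contact, toContainerWithIdentifier: nil) // default container
    try CNContactStore().execute(request)
}

The Pronunciation field the poster contrasts with is a Contacts-app UI field; how (or whether) it maps onto these CNContact properties is exactly the open question above.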
Call directory extension name not localised in settings iOS 15.3
I have two Call Directory extensions, each with InfoPlist.strings files in en.lproj and nb.lproj directories. In these files I've defined CFBundleDisplayName for both locales. These names are displayed under Settings -> Phone -> Call Blocking & Identification. On iOS 12.2 the names are displayed correctly in both Norwegian and English. Testing on iOS 15.3, the English names are displayed even when the device language is set to Norwegian. Worth noting: when I update the English versions of CFBundleDisplayName, the change is immediately reflected on the Call Blocking & Identification page even with the device language set to Norwegian. As this feature requires a real device, I'm unable to test on iOS 13 and 14.
7
0
2.1k
May ’25
Increased Memory Limit for QL Preview Generator
We have written a QL preview generator for some 3D data formats not supported by the AR Quick Look generators included in iOS. However, we are struggling with the 100 MB memory ceiling imposed on an app extension in iOS. We have included the "Increased Memory Limit" entitlement in both the app and the preview extension. Nevertheless, the preview generator is limited to 100 MB, even on the most recent devices like iPhone 16 Pro Max. We can even see the 100 MB limit when we attach to the process with Xcode. My question: did we miss anything? Are there additional steps necessary to obtain the increased memory limit? Must we explicitly apply for it? 500 MB would be fine (our preview generator does not load textures). Best regards
2
0
68
May ’25