Delve into the world of built-in app and system services available to developers. Discuss leveraging these services to enhance your app's functionality and user experience.

Posts under General subtopic
Background Modes for Audio Playback
Summary: I'm developing an iOS audio app in Flutter that requires background audio playback for long-form content. Despite having a paid Apple Developer Program account, the "Background Modes" capability does not appear as an option when creating or editing App IDs in the Developer Portal, preventing me from enabling the required com.apple.developer.background-modes entitlement.

Technical details: Users expect uninterrupted playback when the app is backgrounded or the device is locked, similar to Audible, Spotify, or other audio apps that continue playing in the background.

The problem: When building for device testing or App Store submission, Xcode shows: Provisioning profile "iOS Team Provisioning Profile: com.xxxxx-vxxx" doesn't include the com.apple.developer.background-modes entitlement. However, the "Background Modes" capability is completely missing from the Developer Portal when creating or editing any App ID. I cannot enable it because the option simply doesn't exist in the capabilities list.

What I've tried:
- Multiple browsers/devices: Safari, Chrome, Firefox, incognito mode, different computers
- Account verification: confirmed my paid Individual Developer Program membership is active
- New App IDs: created multiple new App IDs; the capability never appears for any of them
- Documentation review: followed all Apple documentation for configuring background execution modes
- Different regions: tried changing the portal language to English (US)
- Cache clearing: logged out, cleared cookies, tried different sessions

Apple Support response: I contacted Developer Support (Case #102633509713), received generic documentation links, and was directed to the Developer Forums rather than a technical escalation.

Questions: Has anyone else experienced the "Background Modes" capability missing from their Developer Portal? Has anyone successfully used the App Store Connect API to add background-modes when the GUI doesn't show it? What's the proper escalation path when Developer Support provides generic responses instead of technical assistance?

Other things I've attempted:
- audio_service package: implemented as a potential workaround, but it still requires the system-level entitlement
- Manual provisioning profiles: I cannot create profiles with the required entitlement if the capability isn't enabled on the App ID

Environment: macOS Sonoma, Xcode 15.x, Flutter 3.5.4+, Apple Developer Program (Individual, paid)
Replies: 0 · Boosts: 0 · Views: 63 · Activity: 2w
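For context on what the entitlement ultimately gates, here is a minimal sketch of the AVAudioSession setup an app like the one above typically performs so playback continues while backgrounded; it assumes the "audio" value is eventually added under UIBackgroundModes in Info.plist and does not address the missing Developer Portal capability itself.

import AVFoundation

// Minimal playback-session setup for long-form audio. Assumes the Info.plist
// UIBackgroundModes array contains "audio" once the entitlement issue is resolved.
func configureBackgroundPlaybackSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .spokenAudio)
    try session.setActive(true)
}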
A Summary of the WWDC25 Group Lab - watchOS (Part 2)
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for watchOS (part 2).

7. For widget (complication) update budgets, is there an overall budget, or are scheduled updates separate from APNs updates? For context, I have a complication that is updated on a fixed schedule (every 20 min), but there can be times of the day that are more "interesting" where pushes make sense.
Like timeline updates, the system budgets WidgetKit push notifications and delivers them opportunistically. You can use WidgetKit push notification updates in addition to timeline updates. For more information, see Updating widgets with WidgetKit push notifications.

8. It seems like the new Control Center widgets can be sourced from either the iPhone or directly on the Watch. Can we control whether a control appears in the watch list, or will it always be a combination of all controls from both sources?
iPhone controls will be automatically available on the companion Apple Watch, even if they don't have an associated watchOS app. When an iPhone control is tapped on the Apple Watch, the action is performed on the iPhone. Controls whose actions foreground the iOS app will not appear on Apple Watch. If a watchOS app has controls, no controls from the companion iOS app will appear on Apple Watch.

9. From a UI/UX perspective, what are the current practices for designing watchOS apps that feel native?
The WWDC23 session Design and build apps for watchOS 10 covers the details of watchOS design principles and how to apply them in your app using SwiftUI. Many SwiftUI APIs, such as NavigationSplitView, the vertical tab view, and list views, already implement the look and feel native to watchOS.

10. When adopting the new design system on watchOS, it seems like the main place we will use the glass effect is for our buttons in the toolbar? Standard buttons in system apps seem to continue to use a flat, full-width appearance.
We leave the choice to you – you can use the new GlassButtonStyle API or .buttonStyle(.glass) to apply the Liquid Glass material to buttons. Learn when to use the Liquid Glass styles in Get to know the new design system.

11. Is there any way to gracefully migrate extensions when their bundle IDs have to change? For example, converting a multi-target watch app to single-target, which drops the .watchkitextension from both the app and WidgetKit extension bundle IDs.
Updating a watchOS app to single-target is covered in Technote TN3157: Updating your watchOS project for SwiftUI and WidgetKit. Xcode provides a tool that can do the update automatically, and the technote describes how to use it and how to clean up the project after the automatic update. If there's something that technote doesn't address, please reach out to us on the Developer Forums.

12. What is the status of Watch Connectivity? Is that still the preferred way for iOS + watchOS communications?
The Watch Connectivity framework is still supported, and it is appropriate for communication between a watchOS app and its companion iOS app. The system also provides other APIs for apps to exchange data. For example, watchOS supports the Apple Push Notification service (APNs). If data for your widget changes on your server, your widget can receive a WidgetKit push notification and update accordingly. That's the preferred mechanism for widget updates.
Replies: 0 · Boosts: 0 · Views: 62 · Activity: 2w
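As a small illustration of item 10 above, here is a hedged SwiftUI sketch of applying the Liquid Glass style to a toolbar button using the .buttonStyle(.glass) API named in the answer; the view content, toolbar placement, and symbol are placeholders, not part of the lab answer.

import SwiftUI

// Sketch: a toolbar button opting into the Liquid Glass material via the
// .buttonStyle(.glass) API mentioned in the lab answer (new design system).
struct NowPlayingView: View {
    var body: some View {
        NavigationStack {
            Text("Now Playing")
                .toolbar {
                    ToolbarItem(placement: .topBarTrailing) {
                        Button("Favorite", systemImage: "heart") {
                            // Placeholder action.
                        }
                        .buttonStyle(.glass)
                    }
                }
        }
    }
}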
A Summary of the WWDC25 Group Lab - watchOS (Part 1)
At WWDC25 we launched a new type of Lab event for the developer community - Group Labs. A Group Lab is a panel Q&A designed for a large audience of developers. Group Labs are a unique opportunity for the community to submit questions directly to a panel of Apple engineers and designers. Here are the highlights from the WWDC25 Group Lab for watchOS (part 1).

1. I'm really excited about the new design system on all platforms. Liquid Glass is super cool. What do developers need to keep in mind when building for watchOS 26?
To adopt the new design system, start with updating your app for watchOS 10 – if you have done so, your app will be mostly ready for watchOS 26. For more information, see Design and build apps for watchOS 10. You can then look into Liquid Glass specific APIs to fine-tune your app; this topic is covered in Adopting Liquid Glass. If you have SwiftUI views using any custom style, make sure they are still legible and fit with the new design system.

2. Something that really stood out to me were updates to the Smart Stack, with the system prioritizing widgets when they're most relevant. Tell me more about these new opportunities for apps.
Workout apps that record workouts using HealthKit may be automatically suggested on the watch face and appear in the Smart Stack without adding a widget. Relevant widgets are a great way to present information related to a date, location, point-of-interest type, sleep schedule, or fitness condition in the Smart Stack when it is relevant. Relevant widgets don't need to display an empty state view when they are not relevant; they are only shown in the Smart Stack when relevant. The watchOS 26 Design ToolKit in the Apple Design Resources includes a set of templates that you can use to lay out your widgets.

3. Is the Wrist Flick gesture available to developers in the same way as Double Tap is?
The system uses Wrist Flick to dismiss notifications and incoming calls, silence timers and alarms, or return to the watch face. There is no separate API for the Wrist Flick gesture. Apps that use XCUIAutomation to make sure their user interface behaves as intended can use XCUIDeviceHandGesture.flick to automate tests that verify the app responds appropriately to the Wrist Flick gesture. Similarly, XCUIDeviceHandGesture.doubleTap can be used to automate testing of the app with the Double Tap gesture. See XCUIDevice.perform(handGesture:).

4. Can HRV measurements be triggered on demand via API in watchOS? Are there guidelines or processes for enabling energy-intensive biometric sampling on development devices for IRB-approved research?
You don't have direct control over the sampling rate in watchOS. You can use HealthKit (HKQuantityTypeIdentifierHeartRateVariabilitySDNN, to be specific) to query the HRV data once the system has sampled and persisted the data to HealthKit. If that doesn't help, we suggest that you file a feedback report with your concrete use case for us to investigate. Specific to IRB-approved research using Apple Watch or its companion iPhone, you might want to look at this FAQ and SensorKit to see if they can be of any help.

5. What is the best advice for someone who is new to making a watchOS app for something that's been on iOS and iPadOS?
You can start with exploring the system experience features on watchOS, such as notifications, controls, and widgets, and getting familiar with the system spaces, like the Smart Stack, watch face, and Control Center. Knowing the watchOS app design principles and practices is important as well; Design and build apps for watchOS 10 is a great resource for this topic. SwiftUI is an amazing cross-platform framework, and you will use it to create your watchOS app. If you're already using it, great! Keep in mind some watch-only constraints: compared to an iPhone or iPad, Apple Watch has a limited battery and a smaller screen, which significantly impacts how people use your app and how your app works.

6. Was there any extension this year to the 7-day limit on querying Apple Health data on the watch?
There is no change to the limit this year. You can get the official limit at runtime using earliestPermittedSampleDate. There are some exceptions, so don't be surprised if you see that some data types are retained longer. The companion iPhone holds the full set of health data. If you need to access health data that has been purged from the Apple Watch, consider doing it in your iOS app and then passing the result to your watchOS app.
Replies: 0 · Boosts: 0 · Views: 78 · Activity: 2w
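As a companion to answer 4 above, here is a hedged HealthKit sketch that reads back HRV (SDNN) samples the system has already persisted; it assumes read authorization for the HRV type has been granted and does not (and cannot) trigger sampling on demand.

import HealthKit

// Query the most recent HRV (SDNN) samples already persisted to HealthKit.
func fetchRecentHRV(from store: HKHealthStore) {
    let hrvType = HKQuantityType(.heartRateVariabilitySDNN)
    let newestFirst = NSSortDescriptor(key: HKSampleSortIdentifierEndDate, ascending: false)
    let query = HKSampleQuery(sampleType: hrvType,
                              predicate: nil,
                              limit: 20,
                              sortDescriptors: [newestFirst]) { _, samples, _ in
        for sample in (samples as? [HKQuantitySample]) ?? [] {
            let sdnn = sample.quantity.doubleValue(for: .secondUnit(with: .milli))
            print("HRV SDNN \(sdnn) ms at \(sample.endDate)")
        }
    }
    store.execute(query)
}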
How to get a phone into a state where it's possible to test text filtering?
I'm currently finding it impossible to get a text filtering extension to be invoked when there's an incoming text message. There isn't a problem with the app/extension, because this is the same app and code that was already developed, tested, and unchanged since I last observed it working. I know that if there's any history of the incoming number being "known" then the extension won't get invoked, and I used to find this no hindrance to testing, provided that:
- the incoming number isn't in contacts
- there are no outgoing messages to that number
- there are no outgoing phone calls to the number.

This always used to work in the past, but not anymore. I've ensured the incoming text's number isn't in contacts; in fact I've deleted all the contacts. I've deleted the entire phone history, incoming and outgoing, and I've also searched in Messages and made sure there are no interactions with that number. There's logging in the extension, so I can see it's being invoked when turned on from the Settings app, but it's not getting invoked when there's a message. The one difference between now and when I used to have no problem with this: the phone now has iOS 18.5 on it. It's as if, in iOS 18.5, once there was ever any past association with a text number, it's not possible to remove that association. Has there been some known change in 18.5 that would affect this filtering behavior and prevent the phone from forgetting the incoming message's number as being "known"?

Update: I completely reset the phone, and then I was able to see the message filter extension being invoked. That's not an ideal situation, though. What else needs to be done, beyond what I mentioned above, to get a phone to forget about a message's number and thus get a message filtering extension to be invoked when there's a message from that number?
Replies: 0 · Boosts: 0 · Views: 129 · Activity: 2w
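For reference, a bare-bones IdentityLookup handler like the sketch below is the code path that should run for messages from unknown senders; the system rule that known senders bypass the extension entirely is exactly the behavior the post above is fighting.

import IdentityLookup

// Skeleton message-filter query handler. This only runs for senders the system
// considers unknown, which is why any lingering "known" association prevents invocation.
final class MessageFilterExtension: ILMessageFilterExtension, ILMessageFilterQueryHandling {
    func handle(_ queryRequest: ILMessageFilterQueryRequest,
                context: ILMessageFilterExtensionContext,
                completion: @escaping (ILMessageFilterQueryResponse) -> Void) {
        let response = ILMessageFilterQueryResponse()
        response.action = .none   // let the message through; real logic would classify it
        completion(response)
    }
}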
Detecting When App Token Is Removed Due to Category Selection in FamilyActivityPicker
I'm working with the Screen Time API (FamilyActivityPicker) in SwiftUI and need help with a specific scenario. I'm using the FamilyActivityPicker to let users select apps and categories to block. I save the previous selection (both applicationTokens and categoryTokens) locally. When the user updates their selection, I compare the new selection with the saved one to determine which apps or categories were removed. However, I’m trying to handle a specific case: when an individual app token is removed from the selection because its entire category was selected instead. In this situation, even though the app is no longer in applicationTokens, it's still blocked due to its category being included in categoryTokens. Since I need to show users which apps were actually removed, I want to avoid listing apps that are still indirectly blocked via a selected category. I’ve created a mapping between ApplicationToken and FamilyActivityCategoryToken to check whether a removed app is still covered by a selected category before displaying it. Is there any way to check this using the current Screen Time APIs, or does the system not give access to the relationship between apps and their categories? Any help or suggestions would mean a lot!
Replies: 2 · Boosts: 1 · Views: 118 · Activity: 2w
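A hedged sketch of the diff described above: the Screen Time APIs do not expose which category an ApplicationToken belongs to, so the category lookup below is a hypothetical app-maintained mapping (as the poster is already doing), passed in as a closure.

import FamilyControls
import ManagedSettings

// Returns the apps removed from the selection that are not still covered by a
// category that remains selected. `categoryFor` is a hypothetical app-maintained
// mapping; the system provides no app-to-category relationship.
func trulyRemovedApps(previous: FamilyActivitySelection,
                      current: FamilyActivitySelection,
                      categoryFor: (ApplicationToken) -> ActivityCategoryToken?) -> Set<ApplicationToken> {
    let removed = previous.applicationTokens.subtracting(current.applicationTokens)
    return removed.filter { token in
        guard let category = categoryFor(token) else { return true }
        return !current.categoryTokens.contains(category)
    }
}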
Translation API not working without the Apple Translate app installed
Hello, my app depends on the Translation framework (iOS 17.4+), but I've found that it doesn't work unless the Apple Translate app is installed on the device. After I deleted Apple's Translate app, I started getting the following errors:

Optional(Foundation.Locale.Language(components: Foundation.Locale.Language.Components(languageCode: Optional(en), script: nil, region: Optional(GB))))
Optional(Foundation.Locale.Language(components: Foundation.Locale.Language.Components(languageCode: Optional(es), script: nil, region: Optional(ES))))
Error sending 1 paragraphs Error Domain=TranslationErrorDomain Code=16 "Translation failed" UserInfo={NSLocalizedDescription=Translation failed, NSLocalizedFailureReason=Offline models not available for language pair}
Failed to translate input 0; returning error: Error Domain=TranslationErrorDomain Code=16 "Translation failed" UserInfo={NSLocalizedDescription=Translation failed, NSLocalizedFailureReason=Offline models not available for language pair}
Received unbridged NSError to API, converting to `.internalError`: Error Domain=TranslationErrorDomain Code=16 "Translation failed" UserInfo={NSLocalizedDescription=Translation failed, NSLocalizedFailureReason=Offline models not available for language pair}
TranslationError(cause: Translation.TranslationError.Cause.internalError, sourceLanguage: nil, targetLanguage: nil)

This example is from trying to translate text from English to Spanish, and I was receiving the error even though I have the dictionaries downloaded. Once I reinstalled Apple's Translate app, it started working again. This sadly means that users of my app must have the factory Translate app installed, otherwise they won't be able to use my app. Some people choose to delete the factory apps. Why is the framework not available then? :(
Replies: 1 · Boosts: 0 · Views: 33 · Activity: 2w
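A hedged sketch, assuming the iOS 18 Translation APIs rather than the iOS 17.4 presentation modifier, of checking whether the offline model for the failing language pair is installed; that is the condition the "Offline models not available for language pair" error above refers to.

import SwiftUI
import Translation

// Check offline-model availability for the en-GB -> es-ES pair from the logs above.
struct AvailabilityCheckView: View {
    @State private var status = "checking…"

    var body: some View {
        Text("en-GB → es-ES: \(status)")
            .task {
                let availability = LanguageAvailability()
                let source = Locale.Language(identifier: "en-GB")
                let target = Locale.Language(identifier: "es-ES")
                switch await availability.status(from: source, to: target) {
                case .installed: status = "offline model installed"
                case .supported: status = "supported, model not downloaded"
                case .unsupported: status = "pair not supported"
                @unknown default: status = "unknown"
                }
            }
    }
}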
The menu can't be shown from a background process in macOS 26 (beta)
After I upgraded to macOS 26 (beta), my program caused the system to pop up a window as shown in the attached picture. My application is a process with only a tray icon. I found that my tray icon is not displayed in the current version, even though I clicked the "Always Allow" button. Here are my questions:
1. Will this related feature remain consistent in the official release?
2. How can I create a command-line process that only displays a system tray icon (no main window), like Alfred?
Replies: 2 · Boosts: 1 · Views: 59 · Activity: 2w
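For the second question above, a minimal AppKit sketch of a status-item-only app is below; it assumes LSUIElement (Application is agent) is set to YES in Info.plist so there is no Dock icon or main window. Whether the new macOS 26 background-item prompt changes this flow is exactly what the post is asking.

import AppKit

// Status-bar-only app: no main window, just a tray icon with a menu.
final class AppDelegate: NSObject, NSApplicationDelegate {
    private var statusItem: NSStatusItem?

    func applicationDidFinishLaunching(_ notification: Notification) {
        let item = NSStatusBar.system.statusItem(withLength: NSStatusItem.variableLength)
        item.button?.title = "◎"   // placeholder; a template image is more typical
        let menu = NSMenu()
        menu.addItem(NSMenuItem(title: "Quit",
                                action: #selector(NSApplication.terminate(_:)),
                                keyEquivalent: "q"))
        item.menu = menu
        statusItem = item
    }
}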
Our customers' calendar events have disappeared
Our app provides a calendar that integrates with the default calendar app. Specifically, we use iOS EventKit to perform CRUD operations on calendar data. Recently, we have received reports from users that all of their events have disappeared. However, after reviewing our implementation and logs, we have not been able to identify the cause. Some users have also reported that all data in their default calendar app has disappeared as well. Does anyone have any idea what might be causing this? To delete an event within our app, users must press the delete button and then confirm the deletion in a dialog. Additionally, it is not possible to delete more than two events at once. We've seen many people in the community discussing a bug where calendar events disappear after updating to iOS 18. If you have any information about when or why this happens, we'd appreciate it if you could share your insights.
Replies: 0 · Boosts: 3 · Views: 69 · Activity: 2w
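Not a fix, but a small diagnostic sketch that can help narrow reports like the one above: enumerating the last month of events straight from EventKit shows whether the data is gone from the event store itself or only from the app's own query path. It assumes the iOS 17 full-access request API.

import EventKit

// Dump the past month of events from all calendars for diagnosis.
func dumpRecentEvents() async throws {
    let store = EKEventStore()
    guard try await store.requestFullAccessToEvents() else { return }
    let start = Calendar.current.date(byAdding: .month, value: -1, to: .now)!
    let predicate = store.predicateForEvents(withStart: start, end: .now, calendars: nil)
    for event in store.events(matching: predicate) {
        print(event.calendar.title, event.title ?? "(untitled)", event.startDate ?? Date())
    }
}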
FamilyControls Framework Not Working for TestFlight Testers
Hello everyone, I'm developing an app using the FamilyControls framework, which I distributed through TestFlight the other day using the "Family Controls" distribution entitlement (not Development). Everything works as expected in dev builds, but for external TestFlight testers, nothing in the FamilyControls framework seems to function. I'm using the correct Family Controls capability in Xcode (added via Signing & Capabilities), and the com.apple.developer.family-controls entitlement is present in my .entitlements file. All the users who reported the issue had correctly granted Screen Time permission to the app. I would really appreciate some help figuring out where the issue could come from.
Replies: 1 · Boosts: 0 · Views: 104 · Activity: 2w
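One thing worth confirming in the TestFlight build described above is that the authorization request itself succeeds; a minimal sketch follows. In distribution builds this only works when the App ID carries the distribution Family Controls entitlement granted by Apple, so a failure here points at signing rather than code.

import FamilyControls

// Request individual Screen Time authorization and log the resulting status.
func requestScreenTimeAuthorization() async {
    do {
        try await AuthorizationCenter.shared.requestAuthorization(for: .individual)
        print("FamilyControls status:", AuthorizationCenter.shared.authorizationStatus)
    } catch {
        print("FamilyControls authorization failed:", error)
    }
}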
Access resource in swift package from xcframework
I have an iOS app that includes a local Swift package. This Swift package contains some .plist files added as resources. The package also depends on an XCFramework. I want to read these .plist files from within the XCFramework. What I’d like to know is: Is this a common or recommended approach—having resources in a Swift package and accessing them from an XCFramework? Previously, I had the .plist files added directly to the main app target, and accessing them from the XCFramework felt straightforward. With the new setup, I’m trying to determine whether this method (placing resources in a Swift package and accessing them from an XCFramework) is considered good practice. For context: I am currently able to read the .plist files from the XCFramework by passing Bundle.module through one of the APIs exposed by the XCFramework.
Replies: 3 · Boosts: 1 · Views: 111 · Activity: 3w
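The pattern described in the post above, sketched under assumptions: the package owns the resource and hands its Bundle.module to the XCFramework, which only ever sees an opaque Bundle. The Config.plist name and the loadConfiguration(from:) entry point are hypothetical placeholders.

import Foundation

// XCFramework side: load a plist from whatever bundle the caller provides.
public func loadConfiguration(from bundle: Bundle) -> [String: Any]? {
    guard let url = bundle.url(forResource: "Config", withExtension: "plist"),
          let data = try? Data(contentsOf: url) else { return nil }
    return (try? PropertyListSerialization.propertyList(from: data, format: nil)) as? [String: Any]
}

// Swift package side (owns the resource):
// let config = loadConfiguration(from: Bundle.module)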
No such module 'JournalingSuggestions'
I followed this tutorial to add the JournalingSuggestions API, but it keeps showing me No such module 'JournalingSuggestions'. How can I fix this?

import SwiftUI
import JournalingSuggestions

struct ContentView: View {
    @State var suggestionTitle: String? = nil

    var body: some View {
        VStack {
            JournalingSuggestionsPicker {
                Text("Select Journaling Suggestion")
            } onCompletion: { suggestion in
                suggestionTitle = suggestion.title
            }
            Text(suggestionTitle ?? "")
        }
        .padding()
    }
}

#Preview {
    ContentView()
}
Replies: 2 · Boosts: 0 · Views: 39 · Activity: 3w
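"No such module" errors with JournalingSuggestions usually come down to the build destination not providing the framework at all (it is an iPhone-only framework requiring iOS 17.2 or later, plus the Journaling Suggestions capability). A hedged workaround while sorting that out is to guard the import so the rest of the target still builds on other destinations; the picker call is taken from the post above.

// Build-time guard: only pull in JournalingSuggestions where the SDK provides it.
#if canImport(JournalingSuggestions)
import JournalingSuggestions
import SwiftUI

struct SuggestionPickerView: View {
    @State private var suggestionTitle: String?

    var body: some View {
        JournalingSuggestionsPicker {
            Text("Select Journaling Suggestion")
        } onCompletion: { suggestion in
            suggestionTitle = suggestion.title
        }
    }
}
#endif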
iOS magnetometer data processing
Hello, I’m developing an app to detect movement past a strong magnet, targeting both Android and iOS. On Android, I’m using the Sensor API, which provides calibrated readings with temperature compensation, factory (or online) soft-iron calibration, and online hard-iron calibration. The equivalent on iOS appears to be the CMCalibratedMagneticField data from the CoreMotion framework. However, I’m encountering an issue with the iOS implementation. The magnetometer data on iOS behaves erratically compared to Android. While Android produces perfectly symmetric peaks, iOS shows visual peaks that report double the magnetic field strength. Additionally, there’s a "pendulum" effect: the field strength rises, drops rapidly, rises again to form a "double peak" structure, and takes a while to return to the local Earth magnetic field average. The peaks on iOS are also asymmetric. I’m wondering if this could be due to sensor fusion algorithms applied by iOS, which might affect the CMCalibratedMagneticField data. Are there other potential reasons for this behavior? Any insights or suggestions would be greatly appreciated. Thank you!
Replies: 0 · Boosts: 0 · Views: 20 · Activity: 3w
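One way to see how much of the behavior described above comes from the fusion/calibration stage is to log the raw magnetometer samples alongside the calibrated field and compare the two traces as the device passes the magnet; a hedged Core Motion sketch follows (update intervals are illustrative).

import CoreMotion

let motionManager = CMMotionManager()

// Log raw vs. calibrated magnetic field so the two traces can be compared.
func startMagneticFieldLogging() {
    motionManager.magnetometerUpdateInterval = 0.01
    motionManager.startMagnetometerUpdates(to: .main) { data, _ in
        guard let f = data?.magneticField else { return }
        print("raw", f.x, f.y, f.z)
    }

    motionManager.deviceMotionUpdateInterval = 0.01
    motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZVertical, to: .main) { data, _ in
        guard let dm = data else { return }
        let f = dm.magneticField.field
        print("calibrated", f.x, f.y, f.z, "accuracy", dm.magneticField.accuracy.rawValue)
    }
}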
Bug in Screen Time API: familyActivityPicker dismisses a presenting sheet on iOS 18.4 and above
Hello, I'm presenting the familyActivityPicker from a presented sheet in my application. When I select some apps, categories, or websites and tap "Done", the familyActivityPicker is dismissed, but the presenting sheet is also dismissed on iOS 18.4, iOS 18.5, and iOS 26 beta 1 and 2. If I tap "Cancel" in the familyActivityPicker, the sheet is also dismissed on those same versions. The same code works perfectly fine on iOS 18.0, iOS 18.1, iOS 18.2, and iOS 18.3. Is this a known issue? I opened feedback FB18369821 for this. Regards, Axel
Replies: 2 · Boosts: 0 · Views: 54 · Activity: 3w
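For anyone trying to reproduce the report above, a minimal sketch of the setup (a picker presented from a view that is itself inside a sheet) looks roughly like this:

import SwiftUI
import FamilyControls

// Content shown inside an already-presented sheet; tapping the button presents
// the familyActivityPicker on top of it.
struct BlockListSheet: View {
    @State private var showPicker = false
    @State private var selection = FamilyActivitySelection()

    var body: some View {
        Button("Choose apps to block") { showPicker = true }
            .familyActivityPicker(isPresented: $showPicker, selection: $selection)
    }
}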
NSUserDefaults issue after restarting my iPhone
Hi guys, there's an issue with NSUserDefaults in my app. Everything is all right while I stay in the app, but once I close the app and restart it, the data from NSUserDefaults is gone (nil). I tried adding and removing the synchronize call, but it doesn't help. This situation only happens on iOS 18 (at least iOS 12 and iOS 16 are fine).
Replies: 3 · Boosts: 0 · Views: 88 · Activity: 3w
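For reference, the pattern in question is just the standard UserDefaults round trip sketched below; calling synchronize() has been unnecessary for years, so its presence or absence should not change persistence.

import Foundation

// Write on one launch, read after relaunch; no synchronize() call required.
func saveGreeting(_ text: String) {
    UserDefaults.standard.set(text, forKey: "greeting")
}

func loadGreeting() -> String? {
    UserDefaults.standard.string(forKey: "greeting")
}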
Quick Look Preview Extension works on macOS but not iOS
Hi, I have a document-based SwiftUI multiplatform app, where the document is saved as JSON. Obviously I don't want Quick Look to show the JSON of my file, so I made a Quick Look Preview extension for each platform. The macOS one works… okay, sometimes it's tricky to test and I need to use qlmanage to empty the cache or to show the preview, but it does work. I can even debug it. But the iOS one just never seems to be run. If I show Quick Look in the Files app on iOS, or if I AirDrop a file from my app to my iPhone, it shows as JSON, with an option to open it in my app. If I run the iOS Preview Extension in the debugger and launch the Files app, and then try to use Quick Look in a file there, it shows the JSON and the debugger just stays in the state 'Waiting to Attach'. The preview extensions are not data based; in both of them I have implemented func preparePreviewOfFile(at url: URL) async throws in PreviewViewController.swift. Pretty much the same code except one is using a UIHostingController and the other is using an NSHostingController. The only difference in the Info.plists for the two extensions is that the iOS one uses NSExtensionMainStoryboard while the macOS one uses NSExtensionPrincipalClass. This is how they were set up when I created them from the relevant Quick Look Preview Extension templates. I made a sample project with a much simpler document, UI, etc. and I have the same issue there. The macOS preview works, the iOS one never runs. I have checked that the correct preview extension is embedded in the target for each OS (under Embed Foundation Extensions in the Build Phases) Is there anything I need to do differently in iOS, or anything I might have inadvertently got wrong? Is there a way to run something similar to qlmanage on iOS, since that sometimes seems to help on macOS? Incidentally, I have also added Quick Look Thumbnail extensions, and they work on both platforms.
Replies: 2 · Boosts: 0 · Views: 104 · Activity: 3w
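For comparison, here is a hedged sketch of the iOS preview controller described above (the UIHostingController variant of preparePreviewOfFile(at:)); MyDocument and DocumentPreview are stand-ins for the app's own JSON model and SwiftUI view, not real types from the post.

import QuickLook
import SwiftUI
import UIKit

struct MyDocument: Decodable { var title: String }   // stand-in for the app's real model

struct DocumentPreview: View {                        // stand-in for the app's real preview UI
    let document: MyDocument
    var body: some View { Text(document.title).padding() }
}

// iOS Quick Look preview controller: decode the JSON document and host a SwiftUI preview.
final class PreviewViewController: UIViewController, QLPreviewingController {
    func preparePreviewOfFile(at url: URL) async throws {
        let data = try Data(contentsOf: url)
        let document = try JSONDecoder().decode(MyDocument.self, from: data)
        let host = UIHostingController(rootView: DocumentPreview(document: document))
        addChild(host)
        host.view.frame = view.bounds
        host.view.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(host.view)
        host.didMove(toParent: self)
    }
}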
Delays When Creating Advanced App Clip Experiences for Other Businesses
Hey there, I have an app where I create custom Advanced App Clip Experiences for other businesses, which seems to be a valid use case. I create them via the API. Upon creation everything looks fine: when I go to App Store Connect -> App -> Advanced App Clip Experiences, I see the new App Clip Experience I've just created. Its status is Received (like any other active experience) and it has a custom URL. The issue is the unpredictable timing of when the Advanced App Clip Experience actually becomes available on the iPhone (i.e., can be triggered via App Clip Code, etc.). Some experiences become available literally immediately, but others take days (some 1-2 days, some ~5 days). I'm not sure why there's such a big difference in how long an Advanced App Clip Experience takes to actually become active. Does anyone have any experience with this? I don't change domain settings, the app's settings, etc. I'm just creating a new experience (either via the API or manually in App Store Connect), and I get different "activation" times for different App Clips. Similarly, when I delete an Advanced App Clip Experience, it will still be available for the next couple of days. I get that there might be caching involved, but the difference is quite huge and makes no sense, since, as I've mentioned, some clips become available immediately while some take days. Thank you!
Replies: 0 · Boosts: 0 · Views: 43 · Activity: 3w
WeatherKit failing on JWT Error
My app AirCompare has been in the App Store and successfully using WeatherKit to fetch weather since it became available. Now some (not all) users are encountering the following errors:

Failed to generate jwt token for: com.apple.weatherkit.authservice with error: Error Domain=WeatherDaemon.WDSJWTAuthenticatorServiceListener.Errors Code=2 "(null)"
Encountered an error when fetching weather data subset; location=<+42.40865786,-88.96911526> +/- 0.00m (speed -1.00 mps / course -1.00) @ 6/23/25, 2:56:47 PM Central Daylight Time, error=WeatherDaemon.WDSJWTAuthenticatorServiceListener.Errors 2 Error Domain=WeatherDaemon.WDSJWTAuthenticatorServiceListener.Errors Code=2 "(null)"

Others are reporting this same problem here in the forums. We need a solution!
Replies: 5 · Boosts: 4 · Views: 112 · Activity: 3w
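For reference, the failing call path in the report above is the ordinary WeatherService fetch sketched below (coordinates taken from the log); the JWT generation happens inside the system weather daemon, so from the app's side the failure only surfaces as a thrown error.

import WeatherKit
import CoreLocation

// Ordinary WeatherKit fetch; the auth/JWT failures reported above surface here
// as a thrown error rather than anything the app can handle directly.
func fetchCurrentWeather() async {
    let location = CLLocation(latitude: 42.40865786, longitude: -88.96911526)
    do {
        let weather = try await WeatherService.shared.weather(for: location)
        print("Temperature:", weather.currentWeather.temperature)
    } catch {
        print("WeatherKit error:", error)
    }
}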