Hello,
I have an app for which I have enabled the VoIP entitlement and implemented all the CallKit and PushKit registration and delegate code.
I can successfully get a VoIP token.
I can successfully send VoIP push notifications (and receive them via the PushKit delegate function) and then report an incoming call via CallKit, but only while my app is in the foreground.
I have checked the entitlement in Xcode and the Info.plist directly, and they both (as expected) show voip as a background mode.
The VoIP notification is being sent via AWS SNS and everything works while the app is in the foreground. I cannot understand why the app is not waking up while in the background.
This is the VoIP notification sent via SNS:
aps: {
  alert: "Intercom call",
  "content-available": 1
}
SNS Message Attributes:
'AWS.SNS.MOBILE.APNS.TOPIC': {
  DataType: 'String',
  StringValue: `${bundleId}`
},
'AWS.SNS.MOBILE.APNS.PUSH_TYPE': {
  DataType: 'String',
  StringValue: 'voip'
},
'AWS.SNS.MOBILE.APNS_VOIP.TTL': {
  DataType: 'String',
  StringValue: '0'
},
'AWS.SNS.MOBILE.APNS_VOIP_SANDBOX.TTL': {
  DataType: 'String',
  StringValue: '0'
},
'AWS.SNS.MOBILE.APNS.PRIORITY': {
  DataType: 'String',
  StringValue: '10'
}
As I say,
func pushRegistry(_ registry: PKPushRegistry, didReceiveIncomingPushWith payload: PKPushPayload, for type: PKPushType) async
works correctly when in the foreground. I cannot see any reason why it would not work from the background.
I am also receiving normal remote notifications correctly foreground and background.
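For reference, this is roughly how the token registration and the incoming push are handled on the app side (a simplified sketch; the provider configuration and caller details shown here are illustrative, not the app's exact code):

import PushKit
import CallKit

final class VoIPPushHandler: NSObject, PKPushRegistryDelegate {
    // Illustrative provider; the real app configures and retains its CXProvider (and delegate) elsewhere.
    private let provider = CXProvider(configuration: CXProviderConfiguration())
    private var registry: PKPushRegistry?

    func registerForVoIPPushes() {
        let registry = PKPushRegistry(queue: nil)
        registry.delegate = self
        registry.desiredPushTypes = [.voIP]
        self.registry = registry
    }

    func pushRegistry(_ registry: PKPushRegistry, didUpdate pushCredentials: PKPushCredentials, for type: PKPushType) {
        // Send pushCredentials.token to the backend that publishes to SNS.
    }

    func pushRegistry(_ registry: PKPushRegistry, didReceiveIncomingPushWith payload: PKPushPayload, for type: PKPushType) async {
        // On iOS 13+ every VoIP push must be reported to CallKit before this method returns.
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic, value: "Intercom")
        do {
            try await provider.reportNewIncomingCall(with: UUID(), update: update)
        } catch {
            // Handle the failure (e.g. the call was blocked or restricted by the system).
        }
    }
}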
Hello everyone,
I’m developing a VoIP-based application that supports both standard VoIP calls and Push-To-Talk (PTT) calls. The app does not use the unrestricted-voip entitlement since it’s not publicly documented or communicated as a standard by Apple.
Previously, I handled PTT calls using CallKit after receiving PushKit notifications, but I’m now migrating PTT functionality to the PushToTalk Framework while keeping CallKit for standard VoIP calls. I’m facing a few challenges that I’d like help with:
Handling Incoming Push-To-Talk Calls When the App Is Closed and the Device Is Locked
I considered continuing to use PushKit notifications and CallKit to alert users until the user brings the app into the foreground, at which point I'd switch to the PushToTalk Framework. While this could technically work, the user experience is not ideal. Are there any recommended approaches for handling PTT calls in this state?
Handling Incoming PTT Calls When the App Is in the Background
According to Apple documentation, I cannot join a PTT session unless my app is in the foreground. However, in practical scenarios, we often receive incoming PTT calls while the app is in the background. What’s the best solution for this situation? It feels odd to show notifications or use CallKit until the app is foregrounded.
Conflict Between Ongoing PushToTalk Call and Incoming Cellular Call
Currently, if there's an ongoing PushToTalk call using the PTT framework and a cellular call comes in, then receiving a PTT transmission and calling requestBeginTransmission ends the cellular call. I can handle this within my app, but is this expected behavior? Is this the intended conflict management for concurrent PTT and cellular calls?
Lastly, a broader question: when will the unrestricted-voip entitlement stop working? I’m contemplating using this entitlement to handle incoming PTT calls without CallKit, but I’m concerned about its longevity. Some apps have been using it for messaging and other features for over four years, and it’s still functional for them.
Any guidance or insights on these points would be greatly appreciated!
Thanks in advance!
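For context, the PushToTalk side of the migration currently looks roughly like this (a simplified sketch based on my reading of the framework; the channel name, the "speaker" payload key, and the join flow are illustrative):

import PushToTalk
import AVFAudio
import UIKit

final class PTTManager: NSObject, PTChannelManagerDelegate, PTChannelRestorationDelegate {
    private var channelManager: PTChannelManager?

    // Must run early (e.g. at launch) so the system can deliver PTT pushes.
    func setUp() async throws {
        channelManager = try await PTChannelManager.channelManager(delegate: self, restorationDelegate: self)
    }

    func join(channelUUID: UUID) {
        let descriptor = PTChannelDescriptor(name: "Dispatch", image: nil)
        channelManager?.requestJoinChannel(channelUUID: channelUUID, descriptor: descriptor)
    }

    // Delivered for an incoming PTT push, including while the app is in the background.
    func incomingPushResult(channelManager: PTChannelManager, channelUUID: UUID, pushPayload: [String: Any]) -> PTPushResult {
        let speaker = PTParticipant(name: pushPayload["speaker"] as? String ?? "Unknown", image: nil)
        return .activeRemoteParticipant(speaker)
    }

    func channelManager(_ channelManager: PTChannelManager, receivedEphemeralPushToken pushToken: Data) {
        // Upload the PTT push token to the server.
    }

    // Remaining delegate requirements, trimmed to empty bodies for brevity.
    func channelManager(_ channelManager: PTChannelManager, didJoinChannel channelUUID: UUID, reason: PTChannelJoinReason) {}
    func channelManager(_ channelManager: PTChannelManager, didLeaveChannel channelUUID: UUID, reason: PTChannelLeaveReason) {}
    func channelManager(_ channelManager: PTChannelManager, channelUUID: UUID, didBeginTransmittingFrom source: PTChannelTransmitRequestSource) {}
    func channelManager(_ channelManager: PTChannelManager, channelUUID: UUID, didEndTransmittingFrom source: PTChannelTransmitRequestSource) {}
    func channelManager(_ channelManager: PTChannelManager, didActivate audioSession: AVAudioSession) {}
    func channelManager(_ channelManager: PTChannelManager, didDeactivate audioSession: AVAudioSession) {}

    func channelDescriptor(restoredChannelUUID channelUUID: UUID) -> PTChannelDescriptor {
        PTChannelDescriptor(name: "Dispatch", image: nil)
    }
}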
Hello. We are using the Twilio Video SDK and CallKit to report an incoming invite to join a video room. We noticed that on the iOS 18 beta, when accepting the incoming call invite using Bluetooth headphones while the application is in the foreground, the CallKit screen enters the foreground for a moment before going into the background.
On iOS 17, when the call invite is accepted, the application remains in the foreground and the CallKit screen is instantly sent to the background.
This is not reproducible when using Apple EarPods with a Lightning connector on iOS 18.
This seems like a minor change. But it would be nice to know if this is an intentional change or a possible issue.
Hello. We are using Twilio Video SDK and CallKit to report an incoming invite to join a video room.
On iOS 18 trying to decline a call invite via headphones (long pressing the accept button) doesn't actually result in the call being declined.
There seem to be different results depending on the device being used. When using Apple EarPods with Lightning connector or a Bluetooth JBL headset, the call is declined only on a second attempt. When using a Bluetooth Jabra BT2046 headset, the call gets accepted instead on the first decline attempt.
This issue is not reproducible on iOS 17.
Hello.
We are using the Twilio Video SDK and CallKit to report an incoming invite to join a video room. On the iOS 18 beta, when the user receives the incoming call invite and taps the CallKit decline button, nothing happens. The incoming call UI is still visible. Only after tapping the decline button a second time is the call invite actually ended.
The provider(CXProvider, perform: CXEndCallAction) method is called both times when tapping the decline button twice.
Strangely enough, it is also called when tapping the accept button on iOS 18.
This is working fine on iOS 17 and there haven't been any recent code changes in the application.
Did anyone else encounter a similar issue with CallKit on the iOS 18 beta?
Hello,
We are implementing an mVOIP service using CallKit.
I have a question.
When receiving a call with CallKit, the provider receives the CXEndCallAction callback after one minute, even though we didn't request this on our end.
Is this a policy from Apple?
If so, is it possible to modify this behavior, and are there any related APIs or documentation?
Thank you.
If an iPhone receives an incoming call with partial SIP content (for example, it contains a name but not an image, or vice versa), and there is an app enabled for Live Caller ID Lookup whose lookup supplies the data missing from the SIP (i.e. the lookup returns an image but not a name, or vice versa), could the OS combine the data from both sources, or is whatever is returned from the LCIDL the only thing displayed on the call screen? I suppose the latter is the case, but I just want to enquire to make sure.
Thank you
I can reproduce a bug where CallKit doesn't activate the audio session after an outgoing call is put on hold because of an incoming call.
VoIP calling with CallKit
Steps to reproduce:
1. Download the SpeakerBox example app from the link above and start it with Xcode
2. Start a new outgoing call
3. Call your phone from another phone
4. Hold and Accept the call
5. After a few secs, finish the call from the other phone
6. The outgoing call will still be on hold
7. Tap on the call and tap Toggle Hold
The call won't become active again because the audio session isn't activated.
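For reference, a Toggle Hold request like the one in step 7 is issued through a CXSetHeldCallAction, roughly as follows (a simplified sketch, not the sample's exact code):

import CallKit

let callController = CXCallController()

func toggleHold(for callUUID: UUID, onHold: Bool) {
    // CallKit is expected to reactivate the audio session when the call is taken
    // off hold; that reactivation is what never happens in the scenario above.
    let holdAction = CXSetHeldCallAction(call: callUUID, onHold: onHold)
    callController.request(CXTransaction(action: holdAction)) { error in
        if let error {
            print("Hold request failed: \(error)")
        } else {
            print("Requested transaction successfully")
        }
    }
}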
Logs:
Received provider(_:didDeactivate:)
Received provider(_:didDeactivate:)
Received provider(_:didDeactivate:)
Received provider(_:didDeactivate:)
Received provider(_:didDeactivate:)
Requested transaction successfully
Starting audio
Type: stdio
AURemoteIO.cpp:1162 failed: 561017449 (enable 3, outf< 1 ch, 44100 Hz, Float32> inf< 1 ch, 44100 Hz, Float32>)
Type: Error | Timestamp: 2024-08-15 12:20:29.949437+02:00 | Process: Speakerbox | Library: libEmbeddedSystemAUs.dylib | Subsystem: com.apple.coreaudio | Category: aurioc | TID: 0x19540d
AVAEInternal.h:109 [AVAudioEngineGraph.mm:1344:Initialize: (err = PerformCommand(*outputNode, kAUInitialize, NULL, 0)): error 561017449
Type: Error | Timestamp: 2024-08-15 12:20:29.949619+02:00 | Process: Speakerbox | Library: AVFAudio | Subsystem: com.apple.avfaudio | Category: avae | TID: 0x19540d
Couldn't start Apple Voice Processing IO: Error Domain=com.apple.coreaudio.avfaudio Code=561017449 "(null)" UserInfo={failed call=err = PerformCommand(*outputNode, kAUInitialize, NULL, 0)}
Type: Notice | Timestamp: 2024-08-15 12:20:29.949730+02:00 | Process: Speakerbox | Library: Speakerbox | TID: 0x19540d
Route change:
Type: Notice | Timestamp: 2024-08-15 12:20:30.167498+02:00 | Process: Speakerbox | Library: Speakerbox | TID: 0x19540d
ReasonUnknown
Type: Notice | Timestamp: 2024-08-15 12:20:30.167549+02:00 | Process: Speakerbox | Library: Speakerbox | TID: 0x19540d
Previous route:
Type: Notice | Timestamp: 2024-08-15 12:20:30.167568+02:00 | Process: Speakerbox | Library: Speakerbox | TID: 0x19540d
<AVAudioSessionRouteDescription: 0x302c00bc0,
inputs = (
"<AVAudioSessionPortDescription: 0x302c01330, type = MicrophoneBuiltIn; name = iPhone Mikrofon; UID = Built-In Microphone; selectedDataSource = (null)>"
);
outputs = (
"<AVAudioSessionPortDescription: 0x302c004d0, type = Receiver; name = Vev\U0151; UID = Built-In Receiver; selectedDataSource = (null)>"
)>
Type: Notice | Timestamp: 2024-08-15 12:20:30.167626+02:00 | Process: Speakerbox | Library: Speakerbox | TID: 0x19540d
With the Live Caller ID example server, the caller lookup dataset is defined in an input.txtpb file and processed by running a ConstructDatabase command, which creates a block.binpb and an identity.binpb file.
In other words, a static input file is being processed into static block and identity files.
However, in the real world, the data content for identified and blocked numbers is in a constant state of flux and evolution: new numbers become available, old ones become stale, numbers which were initially considered safe come to be considered malicious, etc.
Is the example server just that, merely an example using fixed datasets, while an actual production server is able to use live, ever-changing data to formulate its response to the iPhone OS query?
Here's a concrete use case: suppose it's a requirement to permit US NANP numbers but to block anything else. The total number of non-US NANP numbers is so large and ever-changing that it would be unfeasible to attempt to capture them in an input.txtpb file, process that, and then re-capture and re-process it endlessly. Instead, what would be required is the ability for the Live Caller ID server to evaluate at query time, using a regular expression for example, whether a number is NANP or not.
Is this possible?
The documentation for adding images for Live Caller ID specifies that they should be in .heic format and be less than 64KB.
However, the majority of the time they just don't display.
They would mostly display with iOS 18 beta 4, but with beta 5 they fail to display 90% of the time.
It seems there's some other factor at play, such as the pixel width/height or resolution density?
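For anyone comparing setups, here is a minimal sketch of one way to produce a HEIC under the 64KB limit using ImageIO (the quality loop and size target are illustrative, not necessarily how others generate theirs):

import ImageIO
import UniformTypeIdentifiers
import UIKit

// Encodes a UIImage as HEIC, lowering the quality until the data fits under maxBytes.
func heicData(for image: UIImage, maxBytes: Int = 64_000) -> Data? {
    guard let cgImage = image.cgImage else { return nil }
    for quality in stride(from: 0.8, through: 0.1, by: -0.1) {
        let data = NSMutableData()
        guard let destination = CGImageDestinationCreateWithData(data as CFMutableData, UTType.heic.identifier as CFString, 1, nil) else { return nil }
        let options = [kCGImageDestinationLossyCompressionQuality: quality] as CFDictionary
        CGImageDestinationAddImage(destination, cgImage, options)
        guard CGImageDestinationFinalize(destination) else { return nil }
        if data.length < maxBytes {
            return data as Data
        }
    }
    return nil
}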
Hello
Apps and their extensions are able to communicate with each other by reading/writing data stored in a shared group location.
However this isn't the case with the Live Caller ID extension - if data is written to group defaults (as opposed to standard defaults) by the app, then that data isn't readable by the Caller ID extension.
This has the consequence that it's not possible for a user to dynamically switch which dataset the extension connects to.
Consider the use case where the Live Caller ID server has one dataset where callers are not blocked and another where they are blocked; the Caller ID extension could then route different requests to different datasets based on the "user tier".
However, as the extension can't read data from the shared group, the app can't communicate user preferences to the extension, and therefore the switching isn't possible.
Is this by design or due to the immaturity of the feature? If it's by design, then the use case outlined above isn't possible, which greatly reduces the possible functionality of the Live Caller ID feature.
(It would be possible for the app to install multiple extensions, each of which connects to a different dataset by specifying a different user tier, but the user having to flip these on and off within the Settings app is a dreadful user experience.)
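For context, the pattern that fails here is the usual shared-defaults one that works with our other extensions (the group identifier and key are illustrative):

import Foundation

let groupDefaults = UserDefaults(suiteName: "group.com.example.myapp")

// In the app: record which dataset / user tier the user selected.
groupDefaults?.set("premiumTier", forKey: "selectedUserTier")

// In the Live Caller ID extension: this read comes back empty,
// even though the same code works in our other app extensions.
let selectedTier = groupDefaults?.string(forKey: "selectedUserTier")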
I am developing an app in Flutter and I need to implement iOS CallKit functionality to identify and store caller data within the app. I would like to know how to integrate CallKit into a Flutter application and what steps I need to follow to manage caller identification and data storage.
I have done some research and understand that I will need to write native Swift code to handle CallKit, but I am not sure how to properly integrate it with Flutter and manage communication between the native code and Dart code.
My specific questions are:
How do I set up CallKit in a Flutter project?
What permissions and configurations do I need in the Info.plist file?
How can I receive and handle incoming call data using CallKit?
How can I communicate this data to the Dart code in Flutter to store and display it in the user interface?
Any code examples, tutorials, or advice would be greatly appreciated. Thanks in advance for your help.
I have tried several approaches to implement iOS CallKit in my Flutter app:
Online Forums: I have researched various online forums and discussion boards where developers share their experiences and solutions for integrating CallKit with Flutter. Unfortunately, I have not found a complete solution that addresses my specific requirements.
Pub.dev Packages: I have explored different packages available on pub.dev, such as flutter_callkit_incoming and flutter_callkit, and tried to implement them in my project. However, I have encountered issues with proper integration and communication between the native Swift code and Flutter.
GitHub Repositories: I have also looked at several GitHub repositories where developers have shared their implementations of CallKit in Flutter. While these repositories provided some insights, I was unable to achieve the desired functionality on my physical device.
Despite following the instructions and examples from these sources, I have not been able to get CallKit to work correctly on my physical device. I am expecting to be able to identify the caller, and store the caller data within my Flutter app.
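For what it's worth, the native side of the bridge I've been attempting looks roughly like this (a rough sketch; the channel and method names are made up, and CXCallObserver only exposes call state, not the caller's identity):

import UIKit
import Flutter
import CallKit

@main
class AppDelegate: FlutterAppDelegate, CXCallObserverDelegate {
    private let callObserver = CXCallObserver()
    private var channel: FlutterMethodChannel?

    override func application(_ application: UIApplication,
                              didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        let controller = window?.rootViewController as! FlutterViewController
        // The Dart side must create a MethodChannel with the same name.
        channel = FlutterMethodChannel(name: "com.example/callkit", binaryMessenger: controller.binaryMessenger)
        callObserver.setDelegate(self, queue: nil)
        GeneratedPluginRegistrant.register(with: self)
        return super.application(application, didFinishLaunchingWithOptions: launchOptions)
    }

    // Forward call state changes to Dart, where they can be stored and shown in the UI.
    func callObserver(_ callObserver: CXCallObserver, callChanged call: CXCall) {
        channel?.invokeMethod("callChanged", arguments: [
            "uuid": call.uuid.uuidString,
            "hasConnected": call.hasConnected,
            "hasEnded": call.hasEnded
        ] as [String: Any])
    }
}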
Hello!
I've been maintaining a Cordova-based calling application for years. Cordova uses a webview to show the user interface and has a bridge for calling native iOS API from JavaScript. The application uses CallKit and VoIP push notifications to handle the calling functionality.
Before the iOS 17.5 update, calling worked relatively reliably. However, starting from this version, Safari instantly force-closes all open websocket connections when the "Answer" button on the calling UI is hit. So, basically, the call ends right after it is answered, because in our case websockets are crucial for the SIP negotiation process.
Firstly, I inspected the Safari console, and there is a new red error saying: WebSocket connection to 'wss://home.thirdlane.com/wss' failed: The operation couldn’t be completed. Software caused connection abort.
Secondly, I checked the Xcode logs, and there are several warnings saying Invalidating grant <invalid NS/CF object> failed every time the call is answered.
And, I'm afraid this is all that I have to cling to.
I managed to mitigate the effect by connecting the websockets only when the call is answered and the application is focused. However, this approach has its own drawbacks and doesn't solve all the problematic cases.
I must mention that the behavior has slightly improved in iOS 17.6 – now the websockets are cut after the "Answer" button is hit, whereas in 17.5.1 they were cut ~3 seconds after the VoIP push, regardless of whether the call was answered.
This looks like a Safari/WebView bug to me and I would like to bring it to Apple's attention. I've never filed a bug before, so hopefully this is the right place to write.
I can provide more logs or videos upon request.
Thank you!
Hello
I have a few questions regarding the Live Caller ID Lookup feature.
First question:
The documentation for Live Caller ID Lookup says that "the system does not use private relay when the application is installed directly from Xcode. This allows the application & the service deployment to be tested before filling out the onboarding form and setting up private relay."
What is the situation regarding development distribution signed .ipas? Would they be able to bypass the private relay too?
Second question:
Is there any way an application could dynamically switch which blocking dataset gets used? The use case for this is giving the user the option of whether a set of numbers gets blocked or not.
If the OS makes a blocking lookup and an identity lookup, and these always map to the same blocking dataset, then the blocking behaviour is the same for every user there is.
That means whatever decisions the server makes as to which numbers to block apply to every user. Whether to block a number or not is a fuzzy decision; it would be good if users had the ability to decide for themselves whether fuzzy numbers should be blocked, rather than have that imposed upon them.
Third question:
It looks from the way things are set up that 2) is not actually possible. If that is the case, then will it be permitted for two endpoints to be registered with Apple? (Then the app could implement more than one Live Caller ID extension, each providing different blocking behaviour.)
Thank you very much.
In situations where the app receives a VoIP push, and the user starts to answer by sliding, the call is initiated and the timer starts. However, due to network issues, the app's call may not be fully ready, resulting in a delay of 5-10 seconds before the actual call begins. Is there a way to display a "loading" or "connecting" indicator on the CallKit interface during this wait time?
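One pattern that might approximate this (I'm not aware of an explicit "connecting" API for incoming CallKit calls) is to hold on to the CXAnswerCallAction and only fulfill it once the media layer reports ready, so the system doesn't treat the call as fully answered immediately. A rough sketch, where onMediaReady is a hypothetical callback from the app's networking code:

import CallKit

final class ProviderDelegate: NSObject, CXProviderDelegate {
    private var pendingAnswerAction: CXAnswerCallAction?

    func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {
        // Keep the action pending instead of fulfilling it right away; fulfill (or fail)
        // it later, before the action's timeout expires.
        pendingAnswerAction = action
        startMediaConnection() // illustrative: kick off the SIP/RTC setup here
    }

    // Called by the (hypothetical) media layer once audio is actually flowing.
    func onMediaReady() {
        pendingAnswerAction?.fulfill()
        pendingAnswerAction = nil
    }

    func startMediaConnection() { /* connect signalling and media here */ }

    func providerDidReset(_ provider: CXProvider) {
        pendingAnswerAction = nil
    }
}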
In the documentation for the example Live Caller ID server (https://swiftpackageindex.com/apple/live-caller-id-lookup-example/main/documentation/pirservice/testinginstructions) there is an example service-config.json file shown (without thorough documentation).
That config file, and the whole of the instructions, center around there being two datasets of numbers: block and identity.
My question is - is it possible for more than one dataset to be specified i.e. for block1 and block2 to be specified?
The use case for this would be: suppose the Live Caller ID server has a set of numbers it has identified as being nuisance callers, so it lists these in the block section. However, user A might want all these nuisance callers to be blocked, while user B does not. Therefore the Live Caller ID extension on the handset would need to use a different dataset on the server, so that user A's calls from a set of numbers are blocked but user B's are not.
Note that I'm not suggesting that the Caller ID server should be capable of storing individual users' preferences. All that would be required would be two datasets: one where blocked content is none and one where blocked content is some. Then a user/app could switch between them as indicated by the user.
Is that possible?
If the database structure and service-config.json etc. is not configured to permit that, then could two different servers be set up to achieve this instead? i.e. so the server url specified in the app's extension can be set at run time and not at compile time?
I've been following the instructions on how to set up Live Caller ID Lookup using the example PIRService.
And I have been successful - I've managed to get name information and images retrieved and displayed on the call screen, in addition to being able to block numbers via PIRService too.
So while I did get it working, it was, and still is, incredibly painful to do so, because it only works about 1% of the time.
There are two main problems, which look like different manifestations of the same issue. The first is difficulty enabling the Live Caller ID Lookup feature via the switch in the iPhone's Settings; the second occurs once that has been enabled, when a phone number's details are being retrieved.
There are a great many timeout issues being reported by CallKit logging, e.g.:
configure failed Error Domain=com.apple.CipherML Code=1100 "Unable to query status due to errors: The request timed out." UserInfo={NSLocalizedDescription=Unable to query status due to errors: The request timed out., NSUnderlyingError=0xd98344450 {Error Domain=NSURLErrorDomain Code=-1001 "The request timed out." UserInfo={NSLocalizedDescription=The request timed out., NSErrorFailingURLKey=http://192.168.1.100:8080/issue}}}
When this occurs I can see that the request is getting through to the PIRService as it outputs logging to the Mac console:
2024-07-28T09:33:15-0700 info Hummingbird : [HummingbirdCore] Server started and listening on 0.0.0.0:8080
2024-07-28T09:33:37-0700 info Hummingbird : hb_id=5e0330c893af6a98c20e5100fdb26871 hb_method=GET hb_uri=/.well-known/private-token-issuer-directory [Hummingbird] Request
2024-07-28T09:33:37-0700 info Hummingbird : hb_id=5e0330c893af6a98c20e5100fdb26872 hb_method=GET hb_uri=/token-key-for-user-token [Hummingbird] Request
So it would appear that requests are getting through to PIRService but then something is timing out after that. Could that be the PrivacyPass/Homomorphic Encryption stuff? Or something else?
What could be a cause of this instability, and is there anything that can be done to increase reliability of it?
(Xcode 16 beta 4, iOS 18 developer beta 4, Sonoma 14.5; the iPhone(s) being tested are connected to the Mac via USB cable and running on the same Wi-Fi network.)
With iOS 18 there are now five ways for a caller to be blocked/silenced:
1) the caller can be blocked via the Live Caller ID extension
2) the caller can be blocked via the CallKit extension
3) the caller can be blocked via Block Caller in the call history recents
4) the call could be silenced via Silence Junk Callers
5) the call could be silenced via Silence Unknown Callers
These are all totally separate and there's no way of reconciling all of them and presenting a holistic overview and management system to the user.
Call blocking applications have no way of knowing which numbers will be blocked by 3) or silenced by 4) or 5), or even of determining whether 4) and 5) are enabled.
And iOS doesn't indicate to users what will be blocked by 1) or 2).
Currently users have no way of knowing what's been blocked/silenced where. Neither via call blocking apps nor via the OS.
From users' perspectives it's a confusing and frustrating mess.
If the OS exposed which numbers have been blocked via 3) to applications and exposed if Silence Unknown Callers and Silence Junk Callers are enabled then call blocking applications could present a unified way for users to see and manage what's going on with their device holistically.
I've followed the instructions to configure and launch a live caller id test service (https://swiftpackageindex.com/apple/live-caller-id-lookup-example/main/documentation/pirservice/testinginstructions)
i.e. I've constructed a database, built and installed the PIRService etc.
Additionally I have created a test app with a Live Caller ID Extension.
The problem I'm facing is that when turning on the Live Caller ID feature on an iPhone (the Settings | Apps | Phone | Call Blocking & Identification | Live Caller ID Lookup switch) with iOS 18 beta 4, the phone logs:
"The request timed out." UserInfo={NSLocalizedDescription=The request timed out., NSErrorFailingURLKey=http://MacBook-Pro.local:8080/.well-known/private-token-issuer-directory
The configuration notes say:
"When running things locally on your Mac, and your testing device is on the same network, then you can use mDNS to let the device find your Mac. Let’s assume that your Mac’s hostname is Tims-MacBook-Pro.local. Then we should use the following value for the URLs: http://Tims-Macbook-Pro.local:8080. Thanks to the mDNS protocol your device should be able to resolve your hostname to the actual IP address of your Mac and make the connection."
My Mac hostname is "MacBook-Pro" therefore the Live Caller ID Extension is configured as:
LiveCallerIDLookupExtensionContext(
    serviceURL: URL(string: "http://MacBook-Pro.local:8080")!,
    tokenIssuerURL: URL(string: "http://MacBook-Pro.local:8080")!,
    userTierToken: Data(base64Encoded: "BBBB")!
)
And the service-config.json is configured as:
{
    "issuerRequestUri": "http://MacBook-Pro.local:8080",
    "users": [
        <snip>
(I've also tried excluding the issuerRequestUri as the instructions say "This value can be omitted from the configuration. Setting this explicitly will not be required for devices using iOS 18.0 beta 4 or later.")
And the PIR Service is started on the Mac as:
PIRService --hostname 0.0.0.0 service-config.json
And it starts and runs.
The iPhone and Mac are on the same Wifi network and connected by usb cable.
As far as I can tell, everything has been set up in accordance with the Testing Live Caller ID instructions, yet I get the error when attempting to enable the extension on the iPhone.
Is there something missing/incorrectly configured?
If an app has a Live Caller ID Lookup extension and the lookup server indicates that a caller is identified and not blocked, then if the user wishes to locally block that number they can do so either via the iPhone's call block button or via the app's Call Extension block list.
However, there's apparently no way for the user to do the inverse.
I.e. if Live Caller ID Lookup indicated that a number should be blocked, how could a user indicate they don't want that number blocked for them?
If they added it to the Call Extension as an identified number, but the live lookup is saying the number should be blocked, then what does the OS do? Give priority to the blocking instruction from the live lookup server, or give priority to the fact it's in the Call Extension's Identified list?