In iOS 15, when stopSpeaking(at:) is called on an AVSpeechSynthesizer, the didFinish delegate method is called instead of didCancel. This works correctly on iOS 14 and earlier versions.
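For what it's worth, below is a minimal workaround sketch, not an established fix; SpeechController and didRequestStop are names introduced here for illustration. The idea is to track the stop request manually so the code doesn't depend on which delegate callback the OS fires.
import AVFoundation

final class SpeechController: NSObject, AVSpeechSynthesizerDelegate {
    private let synthesizer = AVSpeechSynthesizer()
    private var didRequestStop = false

    override init() {
        super.init()
        synthesizer.delegate = self
    }

    func speak(_ text: String) {
        didRequestStop = false
        synthesizer.speak(AVSpeechUtterance(string: text))
    }

    func stop() {
        didRequestStop = true
        synthesizer.stopSpeaking(at: .immediate)
    }

    func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer, didFinish utterance: AVSpeechUtterance) {
        if didRequestStop {
            // iOS 15: stopSpeaking(at:) can land here instead of didCancel.
            didRequestStop = false
            // Handle as a cancellation.
        } else {
            // The utterance genuinely finished.
        }
    }

    func speechSynthesizer(_ synthesizer: AVSpeechSynthesizer, didCancel utterance: AVSpeechUtterance) {
        // iOS 14 and earlier: stopSpeaking(at:) ends up here.
        didRequestStop = false
    }
}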
Adding the openAppWhenRun property to an AppIntent for a ControlWidgetButton causes the following error when the control is tapped in Control Center:
Unknown NSError The operation couldn’t be completed. (LNActionExecutorErrorDomain error 2018.)
Here’s the full ControlWidget and AppIntent code that causes the error:
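(The original snippet is not reproduced here; as a rough illustration only, a hypothetical reconstruction of the kind of setup described, with placeholder kind string and type names, might look like this.)
import AppIntents
import SwiftUI
import WidgetKit

struct OpenAppIntent: AppIntent {
    static let title: LocalizedStringResource = "Open App"
    // Adding this property is what appears to trigger the error above.
    static let openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult {
        .result()
    }
}

struct OpenAppControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.open-app-control") {
            ControlWidgetButton(action: OpenAppIntent()) {
                Label("Open App", systemImage: "arrow.up.forward.app")
            }
        }
    }
}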
Should controls be able to open apps after the AppIntent runs, or is this a bug?
Hello,
I'm using DockKit within my SwiftUI application with GetStream. Before updating to iOS 18 yesterday, the custom tracking using DockKit worked like a charm, but after updating it stopped working unexpectedly.
What's more curious: the official GetStream video-calls application still works on iOS 18, but my own application does not. I can confirm that my iPhone is still paired, I receive logs about the current docking state, and everything seems fine.
Any suggestions on what I'm missing here?
I've been attempting to install tensorflow-metal on my computer so that I can use the GPU instead of the CPU. I already have tensorflow-macos installed, and pip and TensorFlow are fully up to date. I'm currently two months into building and training a TensorFlow CNN, and I'm at the point where training a single epoch takes a week (I have a lot of data that I need to use). I desperately need the GPU but am stuck with the CPU for now. I can't get access to a cluster, so the best I can do is continue to use my M2 MacBook. Is there any other way I can install tensorflow-metal? Is there a way I can use the GPU (rather than the CPU) with TensorFlow if I can't install tensorflow-metal?
I keep getting this error message:
"ERROR: Could not find a version that satisfies the requirement tensorflow-metal (from versions: none) ERROR: No matching distribution found for tensorflow-metal"
I looked on the Apple forums, tried to download it from GitHub (the page is down), and tried anything else I could think of or find on the internet, but it still isn't installing.
I've used the following commands and still no luck:
python -m pip install tensorflow-metal
pip install https://github.com/apple/tensorflow_metal/releases/download/v0.5.0/tensorflow_metal-0.5.0-py3-none-any.whl
pip install tensorflow-metal
pip3 install tensorflow-metal
SYSTEM_VERSION_COMPAT=0 python -m pip install tensorflow-metal
SYSTEM_VERSION_COMPAT=0 pip install tensorflow-macos tensorflow-metal
conda install -c anaconda tensorflow-gpu
Any help would be appreciated! Thanks so much!
I am searching for a method to remove the background from a video. The video can come from a camera session's fileOutput URL or from the photo library.
I was able to accomplish a live preview with the background removed using the depth data and some Metal framework code from the sample Enhancing Live Video by Leveraging TrueDepth Camera Data. However, I couldn't figure out a way to save this as a video so that I can upload it.
Also, this method uses over 150% CPU (per Xcode's CPU gauge), which seems like quite a lot; the device heats up very fast and drops frames when it's hot.
I also found something similar on GitHub, a CoreML example by Dmitry Voitekh, which uses less than 40% CPU.
Any information regarding this will be helpful.
Objective: remove the background from a video and save it.
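For the saving step, one possible direction (a sketch under assumptions, not a verified solution) is AVAssetWriter with a pixel-buffer adaptor, feeding it each processed frame instead of only rendering to the preview. outputURL and the frame source are placeholders; hevcWithAlpha is chosen on the assumption that the removed background is rendered as transparency.
import AVFoundation
import CoreVideo

func makeWriter(outputURL: URL, width: Int, height: Int) throws
        -> (AVAssetWriter, AVAssetWriterInput, AVAssetWriterInputPixelBufferAdaptor) {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)
    let settings: [String: Any] = [
        AVVideoCodecKey: AVVideoCodecType.hevcWithAlpha, // preserves transparency
        AVVideoWidthKey: width,
        AVVideoHeightKey: height
    ]
    let input = AVAssetWriterInput(mediaType: .video, outputSettings: settings)
    input.expectsMediaDataInRealTime = true
    let adaptor = AVAssetWriterInputPixelBufferAdaptor(
        assetWriterInput: input,
        sourcePixelBufferAttributes: [
            kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
        ])
    writer.add(input)
    return (writer, input, adaptor)
}
After calling startWriting() and startSession(atSourceTime:), each processed CVPixelBuffer would be appended with adaptor.append(_:withPresentationTime:) (checking isReadyForMoreMediaData first), and the file finalized with markAsFinished() followed by finishWriting(completionHandler:).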
I am opening the Siri shortcut screen from the viewDidLoad method, as follows:
override func viewDidLoad() {
    super.viewDidLoad()
    // Present the Siri Shortcut screen to add Card Payment Intent
    let viewController = INUIAddVoiceShortcutViewController(shortcut: INShortcut(intent: self.cardPaymentIntent)!)
    viewController.modalPresentationStyle = .pageSheet
    // Setting Delegate
    viewController.delegate = self
    self.present(viewController, animated: true, completion: nil)
}
// Delegate method conformance :: INUIAddVoiceShortcutViewControllerDelegate
@available(iOS 12.0, *)
func addVoiceShortcutViewController(_ controller: INUIAddVoiceShortcutViewController, didFinishWith voiceShortcut: INVoiceShortcut?, error: Error?) {
    controller.dismiss(animated: true, completion: nil)
    // The issue is here: whether we add the shortcut or dismiss the screen without adding it, this delegate gets called.
}

@available(iOS 12.0, *)
func addVoiceShortcutViewControllerDidCancel(_ controller: INUIAddVoiceShortcutViewController) {
    controller.dismiss(animated: true, completion: nil)
}
// Card Payment Intent
public var cardPaymentIntent: CardPaymentIntent {
    let intent = CardPaymentIntent()
    intent.suggestedInvocationPhrase = NSLocalizedString("Pay my credit card", comment: "")
    return intent
}
Whenever I present the Siri shortcut screen, I either add the shortcut or dismiss the screen without adding it. In both cases the shortcut is added, and this method is called every time:
func addVoiceShortcutViewController(_ controller: INUIAddVoiceShortcutViewController, didFinishWith voiceShortcut: INVoiceShortcut?, error: Error?)
Any solution? When I dismiss the screen without adding the shortcut, I don't want it to be added.
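Not a confirmed fix, but one way to at least tell the two outcomes apart is to inspect the delegate parameters instead of relying on which callback fires; a sketch:
func addVoiceShortcutViewController(_ controller: INUIAddVoiceShortcutViewController,
                                    didFinishWith voiceShortcut: INVoiceShortcut?,
                                    error: Error?) {
    if let voiceShortcut = voiceShortcut {
        // The user actually tapped "Add to Siri".
        print("Added shortcut with phrase: \(voiceShortcut.invocationPhrase)")
    } else if let error = error {
        print("Failed to add shortcut: \(error)")
    } else {
        // No shortcut and no error: treat as a dismissal without adding.
    }
    controller.dismiss(animated: true, completion: nil)
}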
Xcode 15.3 AppIntentsSSUTraining warning: missing the definition of locale # variables.1.definitions
Hello!
I've noticed that adding localizations for AppShortcuts triggers the following warnings in Xcode 15.3:
warning: missing the definition of zh-Hans # variables.1.definitions
warning: missing the definition of zh-Hans # variables.2.definitions
This occurs with both legacy strings files and String Catalogs.
Example project: https://github.com/gongzhang/AppShortcutsLocalizationWarningExample
I'm trying to create an App Shortcut so that users can interact with one of my app's features using Siri. I would like to be able to turn this shortcut on or off at runtime using a feature toggle.
Ideally, I would be able to do something like this:
struct MyShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        // This shortcut is always available
        AppShortcut(
            intent: AlwaysAvailableIntent(),
            phrases: ["Show my always available intent with \(.applicationName)"],
            shortTitle: "Always Available Intent",
            systemImageName: "infinity"
        )

        // This shortcut is only available when "myCoolFeature" is available
        if FeatureProvider.shared.isAvailable("myCoolFeature") {
            AppShortcut(
                intent: MyCoolFeatureIntent(),
                phrases: ["Show my cool feature in \(.applicationName)"],
                shortTitle: "My Cool Feature Intent",
                systemImageName: "questionmark"
            )
        }
    }
}
However, this does not work because the existing buildOptional implementation is limited to components of type (any _AppShortcutsContentMarker & _LimitedAvailabilityAppShortcutsContentMarker)?.
All other attempts at making appShortcuts dynamic have resulted in shortcuts not working at all. I've tried:
Creating a makeAppShortcuts method that returns [AppShortcut] and invoking it from within appShortcuts
Extending AppShortcutsBuilder to support a buildOptional block that isn't restricted to a component type of (any _AppShortcutsContentMarker & _LimitedAvailabilityAppShortcutsContentMarker)?
Extending AppShortcutsBuilder to support buildArray and using compactMap(_:) to return an empty array when the feature is disabled
I haven't used SiriKit before, but it appears that shortcut suggestions could be set at runtime by invoking setShortcutSuggestions(_:), meaning that what I'm trying to do should be possible. I'm not against using SiriKit if I have to, but my understanding is that the App Intents framework is meant to be a replacement for SiriKit, and the prompt within Xcode to replace older custom intents with App Intents indicates that that is indeed the case.
Is there something obvious that I'm missing, or is this simply not possible with the App Intents framework? Is the App Intents framework not meant to replace SiriKit, and should I just use SiriKit instead?
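For reference, a minimal sketch of the SiriKit-era runtime approach mentioned above, assuming a custom intent type MyCoolFeatureIntent and the same FeatureProvider:
import Intents

func updateShortcutSuggestions() {
    if FeatureProvider.shared.isAvailable("myCoolFeature") {
        let intent = MyCoolFeatureIntent()
        intent.suggestedInvocationPhrase = "Show my cool feature"
        if let shortcut = INShortcut(intent: intent) {
            INVoiceShortcutCenter.shared.setShortcutSuggestions([shortcut])
        }
    } else {
        // Feature disabled: clear the suggestion at runtime.
        INVoiceShortcutCenter.shared.setShortcutSuggestions([])
    }
}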
Hi,
I am working on creating an EntityPropertyQuery for my app entity. I want the user to be able to use Shortcuts to search by a property of a related entity, but I'm struggling with what the syntax for that looks like.
I know the documentation for EntityPropertyQuery suggests that this should be possible with a different initializer for the QueryProperty that takes an entityProvider, but I can't figure out how it works.
For example, my CJPersonAppEntity has emails, of type CJEmailAppEntity, which has a property emailAddress. I want the user to be able to find the person by looking up an email address.
When I try to provide this as a Property to filter by inside CJPersonAppEntityQuery, I get a syntax error:
static var properties = QueryProperties {
    Property(\CJPersonEmailAppEntity.$emailAddress, entityProvider: { person in
        person.emails // error
    }) {
        EqualToComparator { NSPredicate(format: "emailAddress == %@", $0) }
        ContainsComparator { NSPredicate(format: "emailAddress CONTAINS %@", $0) }
    }
}
The error says "Cannot convert value of type '[CJPersonEmailAppEntity]' to closure result type 'CJPersonEmailAppEntity'"
So it's not expecting an array, but an individual email item. But how do I provide that without running the predicate query that's specified in the closure?
So I tried something like this, just returning something without worrying about correctness:
Property(\CJPersonEmailAppEntity.$emailAddress, entityProvider: { person in
    person.emails.first ?? CJPersonEmailAppEntity() // satisfy compiler
}) {
    EqualToComparator { NSPredicate(format: "emailAddress == %@", $0) }
    ContainsComparator { NSPredicate(format: "emailAddress CONTAINS %@", $0) }
}
and it built the app, but failed at the 'Extracting app intents metadata' build step:
error: Entity CJPersonAppEntity does not contain a property named emailAddress. Ensure that the property is wrapped with an @Property property wrapper
So I'm not sure what the correct syntax for handling this case is, and I can't find any other examples of how it's done. I would love some feedback on this.
Hi,
I am working with App Intents and have created an OpenIntent with a target that uses a MyAppContact AppEntity I created. This works fine when run from Shortcuts, because it pops up a list of options from the suggestedEntities() method. But it doesn't work well with Siri: it invokes the AppIntent but keeps repeatedly asking for the value of the target entity, which you can't really pass in by voice.
What's the workaround here? Can an OpenIntent be activated by voice as well?
Hi,
I have a 128 GB 2024 M2 iPad Air. I upgraded to iPadOS 18 beta 3, and I'm not seeing the option in Settings to join the waitlist for Apple Intelligence. Is this just an option in Settings they're gradually rolling out to devices? I know my iPad is eligible.
The documentation for translationTask(source:target:action:) says it should translate when content appears, but this isn't happening. I'm only able to translate when I manually associate the task with a configuration and instantiate that configuration.
Here’s the complete source code:
import SwiftUI
import Translation

struct ContentView: View {
    @State private var originalText = "The orange fox jumps over the lazy dog"
    @State private var translationTaskResult = ""
    @State private var translationTaskResult2 = ""
    @State private var configuration: TranslationSession.Configuration?

    var body: some View {
        List {
            // THIS DOES NOT WORK
            Section {
                Text(translationTaskResult)
                    .translationTask { session in
                        Task { @MainActor in
                            do {
                                let response = try await session.translate(originalText)
                                translationTaskResult = response.targetText
                            } catch { print(error) }
                        }
                    }
            }

            // THIS WORKS
            Section {
                Text(translationTaskResult2)
                    .translationTask(configuration) { session in
                        Task { @MainActor in
                            do {
                                let response = try await session.translate(originalText)
                                translationTaskResult2 = response.targetText
                            } catch { print(error) }
                        }
                    }

                Button(action: {
                    if configuration == nil {
                        configuration = TranslationSession.Configuration()
                        return
                    }
                    configuration?.invalidate()
                }) { Text("Translate") }
            }
        }
    }
}
How can I automatically translate a given text when it appears using the new translationTask API?
Controls' actions use App Intents in iOS 18.
However, when executing App Intents from Controls, alert dialogs such as IntentDialog can't be used.
This makes it unclear how to display errors when trying to create Controls that do not launch the app.
Is there any way to handle this?
My application supports both the iOS and iPadOS platforms. I would like to add App Intents only on iOS and not on iPadOS, but I have not found anything written about this. Is it possible? If so, how can we do it?
If not, what is the best practice to avoid showing Siri Shortcuts on iPad?
Hello everybody,
I am running into an error with BNNS.NormalizationLayer. It appears to only work with .vector shapes; matrix shapes throw layerApplyFail during training. Inference doesn't throw, but the output stays the same.
How do I correctly use BNNS.NormalizationLayer with matrix shapes, and how do I debug the layerApplyFail exception?
Thanks
let array: [Float32] = [
    01, 02, 03, 04, 05, 06,
    07, 08, 09, 10, 11, 12,
    13, 14, 15, 16, 17, 18,
]
// let inputShape: BNNS.Shape = .vector(6 * 3) // works
let inputShape: BNNS.Shape = .matrixColumnMajor(6, 3)
let input = BNNSNDArrayDescriptor.allocateUninitialized(scalarType: Float32.self, shape: inputShape)
let output = BNNSNDArrayDescriptor.allocateUninitialized(scalarType: Float32.self, shape: inputShape)
let beta = BNNSNDArrayDescriptor.allocate(repeating: Float32(0), shape: inputShape, batchSize: 1)
let gamma = BNNSNDArrayDescriptor.allocate(repeating: Float32(1), shape: inputShape, batchSize: 1)
let activation: BNNS.ActivationFunction = .identity
let layer = BNNS.NormalizationLayer(type: .layer(normalizationAxis: 0), input: input, output: output, beta: beta, gamma: gamma, epsilon: 1e-12, activation: activation)!
let layerInput = BNNSNDArrayDescriptor.allocate(initializingFrom: array, shape: inputShape)
let layerOutput = BNNSNDArrayDescriptor.allocateUninitialized(scalarType: Float32.self, shape: inputShape)
// try layer.apply(batchSize: 1, input: layerInput, output: layerOutput, for: .inference) // No throw
try layer.apply(batchSize: 1, input: layerInput, output: layerOutput, for: .training)
_ = layerOutput.makeArray(of: Float32.self) // All zeros when .inference
Hi there, I'm a Computer Science student. I have a MacBook Pro 2019 and I'm thinking of buying a new Mac, either a Mac Studio or a MacBook Pro, to use for ML.
I'm currently building a segmentation model, and I'm wondering if I could use Core ML or the Apple Neural Engine in the new M3 chips to train it. Right now I'm using Colab and TensorFlow to create the model, but it's not doing the job; I keep running out of CUDA memory.
Thanks :)
Hi, all.
I've been writing various computational functions using Metal.
However, in the following function, unlike + and *, there is an accuracy issue with the / operation.
The function divides a matrix of shape [n, x, y] by a scalar [1].
When compared to numpy or torch, if I change the operator in this function to * or + instead of /, I get exactly the same results; but with /, the mean differs by more than 1e-5.
(For reference, this was written with reference to the Metal kernel code in llama.cpp.)
kernel void kernel_div_single_f16(
        device const half    * src0,
        device const half    * src1,
        device       half    * dst,
        constant     int64_t & ne00,
        constant     int64_t & ne01,
        constant     int64_t & ne02,
        constant     int64_t & ne03,
        uint3 tgpig[[threadgroup_position_in_grid]],
        uint3 tpitg[[thread_position_in_threadgroup]],
        uint3 ntg[[threads_per_threadgroup]]) {
    const int64_t i03 = tgpig.z;
    const int64_t i02 = tgpig.y;
    const int64_t i01 = tgpig.x;
    const uint offset = i03*ne02*ne01*ne00 + i02*ne01*ne00 + i01*ne00;
    for (int i0 = tpitg.x; i0 < ne00; i0 += ntg.x) {
        dst[offset + i0] = src0[offset + i0] / *src1;
    }
}
My Mac is a MacBook Pro (16-inch, 2021) running macOS 12.5 with an Apple M1 Pro.
Are there any known issues related to division? Thanks in advance for your reply.