Help Needed: Exporting kokoro.onnx to CoreML .mlmodel #2182
Unanswered
abdussamad0 asked this question in Q&A
Replies: 1 comment 3 replies
Reply:
It depends on onnxruntime. We use onnxruntime inside. Do you happen to know if anyone has managed to run kokoro with coreml?
Original question from abdussamad0:
I’m working on deploying the kokoro.onnx model from the Sherpa-ONNX project in an iOS application and have encountered challenges converting it into a Core ML .mlmodel or .mlpackage.
My primary goal is to run the model using Metal on A15+ devices for improved performance and reduced thermal load. Unfortunately, running the ONNX model directly through onnxruntime with the CoreML execution provider has not resulted in any GPU utilization: in practice it still runs on the CPU, leading to higher inference times, significant heat buildup, and accelerated battery drain.
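For context, the onnxruntime attempt boils down to something like the minimal sketch below (assuming an onnxruntime build that actually ships the CoreML execution provider; the model path is a placeholder):

```python
# Minimal check: is the CoreML execution provider available in this build,
# and did it actually get attached to the session? If it cannot be
# registered, onnxruntime silently falls back to the CPU provider.
import onnxruntime as ort

print("Available providers:", ort.get_available_providers())

sess = ort.InferenceSession(
    "kokoro.onnx",  # placeholder path to the model
    providers=["CoreMLExecutionProvider", "CPUExecutionProvider"],
)

# Lists the providers that were really applied to this session.
print("Providers in use:", sess.get_providers())
```

Even when the CoreML EP is registered, onnxruntime only assigns the subgraphs it supports to Core ML and keeps the rest on the CPU provider, which would explain the behavior described above.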
Here’s what I’ve attempted so far:
• coremltools.convert() with source="pytorch" and source="onnx" → both are either unsupported or fail with missing-framework errors
• onnx_coreml.convert() → fails because it relies on the deprecated coremltools.converters.nnssa modules
• Tracing the PyTorch model with dummy input tensors and converting the trace → fails Core ML type inference or hits unsupported ops (roughly the route sketched after this list)
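For the last bullet, the route looks roughly like the sketch below. It assumes access to the original PyTorch Kokoro checkpoint, since current coremltools releases have no ONNX frontend; load_kokoro() and the input shapes/dtypes are placeholders, not real APIs:

```python
# Sketch of the traced-PyTorch route. load_kokoro() is a hypothetical loader
# for the original PyTorch checkpoint; the dummy inputs only mirror the kind
# of inputs the ONNX export takes (token ids, style vector, speed), and the
# shapes/dtypes are illustrative.
import numpy as np
import torch
import coremltools as ct

model = load_kokoro()  # placeholder, not a real kokoro/sherpa-onnx API
model.eval()

tokens = torch.zeros(1, 64, dtype=torch.int64)
style = torch.zeros(1, 256, dtype=torch.float32)
speed = torch.ones(1, dtype=torch.float32)

# Trace with dummy inputs, then hand the trace to coremltools.
traced = torch.jit.trace(model, (tokens, style, speed))

mlmodel = ct.convert(
    traced,
    inputs=[
        ct.TensorType(name="tokens", shape=tokens.shape, dtype=np.int32),
        ct.TensorType(name="style", shape=style.shape, dtype=np.float32),
        ct.TensorType(name="speed", shape=speed.shape, dtype=np.float32),
    ],
    convert_to="mlprogram",  # produces an .mlpackage
    minimum_deployment_target=ct.target.iOS16,
)
mlmodel.save("kokoro.mlpackage")
```

It is during ct.convert() that the type-inference and unsupported-op errors appear.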
Could you please help with:
1. A working method to convert kokoro.onnx (non-quantized) into .mlmodel or .mlpackage format
2. The correct toolchain setup (Python version, coremltools version, any patches)
3. Any known limitations with exporting Kokoro models to Core ML (a quick way to list the ops involved is sketched after this list)
4. Whether Sherpa-ONNX will offer Core ML native support or Metal-tuned models in the future
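In case it helps with point 3, the operators that actually appear in the exported graph can be listed with the standard onnx Python package (nothing Kokoro-specific here; the path is a placeholder). Custom or rarely supported op types in this list are the usual suspects when a converter rejects the model or the CoreML execution provider falls back to the CPU:

```python
# Inventory the operator types used in kokoro.onnx.
from collections import Counter

import onnx

model = onnx.load("kokoro.onnx")  # placeholder path
ops = Counter(node.op_type for node in model.graph.node)

for op, count in ops.most_common():
    print(f"{op:30s} {count}")
```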
Any guidance, best practices, or even a minimal working example would be greatly appreciated. Sherpa-ONNX is a fantastic project and I’m excited to bring its capabilities to iOS—but ideally in the most optimized, user-friendly, and battery-efficient way.