Challenges and Help #91
Replies: 2 comments
-
@philippzagar Would you be able to provide some more context on this?
-
Hi @shimjunho, thank you for reaching out and for integrating a custom model into SpeziLLM! It's great to see you're working with Korean language support using a Hugging Face MLX model 🚀
You're correct that SpeziLLMLocal allows using custom models from Hugging Face. Here's how to properly integrate and run your EXAONE model in SpeziLLM:
Step 1: Download the Model
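To make the integration concrete, here is a minimal sketch of pointing SpeziLLMLocal at a custom Hugging Face repo. It assumes SpeziLLMLocal exposes a `.custom(id:)` model case on its `LLMLocalSchema` configuration that accepts a Hugging Face repository id; check the SpeziLLM documentation for the exact initializer names in your version.

```swift
import SpeziLLM
import SpeziLLMLocal

// Sketch, not the definitive API: configure a local LLM schema with a
// custom Hugging Face MLX repo id (the EXAONE model from this thread).
let schema = LLMLocalSchema(
    model: .custom(id: "mlx-community/EXAONE-Deep-2.4B-4bit")
)

// The schema would then be handed to the LLMRunner (obtained from the
// SwiftUI environment) to create a session and stream tokens, e.g.:
// let session = runner(with: schema)
// for try await token in try await session.generate() { ... }
```

The key point is that the Hugging Face repo id string, not the display name, identifies the model to download and run.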
-
What Stanford Spezi module is your challenge related to?
Spezi
Description
Hi, I tried to install a new model on SpeziLLM, namely https://huggingface.co/mlx-community/EXAONE-Deep-2.4B-4bit

I heard that we can use MLX models with this code, so I chose this one. However, when I entered the custom id, I got the result shown in the attached screenshot.
I am Korean, so I really want to use a model that supports Korean well.
Can you help me add a new model?
Reproduction
import Foundation
import SpeziLLMLocal

enum ModelChoice: String, CaseIterable, Identifiable {
    // Currently defined case names
    case llama1B = "Llama 3.2 - 1B"
    case llama3B = "Llama 3.2 - 3B"
    case llama8B = "Llama 3 - 8B"
    case phi3 = "Phi-3 Mini"
    case gemma2B = "Gemma 2B"
    case smolLM_135M = "SmolLM 135M"
    case mistral7B = "Mistral 7B"
    case deepseek_qwen_1_5B = "DeepSeek-R1-Qwen-1.5B"
    case deepseek_llama_8B = "DeepSeek-R1-Llama-8B"
    case exaoneDeep = "EXAONE-Deep-2.4B" // newly added model

    var id: String { rawValue }
}
How can I fix it?
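One likely gap in the enum above is that the raw values are display names, not Hugging Face repo ids, so SpeziLLM cannot resolve them as model identifiers. A minimal sketch of one way to bridge that (condensed to two cases; the `huggingFaceID` property and the Llama repo id are illustrative assumptions, not part of the SpeziLLM API):

```swift
import Foundation

// Sketch: keep the display name as the raw value, but map each case to
// the Hugging Face repo id that the local LLM layer should download.
enum ModelChoice: String, CaseIterable, Identifiable {
    case llama1B = "Llama 3.2 - 1B"
    case exaoneDeep = "EXAONE-Deep-2.4B"

    var id: String { rawValue }

    /// Hugging Face repository id for this choice (hypothetical helper).
    var huggingFaceID: String {
        switch self {
        case .llama1B: "mlx-community/Llama-3.2-1B-Instruct-4bit"
        case .exaoneDeep: "mlx-community/EXAONE-Deep-2.4B-4bit"
        }
    }
}
```

The repo id (e.g. `mlx-community/EXAONE-Deep-2.4B-4bit`) is then what gets passed as the custom model identifier, while the raw value stays a user-facing label.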
Expected behavior
I really want to know how to add a new model here.
Additional context
No response
Code of Conduct