Replies: 2 comments 1 reply
-
@ensean did you try installing one of the qwen models from the model gallery instead? You can also run any model in the gallery ( https://models.localai.io ) with:
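A minimal sketch, assuming a recent LocalAI build whose CLI `run` subcommand can pull a gallery model by name:

```sh
# Pull and serve a model from the gallery by its name
# (pick <model-name> from the list at https://models.localai.io)
local-ai run <model-name>
```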
For instance, to run qwen2:
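The exact gallery identifier below is an assumption; check the gallery for the current name:

```sh
# "qwen2-7b-instruct" is assumed to be the gallery identifier for qwen2
local-ai run qwen2-7b-instruct
```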
Check also: https://localai.io/docs/getting-started/models/#examples
1 reply
-
It was my fault: I had set the backend to the wrong value.
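Since the model here is a gguf file, it is typically served by LocalAI's llama.cpp-based backend rather than a Python embeddings backend. A minimal sketch of a model definition dropped into the models directory, assuming hypothetical file and model names and that this LocalAI version accepts `backend: llama-cpp`:

```sh
# Hypothetical example: file name, model name, and backend value are assumptions;
# adjust them for your LocalAI version and the gguf file you actually downloaded.
cat > models/qwen2-embedding.yaml <<'EOF'
name: qwen2-embedding
backend: llama-cpp            # llama.cpp backend handles gguf files
embeddings: true              # expose the model on the embeddings endpoint
parameters:
  model: qwen2-embedding.Q4_K_M.gguf   # gguf file placed in the models directory
EOF
```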
0 replies
-
LocalAI version:
Environment, CPU architecture, OS, and Version:
Linux ip-172-31-24-204 6.5.0-1024-aws #24~22.04.1-Ubuntu SMP Thu Jul 18 10:43:12 UTC 2024 x86_64 x86_64 x86_64 GNU/Linux
Describe the bug
Planning to deploy the qwen2 embedding model, I tested it with an embeddings request (see the sketch below).
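Purely as an illustration, an embeddings call against LocalAI's OpenAI-compatible API might look like the following (host, port, and model name are assumptions, not taken from the original report):

```sh
# Illustrative request only; "qwen2-embedding" and port 8080 are assumptions
curl http://localhost:8080/v1/embeddings \
  -H "Content-Type: application/json" \
  -d '{
    "model": "qwen2-embedding",
    "input": "A test sentence to embed"
  }'
```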
OK, this seems to be some error about the Hugging Face model repo, so I downloaded the gguf file and placed it in the models directory. I have even created a folder named sentence-transformers and moved the gguf file into it, but the error is the same...

To Reproduce
Expected behavior
Logs
Additional context