Replies: 2 comments
-
Hello, @480284856! I'm here to assist you with any bugs, questions, or contributions. Let's tackle this together while waiting for a human maintainer.

To resolve this issue, ensure that the server hosting the Ollama model is running and accessible at `http://localhost:11434`. You can verify this with the following `curl` command:

```shell
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Why is the sky blue?"
}'
```

If the server is running and accessible, you should receive a response from it. If the server is not running, you can start it with:

```shell
OLLAMA_NUM_PARALLEL=4 OLLAMA_MAX_LOADED_MODELS=4 ollama serve
```

Make sure to replace `llama3` with the name of the model you are using.
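As a cross-check of the `curl` command above, here is a minimal Python sketch (standard library only) that builds the same request. The host and the `llama3` model name are just the placeholders from the reply; `build_generate_request` is a hypothetical helper, not part of any Ollama client.

```python
import json
import urllib.request

OLLAMA_HOST = "http://localhost:11434"  # default Ollama address from the reply above

def build_generate_request(model: str, prompt: str) -> urllib.request.Request:
    """Build the same POST to /api/generate as the curl example."""
    body = json.dumps({"model": model, "prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        f"{OLLAMA_HOST}/api/generate",
        data=body,
        headers={"Content-Type": "application/json"},
    )

# Sending it requires a running server, e.g.:
# with urllib.request.urlopen(build_generate_request("llama3", "Why is the sky blue?")) as r:
#     for line in r:  # /api/generate streams one JSON object per line
#         print(json.loads(line))
```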
-
Solved by using the latest ollama image.
-
Checked other resources
Commit to Help
Example Code
Description
I'm stuck at the vector store creation code in the official documentation.
The error output is:
It seems that ollama version 0.3.1 doesn't work well with
`ollama.embed`
. Code for reproducing:
and the output is:
It looks similar to the earlier error.
We can also experiment with the code:
and then the output is `404 page not found`.

System Info
Version of packages:
Extra info: The version of ollama on the Linux system is 0.1.45.
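The `404 page not found` is consistent with a client/server mismatch: a newer `ollama` Python client (0.3.1) calling an endpoint that an older server (0.1.45) does not expose, which would also explain why upgrading to the latest ollama image resolved it. A small sketch of a version gate; the threshold below is an assumption, not a documented cutoff, so check the Ollama release notes for the actual release that added `/api/embed`:

```python
# Hypothetical compatibility check. EMBED_MIN_SERVER is an assumed threshold
# for when the /api/embed endpoint became available; verify it against the
# Ollama release notes before relying on it.
EMBED_MIN_SERVER = (0, 2, 6)

def parse_version(v: str) -> tuple:
    """Turn a dotted version string like '0.1.45' into a comparable tuple."""
    return tuple(int(part) for part in v.split("."))

def server_supports_embed(server_version: str) -> bool:
    return parse_version(server_version) >= EMBED_MIN_SERVER

# Under this assumption, the 0.1.45 server reported above would fail the
# check, matching the 404 seen from ollama.embed.
```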