
OllamaEndpointNotFoundError: Ollama call failed with status code 404. Maybe your model is not found and you should pull the model with ollama pull qwen:14b. #20913

Closed · Answered by SiyuanLi-Sven
guiniao asked this question in Q&A
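
Note: the 404 in the title typically means the requested model has not been pulled yet, which is what the suggested ollama pull qwen:14b addresses. As a minimal diagnostic sketch (assuming an Ollama server on the default localhost:11434), you can list the locally installed models via Ollama's /api/tags endpoint:

import json
import urllib.request

# Assumption: Ollama is running locally on its default port 11434.
# GET /api/tags returns the models that have been pulled; a 404 from a
# chat call usually means the requested model is not in this list.
with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    tags = json.load(resp)

print([m["name"] for m in tags.get("models", [])])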

You solved my problem, many thanks. I tried the following code and it works:

from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from langchain_openai import ChatOpenAI

output_parser = StrOutputParser()

# Point ChatOpenAI at Ollama's OpenAI-compatible endpoint (/v1).
# Ollama ignores the API key, but the client requires a non-empty value.
ollama_llm = ChatOpenAI(
    model="llama3.2:latest",
    api_key="ollama",
    base_url="http://yourDomainOrIP:11434/v1/",
)

user_message = "say and only say hello"

prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("user", user_message),
])

chain = prompt | ollama_llm | output_parser

# The template has no input variables, so invoke with an empty dict.
print(chain.invoke({}))
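
A small follow-up sketch on the same setup: instead of baking the user message into the template up front, the message can be passed as a template variable at invoke() time. The {question} variable name here is purely illustrative, not part of the original answer.

from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

ollama_llm = ChatOpenAI(
    model="llama3.2:latest",
    api_key="ollama",  # Ollama ignores the key, but the client needs one
    base_url="http://yourDomainOrIP:11434/v1/",
)

# "{question}" is an illustrative template variable, filled at invoke() time.
prompt = ChatPromptTemplate.from_messages([
    ("system", "You are a helpful assistant."),
    ("user", "{question}"),
])

chain = prompt | ollama_llm | StrOutputParser()
print(chain.invoke({"question": "say and only say hello"}))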

Answer selected by mdrxy