Using ramalama run --rag makes the model unable to answer things outside the RAG
#1730
Replies: 3 comments
-
Well noticed: the "rag" and "run" prompts are different. @bmahabirbu has this on his radar and is well equipped to comment on a lot of these queries.
-
Hi @jlebon, thanks for the questions; hopefully my explanations help!
An interesting thing to note is that right now the model will output "I don't know" when the question doesn't relate closely enough to the document. But I've noticed instances where certain models couldn't find the answer even when asked a semi-relevant question about the document. Maybe it should instead output "I couldn't find the answer in the document" to be clearer!
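The behavior described above can be sketched in miniature. This is not ramalama's actual code; it is a hedged illustration, assuming a typical RAG pipeline that retrieves documents by similarity and falls back to a refusal when nothing scores above a threshold. The `answer` function, toy bag-of-words vectors, threshold value, and fallback string are all assumptions for illustration only.

```python
# Hypothetical sketch (NOT ramalama's implementation): why a strict RAG
# prompt can refuse questions unrelated to the indexed documents, and how
# a clearer fallback message might look.
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity over bag-of-words term counts.
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def answer(question: str, docs: list[str], threshold: float = 0.2) -> str:
    # Retrieve the best-matching document; if nothing is similar enough,
    # return an explicit fallback instead of a bare "I don't know".
    q = Counter(question.lower().split())
    scores = [cosine(q, Counter(d.lower().split())) for d in docs]
    best = max(scores, default=0.0)
    if best < threshold:
        return "I couldn't find the answer in the document."
    return f"<answer grounded in doc {scores.index(best)}>"

docs = ["ramalama runs models in containers", "rag indexes your documents"]
print(answer("how does ramalama run models", docs))   # related: grounded answer
print(answer("what is the capital of France", docs))  # unrelated: fallback
```

The threshold is the tunable part: too strict and semi-relevant questions get the fallback (the failure mode noted above); too loose and the model hallucinates grounding.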
-
Converting to a discussion. Not sure if this gets better if we move to llama-stack and MCP communication with the RAG database.
-
Issue Description
When running with a RAG, the model is unable to answer questions outside the RAG content.
Steps to reproduce the issue
Without RAG:
With RAG:
Describe the results you received
The model doesn't know how to answer questions that it answered when run without the RAG.
Describe the results you expected
The model still answers questions that it answered when run without the RAG.
ramalama info output
Upstream Latest Release
No
Additional environment details
No response
Additional information
Unrelated: you need to update ramalama/.github/ISSUE_TEMPLATE/bug_report.yaml (line 50 in e1f8fb8).