Replies: 1 comment
-
It seems the note was passed correctly. You can click BufferWindowMemory to inspect the messages and confirm that the previous note is included; it should be, given the screenshot. My biggest suspicion is your context window setting in Ollama. Please refer to https://github.com/logancyang/obsidian-copilot/blob/master/local_copilot.md#ollama and make sure the context window is set correctly. I recommend Mistral models with a 32k window if your machine has enough RAM.
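For reference, here is a minimal sketch of raising the context window via an Ollama Modelfile, as the linked guide describes; the model name `mistral-32k` and the 32768 value are illustrative choices, not requirements:

```sh
# Write a Modelfile that raises the context window to 32k tokens
cat > Modelfile <<'EOF'
FROM mistral
PARAMETER num_ctx 32768
EOF

# Build a new model from it, then select that model in the Copilot plugin settings
ollama create mistral-32k -f Modelfile
ollama run mistral-32k
```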
-
I am running Ollama with llama2:13b.
When I use "Chat" mode to ask a question about the active note, it does not work as expected.
A screen capture is attached.
Is the problem on the Ollama side, or is it related to the input context length?
INSTALL.md
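To help narrow things down, here is a rough sketch of how the local setup is typically started for this plugin, assuming the steps in local_copilot.md; the OLLAMA_ORIGINS value is taken from that guide and may need adjusting for your version:

```sh
# Serve Ollama so the Obsidian app is allowed to call it (CORS origin per local_copilot.md)
OLLAMA_ORIGINS=app://obsidian.md* ollama serve

# In another terminal, confirm the model itself responds outside Obsidian
ollama run llama2:13b "Say hello"
```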