Adding a parameter to OllamaLLM to disable the thinking output. #31573
Closed
dhanesh24g announced in Ideas
2 comments · 2 replies
- Yes, it would be great to see this feature.
- This is now supported by setting `reasoning=False`.
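A minimal usage sketch, assuming the `reasoning` flag described in the reply above (the model name is illustrative):

```python
from langchain_ollama import OllamaLLM

# reasoning=False asks Ollama not to include the model's <think> trace,
# so the returned string contains only the final answer.
llm = OllamaLLM(model="deepseek-r1:8b", reasoning=False)

print(llm.invoke("What is 17 * 23?"))
```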
- Feature request
The Ollama LLMs generate `<think>` blocks by default, and there is no straightforward way to suppress this output. So kindly add a parameter in OllamaLLM to disable the thinking output.
Something like -
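A hypothetical sketch of the requested API; the `thinking` keyword argument is an illustrative name, not an existing langchain_ollama parameter:

```python
from langchain_ollama import OllamaLLM

# Hypothetical flag (illustrative name): suppress the model's <think>
# trace so invoke() returns only the final answer.
llm = OllamaLLM(model="deepseek-r1:8b", thinking=False)

answer = llm.invoke("What is 17 * 23?")
print(answer)  # no "<think>...</think>" prefix
```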
Motivation
To make it easier for langchain_ollama users to work with reasoning-capable LLMs.
Proposal (If applicable)
Something like -
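A sketch of how such a flag could be plumbed through to the underlying client, assuming the `think` option exposed by recent versions of the ollama Python package; the parameter names are illustrative:

```python
import ollama

# Hypothetical plumbing: a thinking=False flag on OllamaLLM would forward
# think=False to the Ollama client, which omits the reasoning trace.
response = ollama.chat(
    model="deepseek-r1:8b",
    messages=[{"role": "user", "content": "What is 17 * 23?"}],
    think=False,
)
print(response.message.content)
```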