Replies: 1 comment
Implemented! Make sure you're on the latest Ollama binary.
Feature request
Support for streaming responses with tool calling in langchain_ollama.

Ollama v0.8.0 recently added native support for tool calling combined with streaming responses:
https://github.com/ollama/ollama/releases/tag/v0.8.0

However, the current langchain_ollama integration does not yet support this functionality. It would be great if langchain_ollama could support tool calling and streaming output at the same time, enabling more responsive and interactive agent behaviors in LangChain.

Motivation
I'm building a tool-using agent that relies on streaming output for responsiveness. While Ollama now supports tool calling with streaming, I can't take advantage of this in LangChain due to the current limitations in langchain_ollama. Streaming output dramatically improves UX in chat UIs, and combining it with tool use (e.g., function calling or retrieval plugins) is becoming essential.
This issue might be related to or blocked by the current implementation of the Ollama integration — if there's already ongoing work or a plan for this, I'd appreciate any update or guidance. Happy to help with testing or a potential PR!

Proposal (If applicable)
No response