docs/how_to/streaming/ #31106
I'm using the code snippet below to learn "Working with Input Streams":

```python
from langchain.chat_models import init_chat_model
from langchain_core.output_parsers import JsonOutputParser

model = init_chat_model(
    model_provider="openai",
    model="qwen3:8b",
    base_url="http://localhost:11434/v1",
    api_key="123456",
)
chain = model | JsonOutputParser()

for text in chain.stream(
    "output a list of the countries france, spain and japan and their populations in JSON format. "
    'Use a dict with an outer key of "countries" which contains a list of countries. '
    "Each country should have the key `name` and `population`"
):
    print(text, end="|", flush=True)
```

As you can see, I use Ollama to run the qwen3 model. When I run this Python code, nothing is printed to the console. However, if I switch to other models such as llama3 or deepseek-r1, the streamed output does get printed. Does that mean JsonOutputParser doesn't support streaming for the qwen3 model? Thanks.
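I'm not certain this is the cause, but one plausible explanation: qwen3 is a reasoning model and, by default, tends to emit a `<think>…</think>` preamble before the JSON answer. A streaming JSON parser accumulates chunks and only yields once the buffer parses, so a non-JSON prefix can silence the whole stream. Here is a minimal plain-Python sketch of that failure mode (this is an illustration, not the actual `JsonOutputParser` internals, which are more tolerant of partial JSON):

```python
import json

def try_parse_stream(chunks):
    """Accumulate streamed text chunks and attempt a JSON parse after each
    one, collecting every successfully parsed value. This roughly mimics
    what a streaming JSON output parser does."""
    buf = ""
    results = []
    for chunk in chunks:
        buf += chunk
        try:
            results.append(json.loads(buf))
        except json.JSONDecodeError:
            pass  # buffer is not (yet) valid JSON; keep accumulating
    return results

# A model that streams plain JSON eventually completes a parseable buffer:
plain = ['{"countries": [', '{"name": "france"}', ']}']
print(try_parse_stream(plain))  # [{'countries': [{'name': 'france'}]}]

# A model that first emits a reasoning preamble (e.g. qwen3's "<think>"
# block) never produces a buffer that is valid JSON, so nothing is yielded:
thinking = ['<think>reasoning...</think>', '{"countries": []}']
print(try_parse_stream(thinking))  # []
```

If this is what's happening, it would explain why llama3 (which answers with bare JSON) streams fine while qwen3 prints nothing.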
1 reply:
From the linked guide ("This guide assumes familiarity with the following concepts:"):
https://python.langchain.com/docs/how_to/streaming/