streaming response #28925
Replies: 1 comment
-
I found two similar unsolved discussions that might be relevant to your question:
To integrate streaming into your app, you can send the model's output over a FastAPI WebSocket:

```python
from fastapi import FastAPI, WebSocket
from langchain import LLM  # Assuming LLM is the class you're using

app = FastAPI()
llm = LLM()  # Instantiate the model you want to stream from


@app.websocket("/ws")
async def websocket_endpoint(websocket: WebSocket):
    await websocket.accept()
    prompt = "Your prompt here"  # You can modify this to receive from the client
    async for chunk in llm.astream(prompt):
        await websocket.send_text(chunk.content)
```

In this setup, the WebSocket connection is established at the `/ws` endpoint, and each chunk yielded by `llm.astream` is sent to the client as soon as it is produced.
-
```python
async def stream_to_websocket(prompt: str):
    async for chunk in llm.astream(prompt):
        yield chunk.content
```
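As a minimal runnable sketch of this generator pattern, with nothing assumed about LangChain or FastAPI (`StubChunk` and `StubLLM` below are illustrative stand-ins for the real `llm` and its chunk type, not LangChain's API):

```python
import asyncio
from dataclasses import dataclass


@dataclass
class StubChunk:
    content: str  # mirrors the .content attribute used on the real chunks


class StubLLM:
    """Stand-in for the real llm; yields tokens one at a time."""

    async def astream(self, prompt: str):
        for token in ("Hello", ", ", "world"):
            await asyncio.sleep(0)  # cede control, as a networked client would
            yield StubChunk(content=token)


llm = StubLLM()


async def stream_to_websocket(prompt: str):
    # Same shape as the reply above: re-yield each chunk's text
    async for chunk in llm.astream(prompt):
        yield chunk.content


async def drain(prompt: str) -> str:
    # In the endpoint you would `await websocket.send_text(text)` per item
    return "".join([text async for text in stream_to_websocket(prompt)])


print(asyncio.run(drain("Your prompt here")))  # prints: Hello, world
```

Because the generator is decoupled from the transport, the same function can feed a WebSocket, an SSE stream, or FastAPI's `StreamingResponse` without changes.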