Flux0 – LLM-framework agnostic infra for LangChain agents with streaming, sessions, and multi-agent support #31871
asaf
announced in
Show and tell
Hey guys,
We built Flux0, an open framework that lets you build LangChain (or LangGraph) agents with real-time streaming (JSONPatch over SSE), full session context, multi-agent support, and event routing — all without locking you into a specific agent framework.
It’s designed to be the glue around your agent logic:
🧠 Full session and agent modeling
📡 Real-time UI updates (JSONPatch over SSE)
🔁 Multi-agent orchestration and streaming
🧩 Pluggable LLM execution (LangChain, LangGraph, or your own async Python code)
You write the agent logic, and Flux0 handles the surrounding infrastructure: context management, background tasks, streaming output, and persistent sessions.
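To make the streaming part concrete, here is a rough, generic sketch of the JSONPatch-over-SSE idea: the server emits each incremental update as an RFC 6902 patch wrapped in an SSE frame, and the client replays the patches to rebuild the session document. This is an illustration of the technique only, not Flux0's actual API; the helper names (`sse_events`, `apply_add`) and the document shape are made up for the example.

```python
import json

def sse_events(patches):
    """Format JSON Patch (RFC 6902) operations as Server-Sent Events frames."""
    for patch in patches:
        yield f"data: {json.dumps(patch)}\n\n"

def apply_add(doc, patch):
    """Apply a minimal subset of RFC 6902: 'add', with '/-' meaning array append."""
    parts = patch["path"].lstrip("/").split("/")
    target = doc
    for key in parts[:-1]:
        target = target[key]
    last = parts[-1]
    if last == "-":
        target.append(patch["value"])   # append streamed chunk to an array
    else:
        target[last] = patch["value"]   # set a key on the document
    return doc

# Server side: each streamed token becomes one patch against the session doc.
patches = [
    {"op": "add", "path": "/message", "value": []},
    {"op": "add", "path": "/message/-", "value": "Hello"},
    {"op": "add", "path": "/message/-", "value": ", world"},
]

# Client side: parse the SSE frames and apply patches incrementally.
doc = {}
for frame in sse_events(patches):
    patch = json.loads(frame[len("data: "):].strip())
    apply_add(doc, patch)

print("".join(doc["message"]))  # → Hello, world
```

The appeal of this pattern is that the UI never needs to know what kind of agent produced the update: it just applies patches to its local copy of the session state, so token streams, tool-call status, and multi-agent events can all travel over the same channel.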
Think of it as your backend infrastructure for LLM agents — modular, framework-agnostic, and ready to deploy.
→ GitHub: https://github.com/flux0-ai/flux0
Would love feedback from anyone building with LangChain, LangGraph, or exploring multi-agent setups!