This repository contains example implementations and usage patterns for the AI Agent Flow framework (available on npm). Each example demonstrates different features and capabilities of the framework.
The package is published to npm as `ai-agent-flow-examples`.
Create `src/quick-start.ts` with a minimal flow:

```typescript
import { Flow, Runner } from 'ai-agent-flow';
import { ActionNode } from 'ai-agent-flow/nodes/action';

const helloNode = new ActionNode('start', async () => ({
  type: 'success',
  output: 'Hello world'
}));

const flow = new Flow('quick-start')
  .addNode(helloNode)
  .setStartNode('start');

const context = { conversationHistory: [], data: {}, metadata: {} };
new Runner().runFlow(flow, context).then(console.log);
```
Then run it with:

```bash
npm install
npm run example src/quick-start.ts
```
See the Quick Start section for a minimal example.
Located in `src/observability/`, this example demonstrates how to implement observability in your AI Agent Flow applications:

- Structured logging with `winston`
- Metrics collection with `prom-client`
- A `/metrics` endpoint for Prometheus scraping

```mermaid
graph TD
    A[Flow Execution] --> B[Winston Logger]
    A --> C[Prometheus Metrics]
    B --> D[Structured Logs]
    C --> E["/metrics Endpoint"]
    E --> F[Prometheus Server]
```
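
A minimal sketch of how such wiring might look, assuming `winston`, `prom-client`, and Node's built-in `http` module; the metric names, port, and setup below are illustrative and may differ from the actual example:

```typescript
// Hypothetical wiring sketch, not the example's actual code.
import http from 'node:http';
import winston from 'winston';
import { Counter, Histogram, register } from 'prom-client';

// Structured JSON logs on stdout.
const logger = winston.createLogger({
  level: 'info',
  format: winston.format.json(),
  transports: [new winston.transports.Console()],
});

// Illustrative metric names; the real example may use different ones.
const flowRuns = new Counter({ name: 'flow_runs_total', help: 'Total flow executions' });
const flowDuration = new Histogram({ name: 'flow_duration_seconds', help: 'Flow execution time in seconds' });

export async function instrumentedRun<T>(run: () => Promise<T>): Promise<T> {
  const stopTimer = flowDuration.startTimer();
  logger.info('flow started');
  try {
    return await run();
  } finally {
    stopTimer();
    flowRuns.inc();
    logger.info('flow finished');
  }
}

// Expose metrics for Prometheus scraping on the port used below.
http
  .createServer(async (_req, res) => {
    res.setHeader('Content-Type', register.contentType);
    res.end(await register.metrics());
  })
  .listen(9100);
```
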
To run the observability example:

- Install dependencies: `npm install`
- Run: `npm start`
- Access metrics at http://localhost:9100/metrics
`npm start` executes this observability example. Use `npm run example <path>` for the other examples.
Programmatic usage:

```typescript
import { start } from './src/observability/index';

const { server, runPromise } = start();
await runPromise;
server.close();
```
Located in `src/plugin-system/`, this example shows how to extend the AI Agent Flow framework with custom components:

- Custom node implementations
- Plugin registration and management
- Framework extension patterns

```mermaid
graph LR
    A[Core Framework] --> B[Plugin Registry]
    B --> C[Custom Node 1]
    B --> D[Custom Node 2]
    B --> E[Custom Node N]
    C --> F[Flow Execution]
    D --> F
    E --> F
```
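
As a rough illustration, a reusable custom node can be built as a small factory on top of the `ActionNode` shown in the quick start. This is a hypothetical sketch; the real example may register nodes through the framework's own plugin registry API instead:

```typescript
// Hypothetical sketch: a configurable custom node built from ActionNode.
// The plugin-system example may instead use a dedicated registry API.
import { Flow } from 'ai-agent-flow';
import { ActionNode } from 'ai-agent-flow/nodes/action';

// Factory: each call produces a node with its own id and behaviour.
export function createShoutNode(id: string, message: string) {
  return new ActionNode(id, async () => ({
    type: 'success',
    output: `${message.toUpperCase()}!`,
  }));
}

// Usage: register the custom node like any built-in node.
const flow = new Flow('plugin-demo')
  .addNode(createShoutNode('shout', 'hello from a custom node'))
  .setStartNode('shout');
```
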
Run with:

```bash
npm run example src/plugin-system/index.ts
```
Located in `src/streaming/`, this example demonstrates how to handle streaming responses from OpenAI:

- Stream responses from OpenAI using `LLMNode`
- Handle partial updates with `.onUpdate()`
- Process streamed content in real time

```mermaid
graph TD
    A[Flow] --> B[LLMNode]
    B --> C[Stream Handler]
    C --> D[Console Output]
    B --> E[OpenAI API]
    E --> F[Streamed Response]
```
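
For orientation, here is a sketch of the update-handler pattern. The `LLMNode` import path, constructor options, and `.onUpdate()` payload shape below are assumptions, so check the streaming example itself for the real API:

```typescript
// Hypothetical sketch; LLMNode's import path and options are assumed, not verified.
import { Flow, Runner } from 'ai-agent-flow';
import { LLMNode } from 'ai-agent-flow/nodes/llm';

const ask = new LLMNode('ask', { prompt: 'Write a haiku about streams.' });

// Print each partial chunk as it arrives instead of waiting for the full reply.
ask.onUpdate((chunk: string) => process.stdout.write(chunk));

const flow = new Flow('streaming-demo').addNode(ask).setStartNode('ask');
await new Runner().runFlow(flow, { conversationHistory: [], data: {}, metadata: {} });
```
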
To run the streaming example:

- Set up your OpenAI API key in `.env`
- Run: `npm run example src/streaming/index.ts`
Located in `src/advanced/`, this example shows conditional branching with a `DecisionNode`.
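
A hypothetical sketch of the branching idea; the `DecisionNode` import path and constructor signature are assumptions (here the decision callback is assumed to return the id of the next node), so treat this as an illustration rather than the example's code:

```typescript
// Hypothetical sketch; DecisionNode's real signature may differ.
import { Flow, Runner } from 'ai-agent-flow';
import { ActionNode } from 'ai-agent-flow/nodes/action';
import { DecisionNode } from 'ai-agent-flow/nodes/decision';

// Assumed: the decision callback returns the id of the next node to run.
const route = new DecisionNode('route', async (context: { data: { isVip?: boolean } }) =>
  context.data.isVip ? 'vip' : 'standard'
);

const vip = new ActionNode('vip', async () => ({ type: 'success', output: 'VIP path' }));
const standard = new ActionNode('standard', async () => ({ type: 'success', output: 'Standard path' }));

const flow = new Flow('branching-demo')
  .addNode(route)
  .addNode(vip)
  .addNode(standard)
  .setStartNode('route');

await new Runner().runFlow(flow, { conversationHistory: [], data: { isVip: true }, metadata: {} });
```
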
Run with:

```bash
npm run example src/advanced/index.ts
```
Located in `src/chatbot/`, this is a tiny chatbot that tracks conversation history.
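
The history lives in the `conversationHistory` array on the flow context shown in the quick start. A minimal sketch of appending to it, with an assumed `{ role, content }` message shape:

```typescript
// Hypothetical sketch; the message shape ({ role, content }) is an assumption.
import { Flow, Runner } from 'ai-agent-flow';
import { ActionNode } from 'ai-agent-flow/nodes/action';

const context = {
  conversationHistory: [] as { role: string; content: string }[],
  data: {},
  metadata: {},
};

const reply = new ActionNode('reply', async () => {
  context.conversationHistory.push({ role: 'user', content: 'Hi there' });
  context.conversationHistory.push({ role: 'assistant', content: 'Hello! How can I help?' });
  return { type: 'success', output: 'Hello! How can I help?' };
});

const flow = new Flow('chatbot-demo').addNode(reply).setStartNode('reply');
await new Runner().runFlow(flow, context);
console.log(context.conversationHistory); // the history persists on the context object
```
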
Run with:

```bash
npm run example src/chatbot/index.ts
```
Located in `src/data-pipeline/`, this example processes items in batches using `BatchNode`.
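
Since `BatchNode`'s constructor isn't shown here, the sketch below stands in for it with a plain `ActionNode` that chunks an array manually; it illustrates the batching idea rather than the node's actual API:

```typescript
// Stand-in sketch: manual batching with ActionNode instead of the real BatchNode API.
import { Flow, Runner } from 'ai-agent-flow';
import { ActionNode } from 'ai-agent-flow/nodes/action';

const items = Array.from({ length: 10 }, (_, i) => i + 1);
const batchSize = 3;

const processBatches = new ActionNode('process', async () => {
  const results: number[] = [];
  for (let start = 0; start < items.length; start += batchSize) {
    const batch = items.slice(start, start + batchSize);
    // Process each batch in parallel, e.g. doubling every item.
    results.push(...(await Promise.all(batch.map(async (n) => n * 2))));
  }
  return { type: 'success', output: JSON.stringify(results) };
});

const flow = new Flow('pipeline-demo').addNode(processBatches).setStartNode('process');
await new Runner().runFlow(flow, { conversationHistory: [], data: {}, metadata: {} });
```
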
Run with:

```bash
npm run example src/data-pipeline/index.ts
```
Located in `src/debug-ui/`, this example attaches an update handler for debugging flows.

Run with:

```bash
npm run example src/debug-ui/index.ts
```
Located in `src/express-server/`, this example exposes a flow via an Express endpoint.
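
A sketch of what serving a flow over HTTP can look like; the Express wiring is standard, while the flow and runner usage mirrors the quick start above (the example's actual routes will differ):

```typescript
// Hypothetical sketch of serving a flow from an Express endpoint.
import express from 'express';
import { Flow, Runner } from 'ai-agent-flow';
import { ActionNode } from 'ai-agent-flow/nodes/action';

const hello = new ActionNode('start', async () => ({ type: 'success', output: 'Hello from Express' }));
const flow = new Flow('express-demo').addNode(hello).setStartNode('start');

const app = express();

app.get('/run', async (_req, res) => {
  const context = { conversationHistory: [], data: {}, metadata: {} };
  const result = await new Runner().runFlow(flow, context);
  res.json(result);
});

app.listen(3000, () => console.log('Listening on http://localhost:3000'));
```
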
Run with:

```bash
npm run example -e "import { startServer } from './src/express-server/index.ts'; startServer();"
```
Located in `src/memory-store/`, this example stores the flow context in memory between runs.
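
One way to picture it: keep a single context object alive in the process and pass it to every run, so data written by one run is visible to the next. A sketch under that assumption, built only on the quick-start API:

```typescript
// Hypothetical sketch: keep one context object in memory and reuse it across runs.
import { Flow, Runner } from 'ai-agent-flow';
import { ActionNode } from 'ai-agent-flow/nodes/action';

// In-memory context that outlives individual flow runs.
const context = { conversationHistory: [], data: { visits: 0 }, metadata: {} };

const counter = new ActionNode('count', async () => {
  context.data.visits += 1; // state written by one run is visible to the next
  return { type: 'success', output: `visits so far: ${context.data.visits}` };
});

const flow = new Flow('memory-demo').addNode(counter).setStartNode('count');
const runner = new Runner();

console.log(await runner.runFlow(flow, context)); // first run sees visits = 1
console.log(await runner.runFlow(flow, context)); // second run sees visits = 2
```
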
Run with:

```bash
npm run example src/memory-store/index.ts
```
Located in `src/multi-agent/`, this example demonstrates agent communication via `MessageBus`.

Run with:

```bash
npm run example src/multi-agent/index.ts
```
Located in `src/multi-flow/`, this example runs multiple flows concurrently.
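
Running flows concurrently needs nothing framework-specific beyond the quick-start API; a sketch using `Promise.all` (the example's actual flows will differ):

```typescript
// Sketch: run two independent flows concurrently with Promise.all.
import { Flow, Runner } from 'ai-agent-flow';
import { ActionNode } from 'ai-agent-flow/nodes/action';

function makeFlow(id: string, message: string) {
  return new Flow(id)
    .addNode(new ActionNode('start', async () => ({ type: 'success', output: message })))
    .setStartNode('start');
}

const runner = new Runner();
const freshContext = () => ({ conversationHistory: [], data: {}, metadata: {} });

// Each flow gets its own context so concurrent runs don't share state.
const [a, b] = await Promise.all([
  runner.runFlow(makeFlow('flow-a', 'Result from A'), freshContext()),
  runner.runFlow(makeFlow('flow-b', 'Result from B'), freshContext()),
]);
console.log(a, b);
```
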
Run with:

```bash
npm run example src/multi-flow/index.ts
```
Located in `src/tool-calls/`, this example demonstrates how to invoke a custom tool within a flow.
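
In the simplest case a "tool" is just an ordinary function that a node awaits. A sketch built only on the quick-start API; the example itself may use a dedicated tool-call mechanism:

```typescript
// Sketch: a plain function used as a "tool" from inside an ActionNode.
import { Flow, Runner } from 'ai-agent-flow';
import { ActionNode } from 'ai-agent-flow/nodes/action';

// A custom tool: any async function the node can call.
async function getWeather(city: string): Promise<string> {
  return `Sunny in ${city}`; // stubbed result for the sketch
}

const weatherNode = new ActionNode('weather', async () => ({
  type: 'success',
  output: await getWeather('Berlin'),
}));

const flow = new Flow('tool-demo').addNode(weatherNode).setStartNode('weather');
await new Runner().runFlow(flow, { conversationHistory: [], data: {}, metadata: {} });
```
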
Run with:

```bash
npm run example src/tool-calls/index.ts
```
Located in `src/http-request/`, this example fetches JSON from an API using `HttpNode`.
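
`HttpNode`'s options aren't shown here, so the sketch below stands in with an `ActionNode` plus the global `fetch` available in Node 18+; it shows the shape of the example, not `HttpNode`'s API:

```typescript
// Stand-in sketch: fetch JSON inside an ActionNode instead of the real HttpNode.
import { Flow, Runner } from 'ai-agent-flow';
import { ActionNode } from 'ai-agent-flow/nodes/action';

const fetchTodo = new ActionNode('fetch', async () => {
  // Public placeholder API used purely for illustration.
  const response = await fetch('https://jsonplaceholder.typicode.com/todos/1');
  const json = await response.json();
  return { type: 'success', output: JSON.stringify(json) };
});

const flow = new Flow('fetch-demo').addNode(fetchTodo).setStartNode('fetch');
await new Runner().runFlow(flow, { conversationHistory: [], data: {}, metadata: {} });
```
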
Run with:

```bash
npm run example src/http-request/index.ts
```
Located in `src/interactive-cli/`, this example runs a conversational loop in your terminal.
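
A conversational loop is typically a `readline` loop wrapped around `runFlow`. A sketch using Node's built-in `readline/promises` and the quick-start API; the real example's prompt handling will differ:

```typescript
// Sketch: a minimal terminal loop that runs a flow once per user message.
import readline from 'node:readline/promises';
import { Flow, Runner } from 'ai-agent-flow';
import { ActionNode } from 'ai-agent-flow/nodes/action';

const context = { conversationHistory: [], data: { lastMessage: '' }, metadata: {} };

const echo = new ActionNode('echo', async () => ({
  type: 'success',
  output: `You said: ${context.data.lastMessage}`,
}));
const flow = new Flow('cli-demo').addNode(echo).setStartNode('echo');

const rl = readline.createInterface({ input: process.stdin, output: process.stdout });
const runner = new Runner();

// Type "exit" to leave the loop.
for (;;) {
  const line = await rl.question('> ');
  if (line.trim() === 'exit') break;
  context.data.lastMessage = line;
  console.log(await runner.runFlow(flow, context));
}
rl.close();
```
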
Run with:

```bash
npm run example src/interactive-cli/index.ts
```

```mermaid
graph TD
    A[AI Agent Flow] --> B[Examples]
    B --> C[Observability]
    B --> D[Plugin System]
    B --> E[Streaming]
    B --> K[Tool Calls]
    C --> F[Logging]
    C --> G[Metrics]
    D --> H[Custom Nodes]
    D --> I[Plugin Registry]
    E --> J[Real-time Updates]
```
- Copy `.env.example` to `.env` and set the required values
- Install dependencies: `npm install`
- Run: `npm start` (runs the observability example)
- Test: `npm test`
- `npm start`: Run the observability example
- `npm run build`: Compile TypeScript
- `npm run lint`: Run ESLint
- `npm run test`: Run tests
- `npm run format`: Run Prettier
Running `npm test` executes a Mocha suite that spawns each short-lived example using `npm run example`. The tests assert that these scripts exit with status 0, so failures indicate an example crashed or threw an exception. Examples that run servers, such as the observability and Express server demos, are excluded from the test suite.
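
A test in that style might look roughly like this (a sketch using Mocha and Node's `child_process`; the repository's actual suite may differ):

```typescript
// Sketch of a Mocha test that runs an example and asserts a zero exit status.
import { spawnSync } from 'node:child_process';
import { strict as assert } from 'node:assert';

describe('examples', function () {
  this.timeout(60_000); // examples may take a while to install deps and finish

  it('tool-calls example exits with status 0', () => {
    const result = spawnSync('npm', ['run', 'example', 'src/tool-calls/index.ts'], {
      encoding: 'utf8',
    });
    assert.equal(result.status, 0, result.stderr);
  });
});
```
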
You can inspect any example without running it by using the CLI that ships with `ai-agent-flow`:

```bash
npx aaflow inspect src/observability/index.ts
```
The command prints information about the flow such as its ID and the nodes it contains. A successful run looks similar to:

```text
✓ Loaded flow from src/observability/index.ts
Nodes:
  - greet (ActionNode)
```
If `npx` cannot find the `aaflow` command or you see an `E404` error, make sure dependencies are installed (`npm install`) and that `node_modules/.bin` is on your `PATH`.
Feel free to contribute by:
- Creating new examples
- Improving existing examples
- Adding documentation
- Submitting bug reports
For setup instructions, style commands, and pull request guidelines, see CONTRIBUTING.md.
This project is licensed under the MIT License - see the LICENSE file for details.