Commit 3f3174e

Update READMEs (#103)
* docs: update README files with detailed usage and configuration instructions for @pgflow/cli and @pgflow/dsl
* docs: update README with new project overview, usage examples, and build instructions
* chore: update changeset to document patch version bumps for multiple packages
* docs: update README with project overview, features, usage examples, and resources
* style: fix markdown formatting in README to correct blockquote and line breaks
* docs: update README to improve formatting, clarify system overview, and remove deprecated example
  - Changed heading styles for consistency
  - Rephrased overview paragraph for clarity
  - Updated package links with markdown syntax
  - Removed outdated TypeScript DAG example from documentation
  - Added a new logo image file to enhance branding
1 parent 098c6a5 commit 3f3174e

File tree

6 files changed: +325, -47 lines

.changeset/sour-pandas-rhyme.md

Lines changed: 7 additions & 0 deletions
```diff
@@ -0,0 +1,7 @@
+---
+'@pgflow/edge-worker': patch
+'pgflow': patch
+'@pgflow/dsl': patch
+---
+
+Update the READMEs
```

README.md

Lines changed: 62 additions & 18 deletions
```diff
@@ -1,26 +1,70 @@
-# pgflow
+<p align="center"><a href="https://pgflow.dev" target="_blank" rel="noopener noreferrer"><img src="logo.png?raw=true" alt="pgflow logo"></a></p>
 
-Postgres-centric workflow engine with deep integration with Supabase
+### Where complex becomes clockwork.
 
-#### 🌐 check docs at [pgflow.dev](https://pgflow.dev)
+> "Things just happen. What the hell. And the reason things just happen is that a hundred billion other things just happened, all working unheeded and unseen, to make sure that they do."
+>
+> — Terry Pratchett, Last Continent, reflecting on the elegant machinery of complex systems
 
-> [!NOTE]
-> This project and all its components are licensed under [Apache 2.0](./LICENSE) license.
+## Overview
+
+pgflow is a workflow orchestration system that runs directly in your Postgres database - ideal for building reliable AI workflows, background jobs, and data pipelines on Supabase without external services.
+
+The system combines:
+
+- **[SQL Core](./pkgs/core/)** - Workflow state management natively in Postgres with ACID compliance
+- **[TypeScript DSL](./pkgs/dsl/)** - Type-safe workflow definitions with automatic inference
+- **[Edge Worker](./pkgs/edge-worker/)** - Auto-respawning task processor that handles retries and concurrency
+- **[CLI Tools](./pkgs/cli/)** - One-command setup with automatic schema migrations
+
+## Documentation
+
+The pgflow documentation is [available on pgflow.dev](https://pgflow.dev).
+
+## Getting help
+
+File an issue on [GitHub](https://github.com/pgflow-dev/pgflow/issues/new) or join our [Discord](https://discord.gg/NpffdEyb).
+
+## Why pgflow?
 
-## Monorepo
+When you need more than just isolated background jobs, but don't want the complexity of external orchestration systems:
 
-This repository is a monorepo containing components of pgflow.
-Packages live in `pkgs/`
+- **Postgres as the Single Source of Truth** - All definitions, state, and history in your database
+- **Zero Infrastructure** - No external services, dashboards, or control planes
+- **Type-Safe Workflows** - Full compile-time safety between workflow steps
+- **Reliable Background Jobs** - Automatic retries with backoff and observability
 
-| Package | Description |
-| -------------------------------------- | ------------------------------------------------------------------------------------- |
-| [cli](./pkgs/cli/) | Command-line interface for installing pgflow and compiling flows |
-| [core](./pkgs/core/) | SQL Core for the workflow engine - foundational part of **pgflow** stack |
-| [dsl](./pkgs/dsl/) | Flow DSL - the TypeScript library used to define flows and their handlers |
-| [edge-worker](./pkgs/edge-worker/) | An auto-restarting task queue worker implemented for Supabase Edge Functions and PGMQ |
-| [website](./pkgs/website/) | Documentation Site |
-| [example-flows](./pkgs/example-flows/) | Small package containing various example flows, mainly for exploration |
+## What can you build?
 
-## NX Readme
+- **AI Workflows** - Chain LLMs, scrape data, reason across tools, and handle failures
+- **Background Jobs** - Process emails, files, and scheduled tasks with full visibility
+- **Data Pipelines** - Extract, transform, and load data with built-in dependency handling
 
-See [NX_README.md](./NX_README.md) for more information.
+## How pgflow works
+
+1. **Define workflows using TypeScript DSL**
+2. **Compile them to SQL migrations**
+3. **Deploy as Supabase Edge Functions**
+4. **Trigger workflows from your app or SQL**
+
+The execution system handles the rest - scheduling steps when dependencies complete, retrying failed tasks, and aggregating results automatically.
```
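The scheduling rule described above (a step becomes runnable once all of its dependencies have completed) can be sketched in a few lines of plain TypeScript. This is a simplified illustration, not pgflow's actual implementation; the step slugs mirror the `AnalyzeWebsite` example from `pkgs/dsl/README.md`:

```typescript
// Simplified sketch of dependency-driven scheduling: a step is ready to run
// once every step it depends on has completed.
type StepDef = { slug: string; dependsOn: string[] };

function readySteps(steps: StepDef[], completed: Set<string>): string[] {
  return steps
    .filter((s) => !completed.has(s.slug))
    .filter((s) => s.dependsOn.every((dep) => completed.has(dep)))
    .map((s) => s.slug);
}

// The four-step example flow: scrape a site, analyze it two ways, save results.
const steps: StepDef[] = [
  { slug: 'website', dependsOn: [] },
  { slug: 'sentiment', dependsOn: ['website'] },
  { slug: 'summary', dependsOn: ['website'] },
  { slug: 'saveToDb', dependsOn: ['sentiment', 'summary'] },
];
```

Completing `website` unlocks `sentiment` and `summary` in the same pass; `saveToDb` only becomes ready after both analyses finish.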
```diff
+## Packages
+
+| Package | Description |
+| -------------------------------------- | ----------------------------------------------------------------------- |
+| [cli](./pkgs/cli/) | Command-line interface for installing and compiling flows |
+| [core](./pkgs/core/) | SQL Core for the workflow engine - foundational tables and functions |
+| [dsl](./pkgs/dsl/) | TypeScript DSL for defining flows with type inference |
+| [edge-worker](./pkgs/edge-worker/) | Task queue worker for Supabase Edge Functions with reliability features |
+| [website](./pkgs/website/) | Documentation site |
+| [example-flows](./pkgs/example-flows/) | Example workflow definitions |
+
+## Resources
+
+- 📖 **Documentation**: [pgflow.dev](https://pgflow.dev)
+- 🚀 **Demo**: [pgflow-demo.netlify.app](https://pgflow-demo.netlify.app)
+- 🛠️ **Getting Started**: [pgflow.dev/getting-started](https://pgflow.dev/getting-started)
+
+> [!NOTE]
+> This project and all its components are licensed under [Apache 2.0](./LICENSE) license.
```

logo.png

392 KB

pkgs/cli/README.md

Lines changed: 86 additions & 1 deletion
````diff
@@ -1,12 +1,97 @@
-# cli
+# @pgflow/cli
+
+The Command Line Interface for pgflow - a PostgreSQL-native workflow engine.
 
 > [!NOTE]
 > This project and all its components are licensed under [Apache 2.0](./LICENSE) license.
 
+## Overview
+
+`@pgflow/cli` provides essential tools for setting up, managing, and deploying pgflow workflows in your Supabase environment. The CLI handles:
+
+- Installing pgflow in your Supabase project
+- Compiling TypeScript workflow definitions into SQL migrations
+- Managing workflow deployment and updates
+
+## Prerequisites
+
+- Supabase CLI v2.0.2 or higher
+- Deno v1.45.x or higher (for flow compilation)
+- Local Supabase project initialized
+
+## Installation
+
+### Via npx (recommended)
+
+```bash
+# Run commands directly
+npx pgflow@latest <command>
+```
+
+### Global installation
+
+```bash
+# Install globally
+npm install -g pgflow
+
+# Run commands
+pgflow <command>
+```
+
+## Commands
+
+### Install pgflow
+
+Set up pgflow in your Supabase project with a single command:
+
+```bash
+npx pgflow@latest install
+```
+
+Options:
+
+- `--supabase-path <path>` - Specify custom Supabase directory path
+- `--yes` or `-y` - Skip confirmation prompts (non-interactive mode)
+
+The installer will:
+
+1. Update `config.toml` to enable required connection pooling
+2. Copy pgflow SQL migrations to your project
+3. Configure environment variables for Edge Functions
+4. Guide you through applying migrations
+
+### Compile Flow Definition
+
+Convert a TypeScript flow definition into a SQL migration:
+
+```bash
+npx pgflow@latest compile supabase/functions/_flows/my_flow.ts
+```
+
+Options:
+
+- `--deno-json <path>` - Path to custom deno.json with import map
+- `--supabase-path <path>` - Path to custom Supabase directory
+
+The compiler will:
+
+1. Parse your TypeScript flow definition
+2. Extract step dependencies and configuration
+3. Generate SQL commands for database registration
+4. Create a timestamped migration file in your migrations folder
+
````
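Step 4 writes a timestamped migration file. Supabase migrations conventionally use a `<timestamp>_<name>.sql` naming scheme; the sketch below shows one way such a name could be derived. This is illustrative only: the exact filename pgflow generates, including the `create_..._flow` suffix used here, is an assumption.

```typescript
// Sketch: derive a Supabase-style timestamped migration filename for a
// compiled flow (illustrative; pgflow's actual naming may differ).
function migrationFilename(flowSlug: string, now: Date): string {
  const pad = (n: number) => String(n).padStart(2, '0');
  const stamp =
    now.getUTCFullYear().toString() +
    pad(now.getUTCMonth() + 1) +
    pad(now.getUTCDate()) +
    pad(now.getUTCHours()) +
    pad(now.getUTCMinutes()) +
    pad(now.getUTCSeconds());
  return `${stamp}_create_${flowSlug}_flow.sql`;
}
```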
```diff
 ## Building
 
 Run `nx build cli` to build the library.
 
 ## Running unit tests
 
 Run `nx test cli` to execute the unit tests via [Vitest](https://vitest.dev/).
+
+## Documentation
+
+For detailed documentation, visit:
+
+- [Installation Guide](https://pgflow.dev/getting-started/install-pgflow/)
+- [Compiling Flows](https://pgflow.dev/getting-started/compile-to-sql/)
+- [Running Flows](https://pgflow.dev/getting-started/run-flow/)
```

pkgs/dsl/README.md

Lines changed: 112 additions & 1 deletion
````diff
@@ -1,12 +1,123 @@
-# dsl
+# @pgflow/dsl
+
+The TypeScript Domain Specific Language (DSL) for defining type-safe workflow definitions in pgflow.
 
 > [!NOTE]
 > This project and all its components are licensed under [Apache 2.0](./LICENSE) license.
 
+## Overview
+
+`@pgflow/dsl` provides a type-safe, fluent interface for defining data-driven workflows with explicit dependencies. The DSL ensures that data flows correctly between steps and maintains type safety throughout the entire workflow definition.
+
+Key features:
+
+- **Type Safety** - Complete TypeScript type checking from flow inputs to outputs
+- **Fluent Interface** - Chainable method calls for defining steps and dependencies
+- **Functional Approach** - Clean separation between task implementation and flow orchestration
+- **JSON-Compatible** - All inputs and outputs are JSON-serializable for database storage
+- **Immutable Flow Definitions** - Each step operation returns a new Flow instance
+
+## Usage
+
+### Basic Example
+
+```typescript
+import { Flow } from '@pgflow/dsl';
+
+// Define input type for the flow
+type Input = {
+  url: string;
+};
+
+// Define a flow with steps and dependencies
+export const AnalyzeWebsite = new Flow<Input>({
+  slug: 'analyze_website',
+  maxAttempts: 3,
+  baseDelay: 5,
+  timeout: 10,
+})
+  .step(
+    { slug: 'website' },
+    async (input) => await scrapeWebsite(input.run.url)
+  )
+  .step(
+    { slug: 'sentiment', dependsOn: ['website'] },
+    async (input) => await analyzeSentiment(input.website.content)
+  )
+  .step(
+    { slug: 'summary', dependsOn: ['website'] },
+    async (input) => await summarizeWithAI(input.website.content)
+  )
+  .step(
+    { slug: 'saveToDb', dependsOn: ['sentiment', 'summary'] },
+    async (input) => {
+      return await saveToDb({
+        websiteUrl: input.run.url,
+        sentiment: input.sentiment.score,
+        summary: input.summary.aiSummary,
+      });
+    }
+  );
+```
+
+### Understanding Data Flow
+
+In pgflow, each step receives an `input` object that contains:
+
+1. **`input.run`** - The original flow input (available to all steps)
+2. **`input.{stepName}`** - Outputs from dependency steps
+
+This design ensures:
+- Original flow parameters are accessible throughout the entire flow
+- Data doesn't need to be manually forwarded through intermediate steps
+- Steps can combine original input with processed data from previous steps
+
````
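Conceptually, that `input` object is a merge of the original run payload with the outputs of the step's declared dependencies, keyed by slug. The sketch below illustrates the shape; pgflow assembles this for you, and `buildStepInput` is a hypothetical helper, not part of the API:

```typescript
// Conceptual sketch: a step's input is the original run payload plus the
// outputs of its declared dependencies, each keyed by the dependency's slug.
type Json = null | boolean | number | string | Json[] | { [k: string]: Json };

function buildStepInput(
  runInput: Json,
  dependsOn: string[],
  stepOutputs: Record<string, Json>
): Record<string, Json> {
  const input: Record<string, Json> = { run: runInput };
  for (const dep of dependsOn) {
    input[dep] = stepOutputs[dep]; // only declared dependencies are exposed
  }
  return input;
}
```

For the `saveToDb` step above, the handler would see `input.run`, `input.sentiment`, and `input.summary`, but not `input.website`, since `website` is not one of its declared dependencies.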
````diff
+### Flow Configuration
+
+Configure flows and steps with runtime options:
+
+```typescript
+new Flow<Input>({
+  slug: 'my_flow',  // Required: Unique flow identifier
+  maxAttempts: 3,   // Optional: Maximum retry attempts (default: 1)
+  baseDelay: 5,     // Optional: Base delay in seconds for retries (default: 1)
+  timeout: 10,      // Optional: Task timeout in seconds (default: 30)
+})
+```
+
````
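A common way `maxAttempts` and `baseDelay` combine in retry systems is exponential backoff, where the wait doubles after each failed attempt. The formula below is an assumption for illustration only; consult the pgflow documentation for the exact retry schedule it uses:

```typescript
// Illustrative exponential backoff: wait baseDelay * 2^attempt seconds before
// the next try. This formula is assumed, not confirmed to be pgflow's own.
function retryDelaySeconds(baseDelay: number, attempt: number): number {
  return baseDelay * 2 ** attempt; // attempt 0 = first retry
}
```

With `baseDelay: 5` and `maxAttempts: 3`, such a scheme would wait 5 s and then 10 s between the three attempts.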
````diff
+## Compiling Flows
+
+Use the `compileFlow` utility to convert a flow definition into SQL statements:
+
+```typescript
+import { compileFlow } from '@pgflow/dsl';
+
+const sqlStatements = compileFlow(MyFlow);
+console.log(sqlStatements.join('\n'));
+```
+
+Alternatively, use the pgflow CLI to compile flows directly to migration files:
+
+```bash
+npx pgflow compile path/to/flow.ts
+```
+
+## Requirements
+
+- All step inputs and outputs MUST be JSON-serializable
+- Use only: primitive types, plain objects, and arrays
+- Convert dates to ISO strings (`new Date().toISOString()`)
+- Avoid: class instances, functions, symbols, undefined values, and circular references
+
````
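A JSON round trip is a quick way to normalize a step result before it is stored: `Date` values become ISO strings automatically (via `Date#toJSON`), while circular references fail fast with an exception. This is a defensive sketch, not part of the pgflow API:

```typescript
// Defensive sketch: normalize a value by round-tripping it through JSON.
// Dates become ISO-8601 strings; circular references throw a TypeError.
function toJsonSafe(value: unknown): unknown {
  return JSON.parse(JSON.stringify(value));
}
```

Note that silently lossy cases (dropped `undefined` properties, for example) still pass through, which is why the conventions above ask you to avoid them in the first place.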
```diff
 ## Building
 
 Run `nx build dsl` to build the library.
 
 ## Running unit tests
 
 Run `nx test dsl` to execute the unit tests via [Vitest](https://vitest.dev/).
+
+## Documentation
+
+For detailed documentation on the Flow DSL, visit:
+- [Understanding the Flow DSL](https://pgflow.dev/explanations/flow-dsl/)
```

0 commit comments