BaseAI is the AI framework for building declarative and composable AI-powered LLM products. It allows you to develop AI agent pipes on your local machine with integrated agentic tools and memory (RAG). Visit our [learn](https://baseai.dev/learn) guide to get started with BaseAI.

### 1. Initialize a new BaseAI project

BaseAI is a TypeScript-first framework. To create a new BaseAI project, run the following command in your project:

```bash
npx baseai@latest init
```

This command will create a `baseai` directory in your project. This is what the directory structure looks like:

```
ROOT (of your app)
├── baseai
│   ├── baseai.config.ts
│   ├── memory
│   ├── pipes
│   └── tools
├── .env (your env file)
└── package.json
```

### 2. Add API keys

Copy the following into your `.env` file and add the appropriate LLM API keys:

```
# !! SERVER SIDE ONLY !!
# Keep all your API keys secret — use only on the server side.

# TODO: ADD: Both in your production and local env files.
# Langbase API key for your User or Org account.
# How to get this API key: https://langbase.com/docs/api-reference/api-keys
LANGBASE_API_KEY=

# TODO: ADD: LOCAL ONLY. Add only to local env files.
# The following keys are needed for local pipe runs, for the providers you use.
# For Langbase, please add the key to your LLM keysets.
# Read more: Langbase LLM Keysets https://langbase.com/docs/features/keysets
OPENAI_API_KEY=
ANTHROPIC_API_KEY=
COHERE_API_KEY=
FIREWORKS_API_KEY=
GOOGLE_API_KEY=
GROQ_API_KEY=
MISTRAL_API_KEY=
PERPLEXITY_API_KEY=
TOGETHER_API_KEY=
XAI_API_KEY=
```

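Before running pipes locally, you may want to fail fast when a required key is missing. This is an optional sketch, not part of BaseAI; which keys are required is an assumption and depends on the providers your pipes use:

```typescript
// Sketch (not part of BaseAI): report required env keys that are unset or blank.
// Which keys you actually need depends on the providers your pipes use.
function missingKeys(
	env: Record<string, string | undefined>,
	keys: string[]
): string[] {
	return keys.filter(key => {
		const value = env[key];
		return value === undefined || value.trim() === '';
	});
}

// Warn early instead of failing mid-run.
const missing = missingKeys(process.env, ['LANGBASE_API_KEY', 'OPENAI_API_KEY']);
if (missing.length > 0) {
	console.warn(`Missing env vars: ${missing.join(', ')}`);
}
```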
### 3. Create a new AI agent

A pipe is your custom-built AI agent as an API. It's the fastest way to ship AI features and apps. Let's create a new pipe:

```bash
npx baseai@latest pipe
```

It will ask you for the name, description, and other details of the pipe step by step. Once done, a pipe will be created inside the `baseai/pipes` directory. You can now edit the system prompt, change model params, and more. Here is what the pipe code looks like:

```ts
import { PipeI } from '@baseai/core';

const pipeSummary = (): PipeI => ({
	// Replace with your API key https://langbase.com/docs/api-reference/api-keys
	apiKey: process.env.LANGBASE_API_KEY!,
	name: 'summary',
	description: 'AI Summary agent',
	status: 'public',
	model: 'openai:gpt-4o-mini',
	stream: true,
	json: false,
	store: true,
	moderate: true,
	top_p: 1,
	max_tokens: 1000,
	temperature: 0.7,
	presence_penalty: 1,
	frequency_penalty: 1,
	stop: [],
	tool_choice: 'auto',
	parallel_tool_calls: true,
	messages: [
		{
			role: 'system',
			content: `You are a helpful AI agent. Make everything less wordy.`
		}
	],
	variables: [],
	memory: [],
	tools: []
});

export default pipeSummary;
```
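Because the pipe is a plain function returning a config object, you can also derive variants in code. A minimal sketch with plain objects, where `PipeLike` is a simplified stand-in for the `PipeI` shape above, for illustration only:

```typescript
// Sketch: derive a pipe variant by spreading a base config.
// `PipeLike` is a simplified stand-in for PipeI; the real type has more fields.
type PipeLike = { name: string; temperature: number; max_tokens: number };

const base: PipeLike = { name: 'summary', temperature: 0.7, max_tokens: 1000 };

// A more deterministic variant for reproducible summaries.
const precise: PipeLike = { ...base, name: 'summary-precise', temperature: 0 };

console.log(precise); // { name: 'summary-precise', temperature: 0, max_tokens: 1000 }
```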

### 4. Integrate the pipe in your app

Let's create a new `index.ts` file in your project root. We need to do the following:

1. Import the pipe config we created.
2. Create a new pipe instance with the pipe config.
3. Run the pipe with a user message.
4. Listen to the stream events.

Here is what the code looks like:

```ts
import { Pipe, getRunner } from '@baseai/core';
import pipeSummarizer from './baseai/pipes/summary';

const pipe = new Pipe(pipeSummarizer());

const userMsg = `
Langbase Studio is your playground to build, collaborate, and deploy AI. It allows you to experiment with your pipes in real time, with real data, store messages, version your prompts, and truly helps you take your idea from prototype to production with LLMOps on usage, cost, and quality.
A complete AI developer platform.
- Collaborate: Invite all team members to collaborate on the pipe. Build AI together.
- Developers & stakeholders: All your R&D, engineering, product, and GTM (marketing and sales) teams, literally every stakeholder, can collaborate on the same pipe. It's like a powerful version of GitHub x Google Docs for AI. A complete AI developer platform.
`;

async function main() {
	const { stream } = await pipe.run({
		messages: [{ role: 'user', content: userMsg }],
		stream: true,
	});

	const runner = getRunner(stream);

	// Method 1: Using event listeners
	runner.on('connect', () => {
		console.log('Stream started.\n');
	});

	runner.on('content', content => {
		process.stdout.write(content);
	});

	runner.on('end', () => {
		console.log('\nStream ended.');
	});

	runner.on('error', error => {
		console.error('Error:', error);
	});
}

main();
```

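The event listeners above ("Method 1") are one way to consume the stream. The same consumption pattern can also be written with `for await`. Here is a self-contained sketch using a mock stream, since the exact chunk shape delivered by `@baseai/core` may differ:

```typescript
// Sketch: consume an async-iterable stream of text chunks with for-await.
// `mockStream` stands in for a real runner; real chunk shapes may differ.
async function* mockStream(): AsyncGenerator<string> {
	yield 'Stream ';
	yield 'chunks.';
}

async function collect(stream: AsyncIterable<string>): Promise<string> {
	let full = '';
	for await (const chunk of stream) {
		full += chunk; // append each chunk as it arrives
	}
	return full;
}

collect(mockStream()).then(full => console.log(full)); // logs "Stream chunks."
```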
Make sure to install and import `dotenv` at the top of the file if you are using Node.js:

```ts
import 'dotenv/config';
```

### 5. Run the AI agent

To run the pipe locally, you need to start the BaseAI server. Run the following command in your terminal:

```bash
npx baseai@latest dev
```

Now run the `index.ts` file in your terminal:

```bash
npx tsx index.ts
```

You should see output like the following in your terminal:

```md
Stream started.

Langbase Studio is your AI development playground. Experiment in real-time with real data, store messages, and version prompts to move from prototype to production seamlessly.

Key Features:
- **Collaborate**: Invite team members to build AI together.
- **Inclusive Teams**: Engage all stakeholders—R&D, engineering, product, and marketing—in a shared space. It’s like GitHub combined with Google Docs for AI development.
Stream ended.
```

> [!TIP]
> You can also run RAG locally with BaseAI. Check out the memory quickstart [guide](https://baseai.dev/docs/memory/quickstart) for more details.

## Documentation

Visit [baseai.dev/docs](https://baseai.dev/docs) for the full documentation.