Commit 9e6ce30

Add docs for AI tools #951

1 parent 60ef1e8

File tree: 10 files changed (+130 -0 lines)

README.md

Lines changed: 1 addition & 0 deletions
```diff
@@ -31,6 +31,7 @@ _Status: alpha. [Breaking changes](CHANGELOG.md) are expected until 1.0._
 - 🔧 **Custom data models**: create your own classes, properties and schemas using the built-in Ontology Editor. All data is verified and the models are sharable using [Atomic Schema](https://docs.atomicdata.dev/schema/intro.html)
 - ⚙️ **Restful API**, with [JSON-AD](https://docs.atomicdata.dev/core/json-ad.html) responses.
 - 🔎 **Full-text search** with fuzzy search and various operators, often <3ms responses. Powered by [tantivy](https://github.com/quickwit-inc/tantivy).
+- **AI** with [MCP](https://modelcontextprotocol.io/) support, use any model via OpenRouter or host your own with Ollama.
 - 🗄️ **Tables**, with strict schema validation, keyboard support, copy / paste support. Similar to Airtable.
 - 📄 **Documents**, collaborative, rich text, similar to Google Docs / Notion.
 - 💬 **Group chat**, performant and flexible message channels with attachments, search and replies.
```

browser/data-browser/src/components/AI/AgentConfig.tsx

Lines changed: 10 additions & 0 deletions
```diff
@@ -427,6 +427,11 @@ const AgentForm = ({ agent, onChange }: AgentFormProps) => {
           </CheckboxLabel>
         </li>
       ))}
+      {mcpServers.length === 0 && (
+        <li>
+          <SubtleText>No MCP servers configured.</SubtleText>
+        </li>
+      )}
     </ToolList>
   </FormGroup>
@@ -581,3 +586,8 @@ const RangeInput = styled.input`
   flex: 1;
   flex-basis: 75%;
 `;
+
+const SubtleText = styled.p`
+  font-size: 0.875rem;
+  color: ${p => p.theme.colors.textLight};
+`;
```

docs/.gitignore

Lines changed: 1 addition & 0 deletions
```diff
@@ -1,2 +1,3 @@
 /book
 .DS_Store
+/build
```

docs/src/SUMMARY.md

Lines changed: 1 addition & 0 deletions
```diff
@@ -14,6 +14,7 @@
 - [Installation](atomicserver/installation.md)
 - [Using the GUI](atomicserver/gui.md)
   - [Tables](atomicserver/gui/tables.md)
+  - [AI and Atomic Assistant](atomicserver/gui/ai-and-atomic-assistant.md)
 - [API](atomicserver/API.md)
 - [Creating a JSON-AD file](create-json-ad.md)
 - [FAQ & troubleshooting](atomicserver/faq.md)
```
5 binary files (images) added, not shown.
docs/src/atomicserver/gui/ai-and-atomic-assistant.md

Lines changed: 117 additions & 0 deletions

@@ -0,0 +1,117 @@
# AI Features in AtomicServer

AtomicServer offers powerful AI capabilities to help you automate tasks, generate content, and interact with your data in new ways.
It is also just a great general-purpose AI client.
And if you want nothing to do with AI, you can disable it completely in the settings.

![AI Sidebar](../../assets/ui-guide/ai_sidebar_example.avif)

AtomicServer integrates with large language models (LLMs) via two main providers:

- **OpenRouter**: a cloud-based API that gives access to a wide range of commercial and open-source models (e.g., GPT-4, Claude, Mixtral).
- **Ollama**: a self-hosted, local LLM server that runs models on your own hardware for privacy and offline use.

## Configuring AI

Before you start using the AI features, you will need to configure an AI provider. This is straightforward and can be done on the settings page.

### OpenRouter

If you want to use OpenRouter, you will need an OpenRouter account with some credits. You can link it to AtomicServer by clicking the "Login with OpenRouter" button or by pasting your API key into the text field.

### Ollama

Download and install [Ollama](https://ollama.ai/download) for your desired OS.
Then download one or more models from the terminal:

```bash
ollama pull <model-name>
```
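
For example, to fetch a small general-purpose model (the model name below is only an illustration; any model from the Ollama library works):

```bash
# Downloads the Llama 3.2 model from the Ollama library (example choice)
ollama pull llama3.2
```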

Next, start the server:

```bash
ollama serve
```

Now, in your AtomicServer, go to the settings page, scroll down to the AI settings, and under "AI Providers" click on Ollama.
There you can enter the URL of your Ollama server.
If you are running this server on the same machine as your browser, you can use `http://localhost:11434/api` as the URL.
Next, configure an agent to use the model. To do this, go to an AI chat or open the AI sidebar and click on the agent in the chat input.
Click the edit button and change the model to your desired Ollama model.
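
If the connection does not work, a quick sanity check is to query Ollama from the same machine. This uses Ollama's standard endpoint for listing installed models; adjust the host if your server runs elsewhere:

```bash
# Lists the models available on the local Ollama instance
curl http://localhost:11434/api/tags
```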

## Using AI

### AI Sidebar

The fastest way to use AI is with the AI sidebar. Open it by clicking the ✨ button in the top right corner.
Chats in the AI sidebar do not persist and are not shared with other users.
If you want to save a chat for later reference or to continue it at a later point, you can save it as a resource by hitting the save button in the top right corner.
Once saved, the chat can be used like any other resource: you can share it with other users and reference it in other resources.

### Creating an AI Chat resource

You can also create AI chat resources without going through the AI sidebar first.
To create a full-page AI chat that persists, create a new resource and click the `✨ ai-chat` button.

## Referencing resources in the chat

You can reference any resource in the chat by typing the `@` symbol and then typing the name of the resource.
The resource data will then be added to the chat as context.
This way you can also reference resources from your MCP servers.

![Referencing resources in the chat](../../assets/ui-guide/adding-context-to-chat.avif)

## AI Agents

When you chat with an LLM, the LLM acts as a certain agent. By default this agent is the Atomic Assistant, which helps you with AtomicServer tasks such as answering questions about your data or searching and editing resources.
There is also a general-purpose agent without any instructions, for when you just want to use AtomicServer as a general-purpose AI client.
To switch between agents, click on the agent in the chat input.
In the agent configuration dialog you can edit and create agents, set an agent as the default, or toggle automatic agent selection.
The default agents, Atomic Data Agent and General Agent, are also editable, so if you want to change how they work or what model they use, feel free to do so.

### Creating Your Own Agents

You can also create your own agents, each with its own system prompt, tool list and model.

![Creating an agent](../../assets/ui-guide/creating-an-ai-agent.avif)

### Automatic Agent Selection

AtomicServer features **automatic agent selection**.
To enable it, click on the agent in the chat input and check "Automatic agent selection".
AtomicServer will then automatically select the most relevant agent based on the context of the chat.
It looks at each agent's name, description and tools to determine the most relevant one, so make sure to give your agents a good name and description.

## MCP tools and resources

You can add MCP servers to your AtomicServer client.
To add an MCP server, go to the settings page; under "MCP Servers" you can add new servers.

![Adding an MCP server](../../assets/ui-guide/adding-mcp-servers.avif)

Next, enable the tools for the agent you want to use them with.
This can be done by editing the agent and checking any MCP server you want it to have access to.

![Enabling MCP tools](../../assets/ui-guide/enabling-tools.avif)

Your agent will now be able to call the tools of the MCP server.

### MCP Resources

If one of your MCP servers provides resources, you can reference them in the chat by typing `@` and selecting the server from the list.
Then continue typing the name of the resource you want to add to the context.

### How to use STDIO MCP servers

AtomicServer does not support MCP over STDIO because the browser cannot access STDIO.
If you want to use an MCP server that only supports STDIO, you can use a tool like [Supergateway](https://github.com/supercorp-ai/supergateway) to convert STDIO to Streamable HTTP or SSE.

Example:

```bash
npx -y supergateway \
  --stdio "<MCP_SERVER_COMMAND>" \
  --outputTransport streamableHttp --stateful \
  --sessionTimeout 60000 --port <PORT>
```
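
For instance, to expose the official filesystem MCP server (which is stdio-only) over Streamable HTTP, you could run something like the following; the package, directory and port are example values, not part of the AtomicServer setup:

```bash
# Wraps the stdio-based filesystem MCP server and serves it over Streamable HTTP on port 8808
npx -y supergateway \
  --stdio "npx -y @modelcontextprotocol/server-filesystem /path/to/allowed/dir" \
  --outputTransport streamableHttp --stateful \
  --sessionTimeout 60000 --port 8808
```

The local URL that Supergateway prints can then be added as an MCP server in the AtomicServer settings.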
