# AI Features in AtomicServer

AtomicServer offers powerful AI capabilities to help you automate tasks, generate content, and interact with your data in new ways.
It also works well as a general-purpose AI client.
And if you want nothing to do with AI, you can disable it completely in the settings.

AtomicServer integrates with large language models (LLMs) via two main providers:

- **OpenRouter**: A cloud-based API that gives access to a wide range of commercial and open-source models (e.g., GPT-4, Claude, Mixtral).
- **Ollama**: A self-hosted, local LLM server that runs models on your own hardware for privacy and offline use.

## Configuring AI

Before you start using the AI features, you will need to configure an AI provider. This is straightforward and can be done on the settings page.

### OpenRouter

If you want to use OpenRouter, you will need an OpenRouter account with some credits. You can link it to AtomicServer by clicking the "Login with OpenRouter" button or by pasting your API key into the text field.

### Ollama

Download and install [Ollama](https://ollama.ai/download) for your OS.
Then download one or more models from the terminal:

```bash
ollama pull <model-name>
```

Next, start the server:

```bash
ollama serve
```

Now, in AtomicServer, go to the settings page, scroll down to the AI settings, and click on "Ollama" under "AI Providers".
There you can enter the URL of your Ollama server.
If Ollama is running on the same machine as your browser, you can use `http://localhost:11434/api` as the URL.
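
Before pointing AtomicServer at that URL, you can check from the terminal that the Ollama server is actually reachable. This sketch assumes a default local install on port 11434 and uses Ollama's model-list endpoint:

```shell
# Ask the local Ollama server which models it has available.
# /api/tags is Ollama's model-list endpoint; the fallback message is
# printed if the server is not running.
curl --silent --max-time 5 "http://localhost:11434/api/tags" \
  || echo "Ollama is not reachable"
```

If the response lists your pulled models, the URL is ready to enter in the AtomicServer settings.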
Next, you need to configure an agent to use the model. To do this, go to an AI chat or open the AI sidebar and click on the agent in the chat input.
Click the edit button and change the model to your desired Ollama model.

## Using AI

### AI Sidebar

The fastest way to use AI is with the AI sidebar. Open it by clicking the ✨ button in the top right corner.
Chats in the AI sidebar do not persist and are not shared with other users.
If you want to save a chat for later reference, or to continue it at a later point, you can save it as a resource by hitting the save button in the top right corner.
Once saved, the chat can be used like any other resource: you can share it with other users and reference it in other resources.

### Creating an AI Chat Resource

You can also create AI chat resources without going through the AI sidebar first.
To create a full-page AI chat that persists, create a new resource and click the `✨ ai-chat` button.

## Referencing resources in the chat

You can reference any resource in the chat by typing the `@` symbol followed by the name of the resource.
The resource data will then be added to the chat as context.
This way, you can also reference resources from your MCP servers.

## AI Agents

When you chat with an LLM, it acts as a particular agent. By default, this is the Atomic Assistant, which helps you with AtomicServer tasks such as answering questions about your data or searching and editing resources.
There is also a general-purpose agent without any instructions, for when you just want to use AtomicServer as a general-purpose AI client.
To switch between agents, click on the agent in the chat input.
In the agent configuration dialog, you can edit and create agents, set an agent as the default, or toggle automatic agent selection.
The default agents, the Atomic Data Agent and the General Agent, are also editable, so if you want to change how they work or which model they use, feel free to do so.

### Creating Your Own Agents

You can also create your own agents, each with its own system prompt, tool list, and model.

### Automatic Agent Selection

AtomicServer features **automatic agent selection**.
To enable it, click on the agent in the chat input and check "Automatic agent selection".
AtomicServer will then automatically select the most relevant agent based on the context of the chat.
It looks at each agent's name, description, and tools to determine the best match, so make sure to give your agents a good name and description.

## MCP tools and resources

You can add MCP servers to your AtomicServer client.
To do so, go to the settings page and add new servers under "MCP Servers".

Next, you need to enable the tools for the agent you want to use them with.
This can be done by editing the agent and checking any MCP server you want it to have access to.

Your agent will now be able to call the tools of the MCP server.

### MCP Resources

If one of your MCP servers exposes resources, you can reference them in the chat by typing `@` and selecting the server from the list.
Then continue typing the name of the resource you want to add to the context.

### How to use STDIO MCP servers

AtomicServer does not support MCP over stdio, because the browser cannot access stdio.
If you want to use an MCP server that only supports stdio, you can use a tool like [Supergateway](https://github.com/supercorp-ai/supergateway) to convert stdio to Streamable HTTP or SSE.

Example:

```bash
npx -y supergateway \
  --stdio "<MCP_SERVER_COMMAND>" \
  --outputTransport streamableHttp --stateful \
  --sessionTimeout 60000 --port <PORT>
```
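
As a concrete sketch of the template above, here is how you might bridge the MCP filesystem reference server (a stdio-only server) to Streamable HTTP. The directory and port here are illustrative choices, not requirements; substitute your own server command and values:

```shell
# Bridge a stdio-only MCP server (here: the MCP filesystem reference
# server) to Streamable HTTP on port 8000.
# "./docs" is the directory the filesystem server is allowed to access;
# adjust the package, directory, and port for your setup.
npx -y supergateway \
  --stdio "npx -y @modelcontextprotocol/server-filesystem ./docs" \
  --outputTransport streamableHttp --stateful \
  --sessionTimeout 60000 --port 8000
```

You can then enter the resulting local HTTP endpoint under "MCP Servers" in the AtomicServer settings.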