
Commit 1a0165c

Add initial ai-server quickstart.
1 parent 09787bb commit 1a0165c

2 files changed: +112 −0 lines

MyApp/_pages/ai-server/index.md

Lines changed: 50 additions & 0 deletions
@@ -0,0 +1,50 @@
---
title: Overview
description: Introduction to AI Server and its key features
---

# Overview

AI Server is an independent microservice designed to provide a comprehensive suite of AI features for your applications. It serves as a private gateway to process LLM, AI, and image transformation requests, dynamically delegating tasks across multiple providers including Ollama, OpenRouter, Replicate, Comfy UI, Whisper, and ffmpeg.

## Key Features

- **Unified AI Gateway**: Centralize all your AI requests through a single self-hosted service.
- **Multi-Provider Support**: Seamlessly integrate with Ollama, Open AI, Comfy UI, and more.
- **Type-Safe Integrations**: Native end-to-end typed integrations for popular languages including C#, TypeScript, Python, and more.
- **Secure Access**: API key authentication to protect your AI resources.
- **Managed File Storage**: Built-in CDN-hostable storage for AI-generated assets.
- **Background Job Processing**: Efficient handling of long-running AI tasks.
- **Custom Deployment**: Run as a single Docker container, with optional GPU-equipped agents for advanced tasks.

## Why Use AI Server?

AI Server simplifies the integration and management of AI capabilities in your applications:

1. **Centralized Management**: Manage all your AI providers and requests from a single interface.
2. **Cost Control**: Monitor and control usage across your organization with detailed request history.
3. **Flexibility**: Easy to scale and adapt as your AI needs evolve.
4. **Security**: Keep your AI operations behind your firewall with a private, managed gateway.
5. **Developer-Friendly**: Type-safe APIs and integrations for a smooth development experience.

## Supported AI Capabilities

- **Large Language Models**: Integrate with Open AI Chat, Ollama, and various API gateways.
- **Image Generation and Manipulation**: Leverage Comfy UI for text-to-image, image-to-image, and more.
- **Audio Processing**: Text-to-speech, speech-to-text, and audio manipulations.
- **Video Processing**: Format conversions, scaling, cropping, and more with ffmpeg integration.

## Getting Started for Developers

1. **Setup**: Follow the Quick Start guide to deploy AI Server.
2. **Configuration**: Use the Admin UI to add your AI providers and generate API keys.
3. **Integration**: Choose your preferred language and use ServiceStack's Add ServiceStack Reference to generate type-safe client libraries (a sketch follows this list).
4. **Development**: Start making API calls to AI Server from your application, leveraging the full suite of AI capabilities.
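
As a sketch of steps 3 and 4, ServiceStack's `x` dotnet tool can add a ServiceStack Reference against a running AI Server instance and generate typed DTOs in your preferred language. The `http://localhost:5005` base URL below assumes the default port from the Quick Start guide; substitute your own deployment URL:

```sh
# Install the ServiceStack `x` dotnet tool (requires the .NET SDK)
dotnet tool install --global x

# Generate typed C# DTOs from your AI Server instance (Add ServiceStack Reference)
x csharp http://localhost:5005

# Or target other supported languages, e.g. TypeScript or Python
x typescript http://localhost:5005
x python http://localhost:5005
```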

## Learn More

- Website: [openai.servicestack.net](https://openai.servicestack.net)
- GitHub: [github.com/ServiceStack/ai-server](https://github.com/ServiceStack/ai-server)

AI Server is actively developed and continuously expanding its capabilities. Stay tuned for updates and new features as we work towards our first V1 release.
Lines changed: 62 additions & 0 deletions
@@ -0,0 +1,62 @@
---
title: Quick Start
description: Get AI Server up and running quickly
---

## Quick Start

To get started with AI Server, follow these steps:

- **Clone the Repository**: Clone the AI Server repository from GitHub.
- **Edit the example.env File**: Update the example.env file with your desired settings.
- **Run Docker Compose**: Start the AI Server with Docker Compose.

### Clone the Repository

Clone the AI Server repository from GitHub:

```sh
git clone https://github.com/ServiceStack/ai-server
```
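
Then change into the cloned directory (the default folder name created by `git clone`) before running the remaining commands:

```sh
cd ai-server
```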

### Edit the example.env File

Create your own `.env` file by copying the `example.env` file:

```sh
cp example.env .env
```

Then edit the `.env` file with your desired settings:

```sh
# OpenAI API Key - Head to https://platform.openai.com/account/api-keys to get your API key
# OPENAI_API_KEY=your-openai-api-key
# Google Cloud API Key - Head to https://console.cloud.google.com/apis/credentials to get your API key
# GOOGLE_API_KEY=your-google-api-key
# OpenRouter API Key - Head to https://openrouter.ai/ to get your API key
# OPENROUTER_API_KEY=your-openrouter-api-key
# Mistral API Key - Head to https://mistral.ai/ to get your API key
# MISTRAL_API_KEY=your-mistral-api-key
# GROQ API Key - Head to https://groq.com/ to get your API key
# GROQ_API_KEY=your-groq-api-key
# Custom Port for the AI Server
PORT=5005
```

These keys are used during AI Server's initial database setup to configure the AI providers whose keys you *uncomment and provide*.

### Run Docker Compose

Start the AI Server with Docker Compose:

```sh
docker compose up
```
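
If you'd prefer to keep AI Server running in the background, Docker Compose's standard detached mode works here too:

```sh
# Run the containers in the background, then follow the logs when needed
docker compose up -d
docker compose logs -f
```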

## Accessing AI Server

Once the AI Server is running, you can access the Admin UI at [http://localhost:5005](http://localhost:5005) to configure your AI providers and generate API keys.
If you first ran AI Server with API Keys configured in your `.env` file, the related providers will already be configured automatically.

> You can reset the process by deleting your local `App_Data` directory and rerunning `docker compose up`.
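
Once you have an API Key, requests to AI Server can be authenticated by passing it as a Bearer token. The snippet below is a rough sketch of calling the JSON API with `curl`; the `OpenAiChatCompletion` request name, its fields, and the model name are assumptions for illustration, so confirm the exact contract in the built-in API Explorer at [http://localhost:5005/ui](http://localhost:5005/ui):

```sh
# Replace with an API Key generated in the Admin UI
export AI_SERVER_API_KEY=your-api-key

# Hypothetical chat request - verify the real DTO name and fields in the API Explorer (/ui)
curl -s http://localhost:5005/api/OpenAiChatCompletion \
  -H "Authorization: Bearer $AI_SERVER_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-4o-mini","messages":[{"role":"user","content":"Hello from AI Server!"}]}'
```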
