
Commit 4a3e477

Improvements.

1 parent: 460b06f

File tree: 1 file changed (+4, -4 lines)


MyApp/_pages/ai-server/index.md

Lines changed: 4 additions & 4 deletions
@@ -5,17 +5,17 @@ description: Introduction to AI Server and its key features
 
 # Overview
 
-AI Server is an independent microservice designed to provide a comprehensive suite of AI features for your applications. It serves as a private gateway to process LLM, AI, and image transformation requests, dynamically delegating tasks across multiple providers including Ollama, OpenRouter, Replicate, Comfy UI, Whisper, and ffmpeg.
+AI Server is a way to orchestrate your AI requests through a single self-hosted application to control what AI Providers you use without impacting your client integrations. It serves as a private gateway to process LLM, AI, and image transformation requests, dynamically delegating tasks across multiple providers including Ollama, OpenRouter, Replicate, Comfy UI, Whisper, and ffmpeg.
 
 
 ## Key Features
 
 - **Unified AI Gateway**: Centralize all your AI requests through a single self-hosted service.
 - **Multi-Provider Support**: Seamlessly integrate with Ollama, Open AI, Comfy UI, and more.
-- **Type-Safe Integrations**: Native end-to-end typed integrations for popular languages including C#, TypeScript, Python, and more.
+- **Type-Safe Integrations**: Native end-to-end typed integrations for 11 popular languages including C#, TypeScript, Python, and more.
 - **Secure Access**: API key authentication to protect your AI resources.
-- **Managed File Storage**: Built-in CDN-hostable storage for AI-generated assets.
-- **Background Job Processing**: Efficient handling of long-running AI tasks.
+- **Managed File Storage**: Built-in cached storage for AI-generated assets.
+- **Background Job Processing**: Efficient handling of long-running AI tasks, capable of distributing workloads to many different providers, both managed and self-hosted.
 - **Custom Deployment**: Run as a single Docker container, with optional GPU-equipped agents for advanced tasks.
 
 ## Why Use AI Server?
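As context for the "Type-Safe Integrations" and "Secure Access" bullets changed above, here is a minimal TypeScript sketch of what a typed client call against a self-hosted AI Server can look like. Only `JsonServiceClient` comes from the `@servicestack/client` package; the endpoint URL, the API key placeholder, and the `OpenAiChatCompletion` DTO shape below are illustrative stand-ins for the DTOs you would generate from your own AI Server instance, not a definitive API reference.

```typescript
import { JsonServiceClient } from "@servicestack/client"

// Illustrative stand-in for a generated request DTO; real DTOs are generated
// from your own AI Server instance and may differ in name and shape.
class OpenAiChatCompletion {
    model?: string
    messages?: { role: string, content: string }[]
    constructor(init?: Partial<OpenAiChatCompletion>) { Object.assign(this, init) }
    getTypeName() { return "OpenAiChatCompletion" }   // DTO contract used by JsonServiceClient
    getMethod() { return "POST" }
    createResponse() { return {} as any }
}

async function main() {
    // Point the client at your self-hosted AI Server (placeholder URL)
    const client = new JsonServiceClient("https://your-ai-server.example.org")
    // Authenticate with the server's API key, sent as a bearer token (placeholder value)
    client.bearerToken = "YOUR_AI_SERVER_API_KEY"

    const api = await client.api(new OpenAiChatCompletion({
        model: "llama3.1:8b",
        messages: [{ role: "user", content: "Hello from a typed client!" }],
    }))

    if (api.succeeded)
        console.log(api.response)
    else
        console.error(api.error)
}

main()
```

The same pattern applies to the other supported languages: a generated, typed request DTO plus a generic service client, so switching the AI Provider behind the gateway does not change the client integration.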
