Status: Early development (v0.3.0-alpha)
Goal: A security-focused development assistant with support for multiple AI providers
Next-Generation AI-Powered Development Platform with Multi-Provider Support
- OpenRouter - Access to 100+ AI models through one API
- OpenAI - GPT-4o, GPT-4o-mini, GPT-3.5-turbo
- Anthropic - Claude 3.5 Sonnet, Claude 3 Haiku
- Google - Gemini Pro, Gemini Flash
- Groq - Ultra-fast Llama 3.1, Mixtral models
- Together AI - Open-source model hosting
- Cohere - Command R+ models
- Ollama - Local model execution (always-available fallback)
- Priority-based routing - Use your preferred providers first (a conceptual sketch of this routing loop follows the feature list)
- Cost optimization - Automatically select cheapest available provider
- Performance optimization - Route to fastest responding provider
- Automatic failover - Seamless fallback when providers fail
- Load balancing - Distribute requests across providers
- Code Completion - Context-aware suggestions in any language
- Code Analysis - Security, performance, quality analysis
- Documentation Generation - Auto-generate comprehensive docs
- Test Generation - Create unit tests automatically
- Code Explanation - Understand complex code instantly
- Code Refactoring - Improve code structure and quality
- Language Translation - Convert code between languages
- Real-time Streaming - Live completion responses
- Cost Tracking - Monitor API usage and costs
- Analytics Dashboard - Provider performance metrics
- Rate Limiting - Control API usage
- Authentication - JWT-based security
- Caching - Reduce API calls and costs
- Health Monitoring - Real-time provider status
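The routing features above (priority order, failover, cost and load strategies) boil down to an ordered try-and-fall-through loop. Below is a conceptual TypeScript sketch of priority-based selection with automatic failover; it is only an illustration of the strategy, not the backend's actual Rust router, and the `Provider` shape and `callProvider` helper are hypothetical.

```typescript
// Conceptual sketch only: the real router lives in the Rust backend.
interface Provider {
  name: string;
  priority: number;     // higher = preferred, mirroring the *_PRIORITY settings in .env
  costPerToken: number; // what a cost-optimization strategy would sort on instead
}

// Hypothetical stand-in for a real provider call (OpenRouter, OpenAI, Ollama, ...).
async function callProvider(provider: Provider, prompt: string): Promise<string> {
  throw new Error(`provider ${provider.name} is not wired up in this sketch`);
}

// Try providers in descending priority; on failure, fall through to the next one.
async function completeWithFailover(providers: Provider[], prompt: string): Promise<string> {
  const ordered = [...providers].sort((a, b) => b.priority - a.priority);
  for (const provider of ordered) {
    try {
      return await callProvider(provider, prompt);
    } catch (err) {
      console.warn(`${provider.name} failed, falling back to the next provider`, err);
    }
  }
  throw new Error("all providers failed");
}
```

A cost-optimized strategy would sort on `costPerToken` instead of `priority`; a performance-optimized one would sort on recent latency measurements.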
git clone https://github.com/Tehlikeli107/universal-ai-dev-assistant.git
cd universal-ai-dev-assistant
cp .env.example .env
# Edit .env with your API keys
docker-compose up -d
# Backend
cd backend
cargo run
# Frontend (in another terminal)
cd frontend
npm install
npm start
# OpenRouter (Recommended - Access to 100+ models)
OPENROUTER_API_KEY=your_openrouter_key_here
# OpenAI
OPENAI_API_KEY=your_openai_key_here
# Anthropic Claude
ANTHROPIC_API_KEY=your_anthropic_key_here
# Google Gemini
GOOGLE_API_KEY=your_google_key_here
# Groq (Free tier available)
GROQ_API_KEY=your_groq_key_here
# Ollama (Local - No API key needed)
OLLAMA_BASE_URL=http://localhost:11434
# Higher number = higher priority (with the values below, OpenRouter is tried first and Ollama serves as the local fallback)
OPENROUTER_PRIORITY=9
OPENAI_PRIORITY=8
ANTHROPIC_PRIORITY=8
GROQ_PRIORITY=6
OLLAMA_PRIORITY=3
curl http://localhost:8080/health
curl -X POST http://localhost:8080/api/v1/complete \
-H "Content-Type: application/json" \
-d '{
"prompt": "def fibonacci(n):",
"language": "python",
"max_tokens": 100
}'
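The same request can be sent from code. This is a minimal TypeScript sketch using the built-in `fetch` of Node 18+ and only the request fields shown in the curl example; the response schema is not documented here, so it is printed as raw JSON.

```typescript
// Minimal sketch: POST the documented completion payload and print the raw response.
async function complete(): Promise<void> {
  const res = await fetch("http://localhost:8080/api/v1/complete", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      prompt: "def fibonacci(n):",
      language: "python",
      max_tokens: 100,
    }),
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  console.log(await res.json()); // no assumptions about the response shape
}

complete().catch(console.error);
```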
curl -X POST http://localhost:8080/api/v1/analyze \
-H "Content-Type: application/json" \
-d '{
"code": "function add(a, b) { return a + b; }",
"language": "javascript",
"analysis_type": "security"
}'
# Generate documentation
curl -X POST http://localhost:8080/api/v1/code/action \
-H "Content-Type: application/json" \
-d '{
"code": "def quicksort(arr): ...",
"language": "python",
"action": "document"
}'
# Generate tests
curl -X POST http://localhost:8080/api/v1/code/action \
-H "Content-Type: application/json" \
-d '{
"code": "function add(a, b) { return a + b; }",
"language": "javascript",
"action": "test"
}'
# Translate code
curl -X POST http://localhost:8080/api/v1/code/action \
-H "Content-Type: application/json" \
-d '{
"code": "def hello(): print(\"Hello\")",
"language": "python",
"action": "translate",
"target_language": "rust"
}'
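All three code-action examples hit the same /api/v1/code/action endpoint and differ only in the `action` field (plus `target_language` for translation), so a single wrapper covers them. A hedged TypeScript sketch that reuses exactly the fields from the curl examples and treats the response as opaque JSON:

```typescript
// Hedged sketch: one wrapper for the shared /api/v1/code/action endpoint.
type CodeAction = "document" | "test" | "translate";

async function codeAction(
  code: string,
  language: string,
  action: CodeAction,
  targetLanguage?: string, // only needed for "translate", as in the curl example
): Promise<unknown> {
  const res = await fetch("http://localhost:8080/api/v1/code/action", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      code,
      language,
      action,
      ...(targetLanguage ? { target_language: targetLanguage } : {}),
    }),
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json();
}

// Example mirroring the translate request above.
codeAction('def hello(): print("Hello")', "python", "translate", "rust")
  .then(console.log)
  .catch(console.error);
```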
# List available providers
curl http://localhost:8080/api/v1/providers
# List available models
curl http://localhost:8080/api/v1/models
# Get metrics
curl http://localhost:8080/api/v1/metrics
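For a quick overview, the three management endpoints can be polled together. A small sketch that fetches each documented endpoint and prints the raw JSON, making no assumptions about the response shapes:

```typescript
// Print the raw responses of the documented management endpoints.
async function showStatus(base = "http://localhost:8080"): Promise<void> {
  for (const path of ["/api/v1/providers", "/api/v1/models", "/api/v1/metrics"]) {
    const res = await fetch(base + path);
    console.log(path, res.status, await res.json());
  }
}

showStatus().catch(console.error);
```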
┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│    Frontend     │    │     Backend     │    │  AI Providers   │
│     (React)     │◄──►│     (Rust)      │◄──►│   (Multiple)    │
│                 │    │                 │    │                 │
│ • Dashboard     │    │ • Provider      │    │ • OpenRouter    │
│ • Code Editor   │    │   Router        │    │ • OpenAI        │
│ • Analytics     │    │ • Load Balancer │    │ • Anthropic     │
│ • Settings      │    │ • Cost Tracker  │    │ • Google        │
└─────────────────┘    │ • Health Monitor│    │ • Groq          │
                       │ • Rate Limiter  │    │ • Ollama        │
┌─────────────────┐    │ • Caching       │    │ • Together      │
│   VSCode Ext    │◄──►│ • Analytics     │    │ • Cohere        │
│                 │    └─────────────────┘    └─────────────────┘
│ • Completions   │
│ • Code Actions  │
│ • Diagnostics   │
└─────────────────┘
- Modern React Dashboard - Beautiful, responsive UI
- Real-time Provider Status - Live health monitoring
- Cost Analytics - Track usage and spending
- Model Comparison - Compare provider performance
- Code Playground - Test completions interactively
- Settings Management - Configure providers and preferences
- Intelligent Code Completion - Context-aware suggestions
- Code Actions - Quick fixes and improvements
- Hover Documentation - Instant code explanations
- Diagnostics - Real-time code analysis
- Multi-provider Support - Choose your preferred AI
# docker-compose.yml
version: '3.8'
services:
  backend:
    build: ./backend
    ports:
      - "8080:8080"
    environment:
      - OPENROUTER_API_KEY=${OPENROUTER_API_KEY}
      - OPENAI_API_KEY=${OPENAI_API_KEY}
  frontend:
    build: ./frontend
    ports:
      - "3000:3000"
    depends_on:
      - backend
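After `docker-compose up -d`, the backend can take a few seconds to become reachable. The sketch below polls the documented /health endpoint until it answers; the 30-attempt limit and 1-second interval are arbitrary choices for illustration, not project defaults.

```typescript
// Wait until the backend's /health endpoint responds, or give up after ~30 seconds.
async function waitForBackend(url = "http://localhost:8080/health"): Promise<void> {
  for (let attempt = 0; attempt < 30; attempt++) {
    try {
      const res = await fetch(url);
      if (res.ok) {
        console.log("backend is healthy");
        return;
      }
    } catch {
      // backend not reachable yet; keep polling
    }
    await new Promise((resolve) => setTimeout(resolve, 1000));
  }
  throw new Error("backend did not become healthy in time");
}

waitForBackend().catch(console.error);
```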
- Sub-100ms Response Times - Optimized for speed
- Horizontal Scaling - Multiple backend instances
- Intelligent Caching - Reduce API calls by 60%
- Connection Pooling - Efficient resource usage
- Rate Limiting - Prevent API abuse
- Health Checks - Automatic failover
- Provider Cost Comparison - Always use cheapest option
- Usage Analytics - Track spending per provider
- Free Tier Maximization - Use free providers first
- Caching Strategy - Avoid duplicate API calls
- Token Optimization - Minimize prompt sizes
- API Key Management - Secure credential storage
- Rate Limiting - Prevent abuse
- Input Validation - Sanitize all inputs
- CORS Protection - Secure cross-origin requests
- JWT Authentication - Secure API access
- Audit Logging - Track all API usage
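Because API access is JWT-based, protected endpoints will expect a token with each request. The sketch below assumes the conventional `Authorization: Bearer <token>` header; how tokens are issued is not covered here, so the token value is a placeholder.

```typescript
// Hedged sketch: attach a JWT using the standard Bearer scheme (assumed, not documented here).
async function authorizedComplete(token: string): Promise<unknown> {
  const res = await fetch("http://localhost:8080/api/v1/complete", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${token}`,
    },
    body: JSON.stringify({ prompt: "def fibonacci(n):", language: "python", max_tokens: 100 }),
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  return res.json();
}
```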
# Backend tests
cd backend
cargo test
# Frontend tests
cd frontend
npm test
# Integration tests
npm run test:integration
# Load testing
npm run test:load
- Fork the repository
- Create a feature branch: `git checkout -b feature/amazing-feature`
- Commit your changes: `git commit -m 'Add amazing feature'`
- Push to the branch: `git push origin feature/amazing-feature`
- Open a Pull Request
This project is licensed under the MIT License - see the LICENSE file for details.
- OpenRouter - For providing access to multiple AI models
- Rust Community - For the amazing ecosystem
- React Team - For the excellent frontend framework
- All AI Providers - For making this possible
- GitHub Issues - Bug reports and feature requests
- Discussions - Community support and ideas
- Documentation - Comprehensive guides and examples
Made with ❤️ by the Universal AI Development Assistant Team
Empowering developers with the best AI tools available