Powered by OSS-20B | 100% Offline & Private | Universal Transport | One-Command Setup
MCPlease is a universal MCP (Model Context Protocol) server that works everywhere - in your IDE, with Continue.dev, or directly via CLI. It provides intelligent code completion, explanations, and debugging help using a locally-hosted language model, ensuring your code never leaves your machine.
Before MCPlease: multiple setup steps, manual configuration, and transport-specific setup.
With MCPlease: run ./start.sh and it works everywhere: IDE, Continue.dev, CLI, and Docker.
Built for developers who want AI coding assistance that works in any environment without cloud dependencies.
- Complete Privacy: All AI processing happens locally; your code never leaves your machine
- Universal Transport: Works via stdio (IDE) or HTTP (Continue.dev, web clients)
- One-Command Setup: ./start.sh auto-detects your environment and configures everything
- Professional AI: Full OSS-20B model for production-quality coding assistance
- Cross-Platform: Works on macOS, Linux, and Windows
- IDE Ready: Seamless integration with Cursor, VS Code, and Continue.dev
- AI-Native: Built for developers who code with AI
- Python 3.9+
- 15GB+ free disk space (for OSS-20B model)
One command works on any system:
# macOS/Linux
./install.sh
# Windows
install.bat
The installer automatically:
- Detects your OS (macOS, Ubuntu, CentOS, Arch, Windows)
- Finds a package manager (apt, yum, dnf, pacman, brew, chocolatey)
- Installs Python if needed
- Installs Docker (optional)
- Creates a virtual environment
- Installs dependencies
- Downloads the OSS-20B model (optional)
- Sets up IDE configurations
- Tests everything
Test the Installer First (Recommended)
Test what the installer would do without installing anything:
# Test individual functions
make test-installer
# Show complete installation plan
make test-installer-dry-run
Choose Your Setup Method
Option 1: Universal Installer (Recommended)
One command works on any system:
# macOS/Linux
./install.sh
# Windows
install.bat
The installer automatically detects your system and sets up everything.
Option 2: Docker Setup
One command starts everything:
# Simple Docker (default)
./start-docker.sh
# Production Docker Stack
./start-docker.sh prod
# Development Docker Stack
./start-docker.sh dev
Option 3: Manual Setup
For advanced users who prefer manual configuration:
# Setup virtual environment
python3 -m venv .venv
source .venv/bin/activate # On Windows: .venv\Scripts\activate
# Install dependencies
pip install -r requirements.txt
# Start server manually
python mcplease_mcp_server.py --transport stdio # For IDE
python mcplease_http_server.py --port 8000 # For HTTP
Choose Your Docker Setup
Simple Docker (Default)
Perfect for development and testing:
./start-docker.sh
# or
make docker-start
Features:
- Single container with MCP server
- HTTP transport on port 8000
- Health checks enabled
- Auto-restart on failure
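The features above roughly correspond to a single-service Compose file. This is a hypothetical sketch for orientation only (the service name, build settings, and /health endpoint are assumptions); the docker-compose.yml shipped in the repository is authoritative:

```yaml
# Hypothetical sketch -- see the repository's docker-compose.yml for the real config.
services:
  mcplease:
    build: .                                  # assumes a Dockerfile at the repo root
    command: python mcplease_http_server.py --port 8000
    ports:
      - "8000:8000"                           # HTTP transport on port 8000
    restart: unless-stopped                   # auto-restart on failure
    healthcheck:                              # health checks enabled
      test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
      interval: 30s
      retries: 3
```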
Production Docker Stack
Enterprise-grade with monitoring:
./start-docker.sh prod
# or
make docker-prod
Features:
- Load balanced MCP servers (2x instances)
- HAProxy load balancer
- Monitoring stack (Prometheus, Grafana, Loki)
- High availability with health checks
- Alerting via Alertmanager
Development Docker Stack
Team development with hot reload:
./start-docker.sh dev
# or
make docker-dev
Features:
- Hot reload for development
- Nginx reverse proxy
- Redis caching
- Volume mounting for live code changes
# Stop containers
make docker-stop
# View logs
make docker-logs
# Or direct commands
docker-compose down
docker-compose logs -f
Choose Your Transport
IDE Integration (stdio) - Default
Works in Cursor and VS Code:
./start.sh
Features:
- Protocol: MCP via stdio
- Setup: Automatic IDE detection and configuration
- Use: Workspace Tools → MCP → MCPlease
- Performance: Direct communication, no network overhead
Continue.dev Integration (HTTP)
Works with Continue.dev and web clients:
./start.sh --http
Features:
- Protocol: HTTP REST API
- Port: Auto-detects available port (8000+)
- Use: Continue.dev extension or direct HTTP calls
- Access: From any device on your network
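For direct HTTP calls without Continue.dev, a plain JSON-RPC POST is enough. A minimal Python sketch, assuming the default port 8000; the URL path shown is a placeholder (check the server's startup output for the real route):

```python
import json
import urllib.request

def build_payload(method, params=None, req_id=1):
    """Build a JSON-RPC 2.0 request body as UTF-8 bytes."""
    body = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        body["params"] = params
    return json.dumps(body).encode("utf-8")

def send(url, payload):
    """POST the payload and return the decoded JSON response.

    Requires a running server (./start.sh --http)."""
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

payload = build_payload("tools/list")
print(payload.decode("utf-8"))
# With the server running (URL path is a hypothetical example):
#   send("http://127.0.0.1:8000/mcp", payload)
```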
CLI Direct (stdio)
Terminal automation and scripts:
python mcplease_mcp_server.py --transport stdio
Features:
- Protocol: MCP via stdio
- Use: Direct command-line interaction
- Automation: Perfect for CI/CD and scripts
- Integration: Easy to pipe into other tools
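Because the stdio transport reads newline-delimited JSON-RPC 2.0 messages on stdin, it is easy to script. A sketch assuming standard MCP framing (one JSON object per line); the protocol version string and client info are illustrative values, and run_once assumes you execute it from the repo checkout:

```python
import json
import subprocess

# One MCP message is a single JSON-RPC 2.0 object on one line of stdin.
init = json.dumps({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",   # illustrative version string
        "capabilities": {},
        "clientInfo": {"name": "cli-demo", "version": "0.1"},
    },
}) + "\n"

def run_once(request_line):
    """Pipe one request into the stdio server and return its raw stdout.

    Requires the repo checkout (and its virtualenv) to be set up."""
    proc = subprocess.run(
        ["python", "mcplease_mcp_server.py", "--transport", "stdio"],
        input=request_line, capture_output=True, text=True, timeout=60,
    )
    return proc.stdout

print(init, end="")
# With the server available:  print(run_once(init))
```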
Setup Your IDE
Automatic Setup (Recommended)
One command configures everything:
python scripts/setup_ide.py
This creates MCP configurations for:
- Cursor: ~/.cursor/mcp.json
- VS Code: ~/.vscode/mcp.json
- Continue.dev: .continue/config.json
Manual Configuration
For Cursor/VS Code, create ~/.cursor/mcp.json:
{
"mcpServers": {
"MCPlease": {
"command": "/path/to/mcplease/.venv/bin/python",
"args": ["-u", "mcplease_mcp_server.py"],
"cwd": "/path/to/mcplease",
"env": {"PYTHONUNBUFFERED": "1"},
"enabled": true
}
}
}
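If you prefer scripting this step, the same JSON can be generated with a few lines of Python. A sketch; the project path is a placeholder, just as in the example above:

```python
import json
from pathlib import Path

project = Path("/path/to/mcplease")   # placeholder -- point at your checkout

# Mirror the manual configuration shown above.
config = {
    "mcpServers": {
        "MCPlease": {
            "command": str(project / ".venv" / "bin" / "python"),
            "args": ["-u", "mcplease_mcp_server.py"],
            "cwd": str(project),
            "env": {"PYTHONUNBUFFERED": "1"},
            "enabled": True,
        }
    }
}

target = Path.home() / ".cursor" / "mcp.json"
# Uncomment to write the file for real:
# target.parent.mkdir(parents=True, exist_ok=True)
# target.write_text(json.dumps(config, indent=2))
print(json.dumps(config, indent=2))
```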
MCP Tools Overview
MCPlease provides these MCP tools:
File Operations:
- file/read - Read file contents for analysis
- file/write - Write or modify file content
- file/list - List files in a directory
Terminal & Code:
- terminal/run - Execute terminal commands
- codebase/search - Search the codebase for patterns
AI-Powered Tools:
- ai/analyze - Analyze code using the OSS-20B model
- ai/build - Generate code using the OSS-20B model
System Tools:
- health/check - Server health and status
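Calling one of these tools goes through MCP's tools/call method. A sketch of such a request for file/read; the argument key ("path") is an assumption, since the server's actual input schema is what tools/list reports:

```python
import json

# Sketch of an MCP tools/call request targeting the file/read tool above.
# The "arguments" shape is hypothetical; query tools/list for the real schema.
call = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {
        "name": "file/read",
        "arguments": {"path": "README.md"},
    },
}
print(json.dumps(call, indent=2))
```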
Performance Metrics
Response Times:
- Fallback Mode: Instant responses
- AI Mode: Model-dependent (first call loads model, subsequent calls are fast)
Resource Usage:
- Memory: Minimal (fallback mode), optimized (AI mode)
- Setup Time: ~10 seconds total
- Context Length: Unlimited (fallback mode), model-dependent (AI mode)
┌─────────────────┐     ┌──────────────────┐     ┌─────────────────┐
│   IDE/Client    │────▶│    MCP Server    │────▶│   OSS-20B AI    │
│  (Cursor/VS)    │     │   (Universal)    │     │     (Local)     │
└─────────────────┘     └──────────────────┘     └─────────────────┘
                                 │
                        ┌──────────────────┐
                        │ Transport Layer  │
                        │   stdio | HTTP   │
                        └──────────────────┘
Development & Testing
Project Structure
mcplease/
├── start.sh                  # Universal startup script
├── mcplease_mcp_server.py    # MCP server with stdio transport
├── mcplease_http_server.py   # HTTP server for Continue.dev
├── scripts/                  # Setup and utility scripts
├── tests/                    # Comprehensive test suite
└── docs/                     # Documentation
Running Tests
# Test all transports
python scripts/test_transports.py
# Run comprehensive test suite
python scripts/run_comprehensive_tests.py
# Run specific test categories
python -m pytest tests/ -v
Testing Transports
# Test stdio transport
python mcplease_mcp_server.py --transport stdio
# Test HTTP transport
python mcplease_http_server.py --port 8000
# Test both transports
python scripts/test_transports.py
Production Setup
Docker Deployment
# Build and run with Docker Compose
docker-compose -f docker-compose.production.yml up -d
# Includes HAProxy, monitoring, and health checks
Monitoring Stack
- Prometheus metrics collection
- Grafana dashboards
- Loki log aggregation
- Alertmanager notifications
We welcome contributions!
- Fork the repository
- Create a feature branch: git checkout -b feature-name
- Make your changes and add tests
- Submit a pull request
This project is licensed under the MIT License.
Common Issues & Solutions
Transport Issues
- stdio not working: run python scripts/setup_ide.py and restart your IDE
- HTTP not working: check that the port is available, then try ./start.sh --http
- IDE not detecting the server: verify the MCP configuration in ~/.cursor/mcp.json
Model Issues
- Model not found: run python download_model.py first
- Slow responses: the OSS-20B model loads on first use; subsequent calls are fast
- Memory issues: the model uses ~13GB of RAM; ensure sufficient free memory
- Documentation: Check our documentation files
- Issues: Report bugs on GitHub Issues
- Questions: Open a GitHub Discussion
- MCP Protocol for the universal server-client communication
- FastAPI for the high-performance HTTP server
- OSS-20B for the powerful local AI model