CodeRunner is an MCP (Model Context Protocol) server that executes AI-generated code in a sandboxed environment on your Mac using Apple's native containers.
Key use case: Process your local files (videos, images, documents, data) with remote LLMs like Claude or ChatGPT without uploading your files to the cloud. The LLM generates code that runs locally on your machine to analyze, transform, or process your files.
| Without CodeRunner | With CodeRunner |
|---|---|
| LLM writes code, you run it manually | LLM writes and executes code, returns results |
| Upload files to cloud for AI processing | Files stay on your machine, processed locally |
| Install tools and dependencies yourself | Tools available in sandbox, auto-installs others |
| Copy/paste scripts to run elsewhere | Code runs immediately, shows output/files |
| LLM analyzes text descriptions of files | LLM directly processes your actual files |
| Manage Python environments and packages | Pre-configured environment ready to use |
Prerequisites: a Mac with Apple Silicon (M1/M2/M3/M4) running macOS, and Python 3.10+
git clone https://github.com/BandarLabs/coderunner.git
cd coderunner
chmod +x install.sh
sudo ./install.sh
The MCP server will be available at: http://coderunner.local:8222/sse
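Optionally, you can confirm the server is reachable before wiring up a client. The sketch below is an assumption-laden example, not part of CodeRunner itself: it uses the official MCP Python SDK (`pip install mcp`) to connect to the SSE endpoint and list the tools the server exposes.

```python
# Connectivity check: lists the tools exposed by the CodeRunner MCP server.
# Assumes the MCP Python SDK is installed (`pip install mcp`) and the server
# started by install.sh is running at the default address below.
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

SERVER_URL = "http://coderunner.local:8222/sse"

async def main() -> None:
    async with sse_client(SERVER_URL) as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            tools = await session.list_tools()
            print("Available tools:", [tool.name for tool in tools.tools])

asyncio.run(main())
```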
Install the required packages for the examples (use a virtualenv and note the Python path):
pip install -r examples/requirements.txt
Configure Claude Desktop to use CodeRunner as an MCP server:
- Copy the example configuration:

  cd examples
  cp claude_desktop/claude_desktop_config.example.json claude_desktop/claude_desktop_config.json

- Edit the configuration file and replace the placeholder paths:

  - Replace /path/to/your/python with your actual Python path (e.g., /usr/bin/python3 or /opt/homebrew/bin/python3)
  - Replace /path/to/coderunner with the actual path to your cloned repository

  Example after editing:

  {
    "mcpServers": {
      "coderunner": {
        "command": "/opt/homebrew/bin/python3",
        "args": ["/Users/yourname/coderunner/examples/claude_desktop/mcpproxy.py"]
      }
    }
  }

- Update the Claude Desktop configuration:

  - Open Claude Desktop
  - Go to Settings → Developer
  - Add the MCP server configuration
  - Restart Claude Desktop

- Start using CodeRunner in Claude: You can now ask Claude to execute code, and it will run safely in the sandbox!
Use CodeRunner with OpenAI's Python agents library:
- Set your OpenAI API key:

  export OPENAI_API_KEY="your-openai-api-key-here"

- Run the client:

  python examples/openai_agents/openai_client.py

- Start coding: Enter prompts like "write python code to generate 100 prime numbers" and watch it execute safely in the sandbox!
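If you prefer to wire things up yourself rather than run the bundled client, the sketch below shows one way to point the OpenAI Agents SDK at the CodeRunner SSE endpoint. It is an illustration, not the contents of openai_client.py: the agent name and instructions are made up, and it assumes `pip install openai-agents` plus OPENAI_API_KEY set in your environment.

```python
# Hypothetical minimal agent wired to the CodeRunner MCP server (not the
# bundled examples/openai_agents/openai_client.py). Assumes the openai-agents
# package is installed and OPENAI_API_KEY is exported.
import asyncio

from agents import Agent, Runner
from agents.mcp import MCPServerSse

async def main() -> None:
    async with MCPServerSse(params={"url": "http://coderunner.local:8222/sse"}) as server:
        agent = Agent(
            name="coderunner-agent",  # name and instructions are illustrative
            instructions="Use the sandbox tools to run code and report the results.",
            mcp_servers=[server],
        )
        result = await Runner.run(agent, "Write python code to generate 100 prime numbers.")
        print(result.final_output)

asyncio.run(main())
```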
Gemini CLI was recently launched by Google. To use CodeRunner with it, add the server to your ~/.gemini/settings.json:
{
"theme": "Default",
"selectedAuthType": "oauth-personal",
"mcpServers": {
"coderunner": {
"url": "http://coderunner.local:8222/sse"
}
}
}
Code runs in an isolated container with VM-level isolation. Your host system and files outside the sandbox remain protected.
From @apple/container:
Each container has the isolation properties of a full VM, using a minimal set of core utilities and dynamic libraries to reduce resource utilization and attack surface.
CodeRunner consists of:
- Sandbox Container: Isolated execution environment with Jupyter kernel
- MCP Server: Handles communication between AI models and the sandbox (see the sketch below)
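To make that flow concrete, here is a hedged sketch of how a client drives the pipeline: it connects to the MCP server over SSE and asks it to run a snippet in the sandboxed Jupyter kernel. The tool name and argument key are assumptions about CodeRunner's interface; check the output of list_tools() for the real schema.

```python
# Illustrative only: the tool name "execute_python_code" and the "code" argument
# are assumptions, not CodeRunner's documented interface; use list_tools() to
# discover the actual schema. Assumes the MCP Python SDK (`pip install mcp`).
import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client

async def main() -> None:
    async with sse_client("http://coderunner.local:8222/sse") as (read_stream, write_stream):
        async with ClientSession(read_stream, write_stream) as session:
            await session.initialize()
            # The MCP server forwards this request to the Jupyter kernel inside
            # the sandbox container and returns whatever the code produced.
            result = await session.call_tool(
                "execute_python_code",             # hypothetical tool name
                {"code": "print(sum(range(10)))"},
            )
            for block in result.content:
                if block.type == "text":
                    print(block.text)

asyncio.run(main())
```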
The examples/ directory contains:
- openai_agents/ - Example OpenAI agents integration
- claude_desktop/ - Example Claude Desktop integration
We welcome contributions! Please see CONTRIBUTING.md for guidelines.
This project is licensed under the Apache 2.0 License - see the LICENSE file for details.