A Docker-based framework for designing and implementing AI agent workflows using the Model Context Protocol (MCP) with local Ollama models and Smithery.ai integration.
## Architecture

This system consists of three main components:

- MCP Server: A service that implements the MCP protocol and communicates with Ollama for model responses
- Workflow Orchestrator: Manages multi-step workflows by connecting different MCP servers and Smithery.ai agents
- MCP Client: A web interface to create and execute agent workflows
## Prerequisites

- Docker and Docker Compose
- Ollama installed locally (running on port 11434)
- Desired models pulled in Ollama, e.g.:

  ```shell
  ollama pull llama3:latest
  ```
## Configuration

Copy the example environment file:

```shell
cp .env.example .env
```
Modify the following settings as needed:

- `OLLAMA_BASE_URL`: URL to your Ollama instance (default: `http://host.docker.internal:11434`)
- `OLLAMA_MODEL`: Default model to use (default: `llama3:latest`)
- `SMITHERY_API_KEY`: API key for Smithery.ai integration
- `SMITHERY_REGISTRY_URL`: URL for the Smithery.ai registry (default: `https://registry.smithery.ai`)
- `EXTERNAL_MCP_SERVERS`: Optional comma-separated list of external MCP servers
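Taken together, a filled-in `.env` might look like the following. Every value here is illustrative (the API key and the external server URLs in particular are placeholders, not real endpoints):

```shell
# Illustrative .env -- substitute your own key and server URLs
OLLAMA_BASE_URL=http://host.docker.internal:11434
OLLAMA_MODEL=llama3:latest
SMITHERY_API_KEY=your-smithery-api-key
SMITHERY_REGISTRY_URL=https://registry.smithery.ai
EXTERNAL_MCP_SERVERS=http://mcp-extra-1:8000,http://mcp-extra-2:8000
```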
## Usage

- Start the services:

  ```shell
  docker-compose up -d
  ```

- Access the web interface: open your browser and navigate to http://localhost:8002
- Create a workflow:
  - Enter an input prompt
  - Define one or more workflow steps
  - Run the workflow
## API Endpoints

- MCP Server: http://localhost:8000
  - `/v1/chat` - MCP-compatible chat endpoint
- Workflow Orchestrator: http://localhost:8001
  - `/v1/workflow` - Workflow execution endpoint
  - `/v1/mcp-servers` - List available MCP servers
  - `/v1/test-smithery` - Test connection to a Smithery.ai agent
- MCP Client: http://localhost:8002
  - Web interface for creating and running workflows
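As a quick smoke test, the chat endpoint can be exercised directly once the services are up. A minimal sketch follows; the model/messages body shape is assumed from common chat APIs, so adjust it to match the server's actual schema:

```python
import json
import urllib.request

def build_chat_request(prompt: str, model: str = "llama3:latest") -> dict:
    """Build a single-turn chat body (assumed model/messages shape)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(prompt: str, base_url: str = "http://localhost:8000") -> dict:
    """POST the request to the MCP server's /v1/chat endpoint."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat",
        data=json.dumps(build_chat_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```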
## Example Workflow

- Create a two-step workflow:
  - Step 1: "Research" - Generate information about a topic
  - Step 2: "Summarize" - Take the research and create a concise summary
- Enter a prompt like "Tell me about the history of artificial intelligence"
- Run the workflow to get both detailed research and a summary
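The same two-step workflow can be submitted to the orchestrator's `/v1/workflow` endpoint from a script. The request schema below (field names `input`, `steps`, `name`, `prompt_template`) is an assumption for illustration, not the orchestrator's documented schema; check the orchestrator service for the actual shape:

```python
import json
import urllib.request

def build_workflow_request(topic: str) -> dict:
    """Hypothetical two-step workflow body -- field names are assumptions."""
    return {
        "input": topic,
        "steps": [
            {"name": "Research",
             "prompt_template": "Research this topic in depth: {input}"},
            {"name": "Summarize",
             "prompt_template": "Write a concise summary of: {previous}"},
        ],
    }

def run_workflow(topic: str, base_url: str = "http://localhost:8001") -> dict:
    """POST the workflow to the orchestrator and return its parsed response."""
    req = urllib.request.Request(
        f"{base_url}/v1/workflow",
        data=json.dumps(build_workflow_request(topic)).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```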
## Adding External MCP Servers

Edit the `.env` file and add external MCP servers to the `EXTERNAL_MCP_SERVERS` variable as a comma-separated list.
## Smithery.ai Integration

- Get an API key from Smithery.ai
- Add your API key to the `.env` file
- In the web interface, you'll see additional fields for Smithery-specific configuration:
  - Smithery Agent ID: The ID of the specific agent you want to use (e.g., "@turkyden/weather")
  - Smithery Parameters: JSON object with additional parameters for the agent
The system integrates with Smithery.ai agents using the WebSocket MCP protocol, allowing for real-time interaction with specialized agents.
A complete example of integrating with the Smithery.ai weather agent is included:

```shell
# Run the example script to test weather information
cd services/workflow_orchestrator
./weather_example.py "San Francisco"
```

```shell
# Run the full workflow example
cd examples
./run_weather_workflow.py "New York"
```
See the `services/workflow_orchestrator/README.md` file for detailed information on the Smithery.ai integration.
## Customization

You can define and save workflow templates by modifying the client interface, or by writing API scripts that call the workflow orchestrator endpoint directly.
Modify the `tools` parameter in MCP requests to add specific capabilities to your agents.
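As a sketch, a `tools` entry might look like the following. The `name`/`description`/`inputSchema` field names follow a common JSON-schema tool-definition pattern and are assumptions here; verify them against the MCP server implementation before relying on them:

```python
# Hypothetical tool definition -- field names are assumed, not taken
# from this repository's server code.
weather_tool = {
    "name": "get_weather",
    "description": "Look up the current weather for a city",
    "inputSchema": {
        "type": "object",
        "properties": {"city": {"type": "string"}},
        "required": ["city"],
    },
}

# Attach the tool to a chat request body sent to /v1/chat
request_body = {
    "model": "llama3:latest",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": [weather_tool],
}
```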