# Azure ML MCP Server

A Model Context Protocol (MCP) server that provides tools for interacting with Azure Machine Learning resources. This server can be deployed to Azure with managed identity authentication.
## Features

- List Azure ML models, datasets, and compute targets
- Managed identity authentication for Azure deployments
- Fallback to Azure CLI and service principal authentication for local development
- SSE (Server-Sent Events) transport for Azure AI Agents integration
- Health check endpoint for monitoring
## Prerequisites

- Python 3.11+
- Azure CLI (for local authentication)
- Docker (optional, for containerized testing)
## Local Development

- Clone the repository and install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Copy the example environment file and configure it:

  ```bash
  cp .env.example .env
  # Edit .env with your Azure ML workspace details
  ```

- Authenticate with Azure CLI:

  ```bash
  az login
  ```

- Run the MCP server:

  ```bash
  python mcp_server.py
  ```

The server will be available at `http://localhost:8080/sse`.
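The exact contents of `.env.example` are not reproduced here; as a sketch, the workspace details it asks for likely look something like the following (the variable names are assumptions — check `.env.example` for the real ones):

```
# Illustrative .env contents -- variable names are assumptions, see .env.example
AZURE_SUBSCRIPTION_ID=your-sub-id
AZURE_RESOURCE_GROUP=your-ml-rg
AZURE_ML_WORKSPACE_NAME=your-workspace
```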
## Docker

Build and run with Docker Compose:

```bash
docker-compose up --build
```
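The repository's actual `docker-compose.yml` is not shown here; a minimal sketch of what such a file could look like, assuming the service builds from a local Dockerfile, loads `.env`, and exposes port 8080 (the port the server listens on):

```yaml
# Illustrative docker-compose.yml sketch -- not the repository's actual file
services:
  azureml-mcp-server:
    build: .
    env_file: .env
    ports:
      - "8080:8080"
```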
## Azure Deployment

- Build and push your container image to Azure Container Registry:

  ```bash
  az acr build --registry your-acr-name --image azureml-mcp-server:latest .
  ```

- Deploy to Azure Container Instances (ACI) with managed identity:

  ```powershell
  .\deploy-aci.ps1 -ResourceGroupName "your-rg" -ContainerGroupName "azureml-mcp" -SubscriptionId "your-sub-id" -AzureMLResourceGroup "your-ml-rg" -AzureMLWorkspaceName "your-workspace"
  ```

- Or deploy to App Service:

  ```powershell
  .\deploy-appservice.ps1 -ResourceGroupName "your-rg" -AppServiceName "azureml-mcp-app" -SubscriptionId "your-sub-id" -AzureMLResourceGroup "your-ml-rg" -AzureMLWorkspaceName "your-workspace"
  ```

- Configure continuous deployment from your Git repository.
## Usage with Azure AI Agents

Once deployed, you can use the MCP server with Azure AI Agents:

```python
agent = project_client.create_agent(
    model="gpt-4",
    name="azureml-agent",
    instructions="You are an Azure ML assistant...",
    tools=[{
        "type": "mcp",
        "server_label": "azureml_mcp",
        "server_url": "https://your-server-url/sse",
        "require_approval": "never",
    }],
)
```
## Managed Identity Setup

For the MCP server to access Azure ML resources using managed identity:

- Assign the managed identity to your container or App Service during deployment
- Grant permissions to the managed identity:

  ```bash
  # Get the managed identity principal ID
  PRINCIPAL_ID=$(az container show --name azureml-mcp --resource-group your-rg --query identity.principalId -o tsv)

  # Assign the Contributor role scoped to the Azure ML workspace
  az role assignment create \
    --assignee $PRINCIPAL_ID \
    --role "Contributor" \
    --scope "/subscriptions/YOUR_SUBSCRIPTION_ID/resourceGroups/YOUR_ML_RG/providers/Microsoft.MachineLearningServices/workspaces/YOUR_WORKSPACE"
  ```
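The role-assignment scope above follows the fixed ARM resource-ID pattern for an Azure ML workspace. As an illustration, it can be built programmatically (the helper function is ours, not part of the server):

```python
def workspace_scope(subscription_id: str, resource_group: str, workspace_name: str) -> str:
    """Build the ARM resource ID used as the role-assignment scope
    for an Azure ML workspace (same pattern as the az CLI command above)."""
    return (
        f"/subscriptions/{subscription_id}"
        f"/resourceGroups/{resource_group}"
        f"/providers/Microsoft.MachineLearningServices/workspaces/{workspace_name}"
    )

print(workspace_scope("YOUR_SUBSCRIPTION_ID", "YOUR_ML_RG", "YOUR_WORKSPACE"))
```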
## Available Tools

The MCP server provides the following tools:

- `list_azureml_models`: List all models in an Azure ML workspace
- `list_azureml_datasets`: List all datasets in an Azure ML workspace
- `list_azureml_computes`: List all compute targets in an Azure ML workspace

Each tool requires:

- `subscription_id`: Azure subscription ID
- `resource_group`: Resource group containing the ML workspace
- `workspace_name`: Name of the ML workspace
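Since all three tools share the same required parameters, a caller-side check can be sketched as follows (the helper is illustrative, not part of the server's API):

```python
# The three arguments every tool in this server requires.
REQUIRED_PARAMS = ("subscription_id", "resource_group", "workspace_name")

def missing_tool_args(args: dict) -> list:
    """Return the names of required arguments that are absent or empty."""
    return [name for name in REQUIRED_PARAMS if not args.get(name)]

# Example: an empty value and a missing key are both reported.
print(missing_tool_args({"subscription_id": "sub-123", "resource_group": ""}))
# → ['resource_group', 'workspace_name']
```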
## Monitoring

- Health check: `GET /health` returns the server health status
- Logs: check container logs for authentication and operation details
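For reference, the shape of such a health endpoint can be sketched with the standard library; the JSON payload below is an assumption for illustration, not the server's actual response:

```python
# Minimal sketch of a /health endpoint like the one the server exposes.
# The response body here is an assumption, not the server's actual payload.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "healthy"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep request logging quiet

def start_health_server() -> HTTPServer:
    """Bind to an ephemeral port and serve /health in a background thread."""
    server = HTTPServer(("127.0.0.1", 0), HealthHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A monitoring probe then only needs to issue `GET http://host:8080/health` and treat any non-200 response as unhealthy.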
## Troubleshooting

Authentication:

- Local development: ensure `az login` has been completed
- Azure deployment: verify that the managed identity is assigned and has the required permissions
- Service principal: set the `AZURE_CLIENT_ID`, `AZURE_CLIENT_SECRET`, and `AZURE_TENANT_ID` environment variables

Connectivity:

- Network: ensure the server is reachable from the Azure AI Agents service
- Firewall: check that port 8080 is open
- SSL: use HTTPS endpoints for production deployments
Common errors:

- `"No valid Azure credentials found"`: authentication setup is required
- `ResourceNotFoundError`: check the workspace name and permissions
- SSE connection error: check network connectivity
## Security

- Use managed identity in production
- Never commit service principal credentials to source control
- Run containers as a non-root user (already configured)
- Use HTTPS in production deployments
- Rotate any service principal credentials regularly