
Introduction to Model Context Protocol

Tools: MCP, LLM Agents
Topics: Model Context Protocol, LLM tools

In this workshop, we'll provide an overview of how to work with the Model Context Protocol (MCP), a model-agnostic protocol that has been rapidly adopted by many LLM-powered applications. MCP was developed by Anthropic to establish a unified framework for providing LLMs with tools, resources, and templates in a lightweight package. Branded as the USB-C for LLMs, the protocol allows developers to build plugins for AI software that can be connected to any client, regardless of the implementation or model used on the client side. We'll develop a sales assistant that can retrieve, analyze, and visualize a company's sales data to answer queries.


Instructions

Throughout this tutorial, we will develop an MCP server and client that can turn any model into a business assistant. We will equip the server with tools, resources, and templates that allow the client to extract information from the company's sales database (a sales.json file), draw conclusions from it, and render plots. Along the way, we will see that the server is compatible with any client, regardless of the client's model or implementation. Test each task with both LLM providers (OpenAI and Gemini).
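
The contents of sales.json aren't reproduced here, but for orientation, a record in a sales database of this kind might look like the following (the field names are illustrative assumptions, not the repository's actual schema):

    [
        {
            "date": "2024-05-14",
            "product": "Widget A",
            "units_sold": 120,
            "revenue": 2400.00
        }
    ]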


Task 1: Local Deployment with Stdio Transport

Time: 30min

The goal of this task is to configure the MCP client to connect to a locally running server container using the stdio transport protocol. This simulates a lightweight, command-line interaction with the server.

You have a Docker image named sales_server that runs our MCP server. Your task is to update the client's mcp.json configuration file and complete the connect_to_server method in client.py to correctly initialize the stdio connection.

  1. Open the mcp.json file and replace its contents with the following configuration. This tells the client to use a stdio connection and run the server by executing a Docker command.

    {
        "llm_provider": "google-genai",
        "llm_model": "gemini-2.0-flash-001",
        "mcpServers": {
            "dockerized_server": {
                "type": "stdio",
                "command": "docker",
                "args": [
                    "run",
                    "-i",
                    "--rm",
                    "--init",
                    "-e", "DOCKER_CONTAINER=true",
                    "-v", "/Users/danielgomez/mcp/workspace:/root/mcp/workspace",
                    "sales_server"
                ]
            }
        }
    }
  2. Navigate to the client.py file and complete the code within the ## homework:start and ## homework:end blocks inside the connect_to_server method for the "stdio" transport. You'll need to create StdioServerParameters, initialize the stdio_client, and then create a new ClientSession (a sketch of this pattern follows this list).

  3. After completing the code, run the client to test the connection. You should be able to interact with the server and its tools.
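
As a reference for step 2, here is a minimal sketch of the stdio connection pattern from the MCP Python SDK. The self.exit_stack and self.session attributes are assumptions about how client.py is structured; adapt the names to the actual method:

    from mcp import ClientSession, StdioServerParameters
    from mcp.client.stdio import stdio_client

    # server_config holds the "command" and "args" values parsed from mcp.json.
    server_params = StdioServerParameters(
        command=server_config["command"],
        args=server_config["args"],
    )
    # stdio_client spawns the server process and yields a (read, write) stream pair.
    read, write = await self.exit_stack.enter_async_context(
        stdio_client(server_params)
    )
    # ClientSession wraps the streams and speaks the MCP protocol over them.
    self.session = await self.exit_stack.enter_async_context(
        ClientSession(read, write)
    )
    await self.session.initialize()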


Task 2: Containerized Server Deployment with Local Client

Time: 30min

Now, let's deploy the server as a long-running HTTP service in a Docker container. The client, running on your local machine, will connect to this container over a network port. This is a more typical production scenario.

  1. First, run the sales_server container in the background, mapping its internal port 8050 to the host's port 8050.

    docker run -p 8050:8050 --rm -d sales_server
  2. Next, modify the mcp.json file to switch the server type to "http". You will need to define the server's url (e.g., http://localhost:8050) within the configuration; refer to the HTTPServerConfig schema for the correct structure. An example configuration is shown after this list.

  3. Finally, navigate back to the client.py file and complete the code within the ## homework:start and ## homework:end blocks for the "http" transport. You will use streamablehttp_client to create the client session, as in the sketch below.
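
For step 2, a configuration along these lines should work; it mirrors the Task 1 file, and the exact accepted fields are defined by the HTTPServerConfig schema in schemas/models.py:

    {
        "llm_provider": "google-genai",
        "llm_model": "gemini-2.0-flash-001",
        "mcpServers": {
            "dockerized_http_server": {
                "type": "http",
                "url": "http://localhost:8050"
            }
        }
    }

For step 3, here is a minimal sketch of the streamable-HTTP connection, again assuming the same self.exit_stack and self.session structure as in Task 1:

    from mcp import ClientSession
    from mcp.client.streamable_http import streamablehttp_client

    # streamablehttp_client yields (read, write, get_session_id) for the given URL.
    read, write, _ = await self.exit_stack.enter_async_context(
        streamablehttp_client(server_config["url"])
    )
    self.session = await self.exit_stack.enter_async_context(
        ClientSession(read, write)
    )
    await self.session.initialize()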


Task 3: Docker-Compose Deployment

Time: 1h

In this final task, you will deploy both the client and server as containerized services using Docker Compose. This represents a fully containerized, microservice-based architecture.

  1. Ensure you have a docker-compose.yml file that defines services for both the client and the server (a minimal sketch appears after these steps).

  2. Update your mcp.json file to configure the client to connect to the server using the service name defined in docker-compose.yml. For example, if your server service is named server, the URL should be http://server:8050.

    {
        "llm_provider": "google-genai",
        "llm_model": "gemini-2.0-flash-001",
        "mcpServers": {
            "dockerized_http_server": {
                "type": "http",
                "url": "http://server:8050"
            }
        }
    }
  3. Launch the services using the following command:

    docker-compose up -d
  4. Since the client is now running inside a container, you must enter its container to interact with the LLM. Use the following command to execute a bash shell inside the client container:

    docker-compose exec -it client bash
  5. From within the client container's shell, run the client application to start the interactive session:

    python client.py
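
As a reference for step 1, a minimal docker-compose.yml might look like the sketch below. The server service name matches the URL above, but the image/build details and the stdin/tty settings are assumptions; adapt them to the repository's actual Dockerfiles:

    services:
      server:
        image: sales_server
        ports:
          - "8050:8050"
      client:
        build: ./client        # assumed build context for the client image
        depends_on:
          - server
        stdin_open: true       # keep stdin open for the interactive session
        tty: true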

Optional tasks

  • Experiment with the LLM. Ask it questions that require using the tools, such as "What was the total revenue in May 2024?" or "Can you show me a plot of the revenue for the first quarter of 2024?".
  • Explore the schemas/models.py file to understand how Pydantic models are used to validate the configuration files for both the client and server; a hypothetical sketch follows.
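
As a rough idea of what that validation looks like, the HTTP server entry might be modeled along these lines; the real class lives in schemas/models.py and its fields may differ:

    from typing import Literal
    from pydantic import BaseModel, HttpUrl

    # Hypothetical shape of an "http" server entry in mcp.json;
    # check schemas/models.py for the actual definition.
    class HTTPServerConfig(BaseModel):
        type: Literal["http"]
        url: HttpUrl

Loading the file through a model like this makes Pydantic raise a ValidationError as soon as a field is missing or malformed, so configuration mistakes surface before any connection is attempted.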

Future work

  • Create a more robust client that supports multiple concurrent server connections.
  • Implement a tool that allows the LLM to write new data to the sales database.
  • Explore the use of other LLMs like OpenAI's GPT models by changing the llm_provider in the mcp.json configuration.

About

Learner's template repository based on eb-llms-mcps
