
MCP Tools Usage From LangChain / Example in Python

This simple Model Context Protocol (MCP) client demonstrates how MCP server tools can be used by a LangChain ReAct agent.

It leverages the utility function convert_mcp_to_langchain_tools() from langchain_mcp_tools.
This function initializes the specified MCP servers in parallel and converts their available tools into a list of LangChain-compatible tools (list[BaseTool]).
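
Below is a minimal sketch of how the conversion is typically invoked; the server names and command lines are illustrative, and the exact configuration keys accepted by langchain_mcp_tools should be checked against its documentation:

    import asyncio

    from langchain_mcp_tools import convert_mcp_to_langchain_tools

    async def main() -> None:
        # Illustrative MCP server definitions; replace with the servers you actually use.
        mcp_servers = {
            "filesystem": {
                "command": "npx",
                "args": ["-y", "@modelcontextprotocol/server-filesystem", "."],
            },
            "fetch": {
                "command": "uvx",
                "args": ["mcp-server-fetch"],
            },
        }

        # Initializes the servers in parallel and returns the LangChain-compatible
        # tools together with a cleanup callback that shuts the servers down.
        tools, cleanup = await convert_mcp_to_langchain_tools(mcp_servers)
        try:
            print([tool.name for tool in tools])
        finally:
            await cleanup()

    asyncio.run(main())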

Google GenAI's gemini-2.0-flash is used as the LLM. For convenience, code for OpenAI's and Anthropic's LLMs is also included, commented out.
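
The sketch below shows one way the converted tools can be wired into a ReAct agent using LangChain's prebuilt helper; the chat-model classes are the standard LangChain integrations, but the commented-out model names are placeholders rather than the ones used in this repository:

    from langchain_google_genai import ChatGoogleGenerativeAI
    # from langchain_openai import ChatOpenAI
    # from langchain_anthropic import ChatAnthropic
    from langgraph.prebuilt import create_react_agent

    llm = ChatGoogleGenerativeAI(model="gemini-2.0-flash")
    # llm = ChatOpenAI(model="gpt-4o-mini")                  # placeholder model name
    # llm = ChatAnthropic(model="claude-3-5-haiku-latest")   # placeholder model name

    # `tools` is the list[BaseTool] returned by convert_mcp_to_langchain_tools(),
    # so this belongs inside the same async context as the previous sketch.
    agent = create_react_agent(llm, tools)
    result = await agent.ainvoke(
        {"messages": [("user", "Read the local README file and summarize it.")]}
    )
    print(result["messages"][-1].content)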

A bit more realistic (conversational) MCP client is available here

A TypeScript equivalent of this MCP client is available here

Prerequisites

Usage

  1. Install dependencies:

    make install
  2. Set up API key:

    cp .env.template .env
    • Update .env as needed (see the illustrative sketch after these steps).
    • .gitignore is configured to ignore .env to prevent accidental commits of credentials.
  3. Run the app:

    make start

    It takes a while on the first run.
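
The variable names below are an assumption based on the environment variables conventionally read by the corresponding LangChain integrations; check .env.template for the names this repository actually expects:

    # Illustrative .env contents -- only the key for the LLM you enable is needed.
    GOOGLE_API_KEY=your-google-genai-api-key
    # OPENAI_API_KEY=your-openai-api-key
    # ANTHROPIC_API_KEY=your-anthropic-api-key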

Simple Example Code for Streamable HTTP Authentication

A simple example showing how to implement an OAuth client provider and use it with the langchain-mcp-tools library can be found in src/streamable_http_oauth_test_client.py.

For testing purposes, a sample MCP server with OAuth authentication support that works with the above client is provided in src/streamable_http_oauth_test_server.py.

You can run the server with make run-streamable-http-oauth-test-server and the client with make run-streamable-http-oauth-test-client.
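
As a rough sketch of the client side: the MCP Python SDK's OAuthClientProvider can be constructed with client metadata, a token-storage implementation, and redirect/callback handlers, then handed to langchain_mcp_tools via the server config. The "auth" config key and the local URLs below are assumptions of this sketch; src/streamable_http_oauth_test_client.py is the authoritative example.

    import asyncio

    from langchain_mcp_tools import convert_mcp_to_langchain_tools
    from mcp.client.auth import OAuthClientProvider, TokenStorage
    from mcp.shared.auth import OAuthClientInformationFull, OAuthClientMetadata, OAuthToken

    class InMemoryTokenStorage(TokenStorage):
        """Keeps OAuth tokens in memory; a real client would persist them."""

        def __init__(self) -> None:
            self._tokens: OAuthToken | None = None
            self._client_info: OAuthClientInformationFull | None = None

        async def get_tokens(self) -> OAuthToken | None:
            return self._tokens

        async def set_tokens(self, tokens: OAuthToken) -> None:
            self._tokens = tokens

        async def get_client_info(self) -> OAuthClientInformationFull | None:
            return self._client_info

        async def set_client_info(self, info: OAuthClientInformationFull) -> None:
            self._client_info = info

    async def handle_redirect(authorization_url: str) -> None:
        # In a real app this would open a browser window.
        print(f"Open this URL to authorize: {authorization_url}")

    async def handle_callback() -> tuple[str, str | None]:
        # Returns the authorization code (and optional state) after the user logs in.
        return input("Paste the authorization code: "), None

    oauth_provider = OAuthClientProvider(
        server_url="http://localhost:8000",              # assumed test-server URL
        client_metadata=OAuthClientMetadata(
            client_name="example-oauth-client",
            redirect_uris=["http://localhost:3030/callback"],
            grant_types=["authorization_code", "refresh_token"],
            response_types=["code"],
        ),
        storage=InMemoryTokenStorage(),
        redirect_handler=handle_redirect,
        callback_handler=handle_callback,
    )

    async def main() -> None:
        mcp_servers = {
            "secure-server": {
                "url": "http://localhost:8000/mcp",      # assumed endpoint path
                "auth": oauth_provider,                  # assumed config key
            }
        }
        tools, cleanup = await convert_mcp_to_langchain_tools(mcp_servers)
        try:
            print([tool.name for tool in tools])
        finally:
            await cleanup()

    asyncio.run(main())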
