πŸš€ mcp-use is a TypeScript library that makes it easy to connect LangChain.js-compatible LLMs with MCP servers. Build powerful, flexible AI agents with dynamic tool access and multi-server support.

mcp-use/mcp-use-ts

Unified MCP Client Library

🌐 MCP Client is the open-source way to connect any LLM to any MCP server in TypeScript/Node.js, letting you build custom agents with tool access without closed-source dependencies.

πŸ’‘ Let developers easily connect any LLM via LangChain.js to tools like web browsing, file operations, 3D modeling, and more.


✨ Key Features

  • πŸ”„ Ease of use: Create an MCP-capable agent in just a few lines of TypeScript.
  • πŸ€– LLM Flexibility: Works with any LangChain.js-supported LLM that supports tool calling.
  • 🌐 HTTP Support: Connect directly to MCP servers over SSE/HTTP.
  • βš™οΈ Dynamic Server Selection: Agents select the right MCP server from a pool on the fly.
  • 🧩 Multi-Server Support: Use multiple MCP servers in a single agent.
  • πŸ›‘οΈ Tool Restrictions: Restrict potentially dangerous tools such as filesystem or network access.
  • πŸ”§ Custom Agents: Build your own agents with the LangChain.js adapter, or implement new adapters.

πŸš€ Quick Start

Requirements

  • Node.js 22.0.0 or higher
  • npm, yarn, or pnpm (the examples below use npm)

Installation

# Install from npm
npm install mcp-use
# Install LangChain.js and your LLM provider (e.g., OpenAI)
npm install langchain @langchain/openai dotenv

Create a .env file with your API key:

OPENAI_API_KEY=your_api_key

Basic Usage

import { ChatOpenAI } from '@langchain/openai'
import { MCPAgent, MCPClient } from 'mcp-use'
import 'dotenv/config'

async function main() {
  // 1. Configure MCP servers
  const config = {
    mcpServers: {
      playwright: { command: 'npx', args: ['@playwright/mcp@latest'] }
    }
  }
  const client = MCPClient.fromDict(config)

  // 2. Create LLM
  const llm = new ChatOpenAI({ model: 'gpt-4o' })

  // 3. Instantiate agent
  const agent = new MCPAgent({ llm, client, maxSteps: 20 })

  // 4. Run query
  const result = await agent.run('Find the best restaurant in Tokyo using Google Search')
  console.log('Result:', result)
}

main().catch(console.error)

πŸ”§ API Methods

MCPAgent Methods

The MCPAgent class provides several methods for executing queries with different output formats:

run(query: string, maxSteps?: number): Promise<string>

Executes a query and returns the final result as a string.

const result = await agent.run('What tools are available?')
console.log(result)

stream(query: string, maxSteps?: number): AsyncGenerator<AgentStep, string, void>

Yields intermediate steps during execution, providing visibility into the agent's reasoning process.

const stream = agent.stream('Search for restaurants in Tokyo')
for await (const step of stream) {
  console.log(`Tool: ${step.action.tool}, Input: ${JSON.stringify(step.action.toolInput)}`)
  console.log(`Result: ${step.observation}`)
}

streamEvents(query: string, maxSteps?: number): AsyncGenerator<StreamEvent, void, void>

Yields fine-grained LangChain StreamEvent objects, enabling token-by-token streaming and detailed event tracking.

const eventStream = agent.streamEvents('What is the weather today?')
for await (const event of eventStream) {
  // Handle different event types
  switch (event.event) {
    case 'on_chat_model_stream':
      // Token-by-token streaming from the LLM
      if (event.data?.chunk?.content) {
        process.stdout.write(event.data.chunk.content)
      }
      break
    case 'on_tool_start':
      console.log(`\nTool started: ${event.name}`)
      break
    case 'on_tool_end':
      console.log(`Tool completed: ${event.name}`)
      break
  }
}

Key Differences

  • run(): Best for simple queries where you only need the final result
  • stream(): Best for debugging and understanding the agent's tool usage
  • streamEvents(): Best for real-time UI updates with token-level streaming

πŸ”„ AI SDK Integration

The library provides built-in utilities for integrating with the Vercel AI SDK, making it easy to build streaming UIs with React hooks like useCompletion and useChat.

Installation

npm install ai @langchain/anthropic

Basic Usage

import { ChatAnthropic } from '@langchain/anthropic'
import { LangChainAdapter } from 'ai'
import { createReadableStreamFromGenerator, MCPAgent, MCPClient, streamEventsToAISDK } from 'mcp-use'

async function createApiHandler() {
  const config = {
    mcpServers: {
      everything: { command: 'npx', args: ['-y', '@modelcontextprotocol/server-everything'] }
    }
  }

  const client = new MCPClient(config)
  const llm = new ChatAnthropic({ model: 'claude-sonnet-4-20250514' })
  const agent = new MCPAgent({ llm, client, maxSteps: 5 })

  return async (request: { prompt: string }) => {
    const streamEvents = agent.streamEvents(request.prompt)
    const aiSDKStream = streamEventsToAISDK(streamEvents)
    const readableStream = createReadableStreamFromGenerator(aiSDKStream)

    return LangChainAdapter.toDataStreamResponse(readableStream)
  }
}

Enhanced Usage with Tool Visibility

import { streamEventsToAISDKWithTools } from 'mcp-use'

async function createEnhancedApiHandler() {
  const config = {
    mcpServers: {
      everything: { command: 'npx', args: ['-y', '@modelcontextprotocol/server-everything'] }
    }
  }

  const client = new MCPClient(config)
  const llm = new ChatAnthropic({ model: 'claude-sonnet-4-20250514' })
  const agent = new MCPAgent({ llm, client, maxSteps: 8 })

  return async (request: { prompt: string }) => {
    const streamEvents = agent.streamEvents(request.prompt)
    // Enhanced stream includes tool usage notifications
    const enhancedStream = streamEventsToAISDKWithTools(streamEvents)
    const readableStream = createReadableStreamFromGenerator(enhancedStream)

    return LangChainAdapter.toDataStreamResponse(readableStream)
  }
}

Next.js API Route Example

// pages/api/chat.ts or app/api/chat/route.ts
import { ChatAnthropic } from '@langchain/anthropic'
import { LangChainAdapter } from 'ai'
import { createReadableStreamFromGenerator, MCPAgent, MCPClient, streamEventsToAISDK } from 'mcp-use'

export async function POST(req: Request) {
  const { prompt } = await req.json()

  const config = {
    mcpServers: {
      everything: { command: 'npx', args: ['-y', '@modelcontextprotocol/server-everything'] }
    }
  }

  const client = new MCPClient(config)
  const llm = new ChatAnthropic({ model: 'claude-sonnet-4-20250514' })
  const agent = new MCPAgent({ llm, client, maxSteps: 10 })

  try {
    const streamEvents = agent.streamEvents(prompt)
    const aiSDKStream = streamEventsToAISDK(streamEvents)
    const readableStream = createReadableStreamFromGenerator(aiSDKStream)

    return LangChainAdapter.toDataStreamResponse(readableStream)
  }
  finally {
    await client.closeAllSessions()
  }
}

Frontend Integration

// components/Chat.tsx
import { useCompletion } from 'ai/react'

export function Chat() {
  const { completion, input, handleInputChange, handleSubmit } = useCompletion({
    api: '/api/chat',
  })

  return (
    <div>
      <div>{completion}</div>
      <form onSubmit={handleSubmit}>
        <input
          value={input}
          onChange={handleInputChange}
          placeholder="Ask me anything..."
        />
      </form>
    </div>
  )
}

Available AI SDK Utilities

  • streamEventsToAISDK(): Converts streamEvents to basic text stream
  • streamEventsToAISDKWithTools(): Enhanced stream with tool usage notifications
  • createReadableStreamFromGenerator(): Converts async generator to ReadableStream

πŸ“‚ Configuration File

You can store servers in a JSON file:

{
  "mcpServers": {
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}

Load it:

import { MCPClient } from 'mcp-use'

const client = MCPClient.fromConfigFile('./mcp-config.json')
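
The feature list above also mentions direct SSE/HTTP connections to remote MCP servers. A minimal sketch of such a config is below; the `url` field name, the server name, and the endpoint are assumptions based on common MCP client config conventions, so verify the exact shape against the library's documentation for your version:

```typescript
// Hypothetical config for a remote MCP server reached over SSE/HTTP.
// Instead of spawning a local process via 'command'/'args', the entry
// points at a URL. The endpoint below is a placeholder for your server.
const httpConfig = {
  mcpServers: {
    remote: { url: 'http://localhost:3001/sse' }
  }
}
```

A config like this can then be passed to MCPClient the same way as a local-command config.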

πŸ“š Examples

We provide a comprehensive set of examples demonstrating various use cases. All examples are located in the examples/ directory with a dedicated README.

Running Examples

# Install dependencies
npm install

# Run any example
npm run example:airbnb      # Search accommodations with Airbnb
npm run example:browser     # Browser automation with Playwright
npm run example:chat        # Interactive chat with memory
npm run example:stream      # Demonstrate streaming methods (stream & streamEvents)
npm run example:stream_events # Comprehensive streamEvents() examples
npm run example:ai_sdk      # AI SDK integration with streaming
npm run example:filesystem  # File system operations
npm run example:http        # HTTP server connection
npm run example:everything  # Test MCP functionalities
npm run example:multi       # Multiple servers in one session

Example Highlights

  • Browser Automation: Control browsers to navigate websites and extract information
  • File Operations: Read, write, and manipulate files through MCP
  • Multi-Server: Combine multiple MCP servers (Airbnb + Browser) in a single task
  • Sandboxed Execution: Run MCP servers in isolated E2B containers
  • OAuth Flows: Authenticate with services like Linear using OAuth2
  • Streaming Methods: Demonstrate both step-by-step and token-level streaming
  • AI SDK Integration: Build streaming UIs with Vercel AI SDK and React hooks

See the examples README for detailed documentation and prerequisites.


πŸ”„ Multi-Server Example

const config = {
  mcpServers: {
    airbnb: { command: 'npx', args: ['@openbnb/mcp-server-airbnb'] },
    playwright: { command: 'npx', args: ['@playwright/mcp@latest'] }
  }
}
const client = MCPClient.fromDict(config)
const agent = new MCPAgent({ llm, client, useServerManager: true })
await agent.run('Search Airbnb in Barcelona, then Google restaurants nearby')

πŸ”’ Tool Access Control

const agent = new MCPAgent({
  llm,
  client,
  disallowedTools: ['file_system', 'network']
})

πŸ‘₯ Contributors

  • Pietro Zullo
  • Zane
  • Luigi Pederzani

πŸ“œ License

MIT Β© Zane
