Our goal is to create the best possible chatbot UX, focusing on the joy and intuitiveness users feel when calling and interacting with AI tools.
See the experience in action in the preview below!
Built with Vercel AI SDK and Next.js, this app adopts modern patterns for building AI chat interfaces. It leverages the power of the Model Context Protocol (MCP) to seamlessly integrate external tools into your chat experience.
- MCP Client Chatbot
Get a feel for the UX: here's a quick look at what's possible.
Example: Control a web browser using Microsoft's playwright-mcp tool.
- The LLM autonomously decides how to use tools from the MCP server, calling them multiple times to complete a multi-step task and return a final message.
Sample prompt:
Please go to GitHub and visit the cgoinglove/mcp-client-chatbot project.
Then, click on the README.md file.
After that, close the browser.
Finally, tell me how to install the package.
demo-video.mov
This demo showcases a realtime voice-based chatbot assistant built with OpenAI's new Realtime API, now extended with full MCP tool integration. Talk to the assistant naturally, and watch it execute tools in real time.
Quickly call any registered MCP tool during chat by typing `@toolname`. No need to memorize: just type `@` and select from the list!
You can also create tool presets by selecting only the MCP servers or tools you want. Switch between presets instantly with a click, perfect for organizing tools by task or workflow.

Control how tools are used in each chat with Tool Choice Mode; switch anytime with `⌘P`.
- Auto: The model automatically calls tools when needed.
- Manual: The model will ask for your permission before calling a tool.
- None: Tool usage is disabled completely.
This lets you flexibly choose between autonomous, guided, or tool-free interaction depending on the situation.
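As a rough illustration of how these modes could relate to the Vercel AI SDK's `toolChoice` setting (a sketch under stated assumptions: `toSdkToolChoice` is a hypothetical helper, not this project's actual code):

```typescript
// Illustrative sketch only: mapping the app's Tool Choice Mode onto the
// Vercel AI SDK's "auto"/"none" tool-choice values. At the SDK level both
// Auto and Manual allow the model to emit tool calls; in Manual mode the
// app would intercept each call and ask the user before executing it,
// while None disables tool usage entirely.
type ToolChoiceMode = "auto" | "manual" | "none";

function toSdkToolChoice(mode: ToolChoiceMode): "auto" | "none" {
  return mode === "none" ? "none" : "auto";
}

// Manual still exposes tools to the model; the permission gate lives in
// the application layer, not in the SDK call.
toSdkToolChoice("manual"); // "auto"
toSdkToolChoice("none");   // "none"
```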
…and there's even more waiting for you. Try it out and see what else it can do!
This project uses pnpm as the recommended package manager.
```shell
# If you don't have pnpm:
npm install -g pnpm

# 1. Install dependencies
pnpm i

# 2. Enter only the LLM provider API key(s) you want to use in the .env file
#    at the project root. Example: the app works with just OPENAI_API_KEY set.
#    (The .env file is created automatically when you run pnpm i.)

# 3. Build and start all services (including PostgreSQL) with Docker Compose
pnpm docker-compose:up
```
```shell
# 1. Install dependencies
pnpm i

# 2. Create the environment variable file and fill in your .env values
pnpm initial:env # This runs automatically in postinstall, so you can usually skip it.

# 3. (Optional) Skip this step if you already have PostgreSQL running and .env is configured
pnpm docker:pg

# 4. Run database migrations
pnpm db:migrate

# 5. Start the development server
pnpm dev

# 6. (Optional) Build & start for local production-like testing
#    (use build:local rather than build to ensure correct cookie settings)
pnpm build:local && pnpm start
```
Open http://localhost:3000 in your browser to get started.
The `pnpm i` command generates a `.env` file. Add your API keys there.
```dotenv
# === LLM Provider API Keys ===
# You only need to enter the keys for the providers you plan to use
GOOGLE_GENERATIVE_AI_API_KEY=****
OPENAI_API_KEY=****
XAI_API_KEY=****
ANTHROPIC_API_KEY=****
OPENROUTER_API_KEY=****
OLLAMA_BASE_URL=http://localhost:11434/api

# Secret for Better Auth (generate with: npx @better-auth/cli@latest secret)
BETTER_AUTH_SECRET=****

# (Optional)
# URL for Better Auth (the URL you access the app from)
BETTER_AUTH_URL=

# === Database ===
# If you don't have PostgreSQL running locally, start it with: pnpm docker:pg
POSTGRES_URL=postgres://your_username:your_password@localhost:5432/your_database_name

# Whether to use file-based MCP config (default: false)
FILE_BASED_MCP_CONFIG=false

# (Optional)
# === OAuth Settings ===
# Fill in these values only if you want to enable Google/GitHub login
GOOGLE_CLIENT_ID=
GOOGLE_CLIENT_SECRET=
GITHUB_CLIENT_ID=
GITHUB_CLIENT_SECRET=
```
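Since the app needs at least one provider key filled in, a startup sanity check along these lines could catch a missing configuration early (a minimal sketch; `hasAnyProviderKey` is a hypothetical helper, not part of this repository):

```typescript
// Hypothetical startup check: verify that at least one LLM provider key
// from .env is actually filled in; placeholder values like "****" and
// empty strings don't count as configured.
const PROVIDER_KEYS = [
  "OPENAI_API_KEY",
  "GOOGLE_GENERATIVE_AI_API_KEY",
  "XAI_API_KEY",
  "ANTHROPIC_API_KEY",
  "OPENROUTER_API_KEY",
];

function hasAnyProviderKey(env: Record<string, string | undefined>): boolean {
  return PROVIDER_KEYS.some((k) => {
    const v = env[k];
    return typeof v === "string" && v.length > 0 && v !== "****";
  });
}

// Usage: a real key passes, a placeholder does not.
hasAnyProviderKey({ OPENAI_API_KEY: "sk-example" }); // true
hasAnyProviderKey({ OPENAI_API_KEY: "****" });       // false
```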
Step-by-step setup guides for running and configuring MCP Client Chatbot.
- How to add and configure MCP servers in your environment
- How to self-host the chatbot using Docker, including environment configuration.
- Deploy the chatbot to Vercel with simple setup steps for production use.
- Personalize your chatbot experience with custom system prompts, user preferences, and MCP tool instructions
- Configure Google and GitHub OAuth for secure user login support.
- How to add OpenAI-compatible AI providers
Advanced use cases and extra capabilities that enhance your chatbot experience.
- Use MCP servers and structured project instructions to build a custom assistant that helps with specific tasks.
- Open lightweight popup chats for quick side questions or testing, separate from your main thread.
Planned features coming soon to MCP Client Chatbot:
- MCP-integrated LLM Workflow
- File Attach & Image Generation
- Collaborative Document Editing (like OpenAI Canvas: user & assistant co-editing)
- RAG (Retrieval-Augmented Generation)
- Web-based Compute (with WebContainers integration)
💡 If you have suggestions or need specific features, please create an issue!
We welcome all contributions! Bug reports, feature ideas, code improvements: everything helps us build the best local AI assistant.
For detailed contribution guidelines, please see our Contributing Guide.
Language Translations: Help us make the chatbot accessible to more users by adding new language translations. See language.md for instructions on how to contribute translations.
Let's build it together! 🚀
Connect with the community, ask questions, and get support on our official Discord server!