This is an MCP-based AI assistant built with Groq, LangChain, and the Model Context Protocol (MCP), with a Streamlit frontend and a FastAPI backend. It supports chat interactions powered by large language models, enhanced with optional tools such as search agents and browser automation.
- Groq LLM via LangChain, with memory-enabled conversation
- MCP server integrations such as Google Search and Playwright
- FastAPI backend to handle chat API requests
- Streamlit frontend for real-time chat
- Docker support for containerized deployment
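The memory-enabled conversation feature can be sketched roughly as below. This is a minimal illustration, not the project's actual code: the `llm` callable stands in for the Groq chat model invoked through LangChain, and the real logic in `app.py`/`main.py` may differ.

```python
def run_turn(history, user_message, llm):
    """Append the user message, call the model with the full history,
    record the reply, and return it. `llm` is any callable that maps a
    message list to a reply string (a stand-in for the Groq chat model)."""
    history.append({"role": "user", "content": user_message})
    reply = llm(history)
    history.append({"role": "assistant", "content": reply})
    return reply

# Example with a toy "model" that echoes the last user message:
history = []
echo = lambda msgs: "You said: " + msgs[-1]["content"]
print(run_turn(history, "hello", echo))  # → You said: hello
print(len(history))                      # → 2
```

Because the full `history` list is passed on every call, earlier turns remain visible to the model — that is all "memory" means here.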
MCP_Assistant/
├── app.py
├── main.py # FastAPI server with chat endpoint
├── streamlit_app.py # Streamlit frontend UI
├── browser_mcp.json # MCP server configuration
├── .env # Environment variables (API keys)
├── requirements.txt # required dependencies
└── README.md # You are here!
git clone https://github.com/itsabhishekm/MCP_Assistant.git
cd MCP_Assistant
python -m venv .venv
.venv\Scripts\activate      # Windows
# source .venv/bin/activate # macOS/Linux
pip install -r requirements.txt
In the .env file in the project root, paste your Groq API key:
GROQ_API_KEY=your_groq_api_key
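At runtime the backend reads this key from the environment (python-dotenv, if listed in `requirements.txt`, loads `.env` into `os.environ`). A minimal fail-fast check might look like the following — the function name is illustrative, not taken from the repo:

```python
import os

def require_groq_key(env=None):
    """Return the Groq API key, failing fast with a clear message if missing."""
    env = os.environ if env is None else env
    key = env.get("GROQ_API_KEY")
    if not key:
        raise RuntimeError("GROQ_API_KEY is not set; add it to your .env file")
    return key
```

Failing at startup with a clear message is friendlier than a cryptic authentication error on the first chat request.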
To add other MCP servers, customize browser_mcp.json:
{
  "mcpServers": {
    "google-search": {
      "command": "npx",
      "args": ["-y", "@mcp-server/google-search-mcp@latest"]
    },
    "playwright": {
      "command": "npx",
      "args": ["@playwright/mcp@latest"]
    }
  }
}
Or, if you don't want any MCP servers, just leave it empty:
{ "mcpServers": {} }
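A quick way to sanity-check your edited config is to parse it and list the configured server names. This helper is illustrative (it is not part of the repo):

```python
import json

def list_mcp_servers(path="browser_mcp.json"):
    """Return the sorted names of MCP servers configured in the JSON file."""
    with open(path) as f:
        config = json.load(f)
    return sorted(config.get("mcpServers", {}))
```

With the example config above this returns `['google-search', 'playwright']`; with an empty `"mcpServers"` object it returns `[]`. A `json.JSONDecodeError` here means the file has a syntax error, such as a missing brace.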
uvicorn main:app --reload --port 8000
streamlit run streamlit_app.py
Visit: http://localhost:8501
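If you want to script against the backend directly instead of using the Streamlit UI, you can build a request along these lines. Note that the `/chat` path and the `{"message": ...}` payload shape are assumptions about `main.py`, so check the actual endpoint definition before using this:

```python
import json
import urllib.request

def build_chat_request(message, base_url="http://localhost:8000"):
    """Build (without sending) a POST request for the backend chat endpoint.
    The /chat path and payload shape are assumptions; adjust to match main.py."""
    data = json.dumps({"message": message}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/chat",
        data=data,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# req = build_chat_request("hello")
# with urllib.request.urlopen(req) as resp:  # requires the server to be running
#     print(resp.read().decode())
```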