A sophisticated multi-agent chat system that brings together four unique AI personalities in a dynamic group conversation. Built with CrewAI and LangGraph, this project demonstrates the power of collaborative AI with specialized agents working together to create engaging discussions.
AI Hangout allows you to chat with four distinct AI personalities simultaneously:
- 🖥️ Tech Enthusiast: Passionate about technology and innovation
- 🎬 Film Critic: Expert in cinema and storytelling
- 💪 Fitness Coach: Focused on health and wellness
- 🧠 Philosopher: Asks thought-provoking questions
Each agent has their own expertise, communication style, and perspective, creating rich, multi-faceted conversations on any topic.
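Each personality is, in essence, a CrewAI `Agent` with its own role, goal, and backstory. Here is a minimal sketch of how two of them might be declared — the wording and field values are illustrative only; the project's real definitions live in `agents/`:

```python
from crewai import Agent

# Illustrative sketch only: the actual agent configurations in agents/ will differ.
tech_enthusiast = Agent(
    role="Tech Enthusiast",
    goal="Bring an optimistic, technology-focused angle to the group chat",
    backstory="A gadget lover who follows every product launch and enjoys explaining how things work.",
    verbose=True,
    allow_delegation=False,
)

philosopher = Agent(
    role="Philosopher",
    goal="Ask thought-provoking questions that push the conversation deeper",
    backstory="A careful thinker who challenges assumptions and looks for the bigger picture.",
    verbose=True,
    allow_delegation=False,
)
```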
For a detailed explanation of the concepts behind this project, check out my Medium article: Digital Water Cooler Talk: Creating AI Friends That Communicate Using CrewAI and LangGraph
- Multi-Agent Collaboration: Four distinct AI agents with unique personalities and expertise
- Dynamic Conversations: Real-time responses from all agents
- Web Interface: Clean, modern UI for easy interaction
- API Backend: RESTful API for extensibility
- Docker Support: Easy deployment and consistent environments
- Persistent Storage: Conversation history saved locally
- Backend: Python, CrewAI, LangGraph
- Frontend: Flask, HTML, CSS, JavaScript
- LLM Provider: Groq (Llama 3)
- Deployment: Docker, Docker Compose
- Python 3.10+
- Docker and Docker Compose (for containerized deployment)
- Groq API key
- Clone the repository:

  ```bash
  git clone https://github.com/Ibzie/AI_Crew_Chatroom.git
  cd ai-hangout
  ```

- Create a virtual environment and activate it:

  ```bash
  python -m venv venv
  source venv/bin/activate  # On Windows: venv\Scripts\activate
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  pip install flask
  ```

- Create a `.env` file with your Groq API key:

  ```
  GROQ_API_KEY=your_api_key_here
  ```

- Start the backend API:

  ```bash
  python main.py --mode api
  ```

- In a new terminal, start the Flask frontend:

  ```bash
  cd frontend
  python app.py
  ```

- Open your browser and navigate to `http://localhost:5000`
- Build and run with Docker Compose:

  ```bash
  docker-compose build
  docker-compose up
  ```

- Access the application at `http://localhost:5000`
```
ai_hangout/
├── agents/              # AI agent configurations
├── config/              # Configuration settings
├── frontend/            # Flask web interface
│   ├── app.py           # Flask application
│   ├── static/          # CSS and JavaScript
│   └── templates/       # HTML templates
├── models/              # Data models
├── services/            # External service integrations
├── utils/               # Utility functions
├── workflows/           # LangGraph workflows
├── docker-compose.yml   # Docker configuration
├── Dockerfile           # Container definition
├── main.py              # Main application entry
└── requirements.txt     # Python dependencies
```
- Enter a topic to discuss
- All four AI agents will respond with their unique perspectives
- Continue the conversation by sending your own messages
- Each agent maintains their personality throughout the discussion
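Under the hood, this turn-taking maps naturally onto a LangGraph state graph. Below is a minimal round-robin sketch — the node names, state shape, and the `respond` helper are assumptions for illustration, not the project's actual `workflows/` code:

```python
from typing import List, TypedDict

from langgraph.graph import END, StateGraph

AGENTS = ["tech_enthusiast", "film_critic", "fitness_coach", "philosopher"]

class ChatState(TypedDict):
    messages: List[dict]  # running transcript: {"speaker": ..., "text": ...}

def respond(agent_name: str, messages: List[dict]) -> str:
    """Placeholder for the real CrewAI/Groq call that produces the agent's reply."""
    return f"{agent_name} weighs in on: {messages[-1]['text']}"

def make_agent_node(agent_name: str):
    def node(state: ChatState) -> dict:
        reply = respond(agent_name, state["messages"])
        return {"messages": state["messages"] + [{"speaker": agent_name, "text": reply}]}
    return node

# Wire the four personalities into a simple round-robin graph.
graph = StateGraph(ChatState)
for name in AGENTS:
    graph.add_node(name, make_agent_node(name))
graph.set_entry_point(AGENTS[0])
for current, nxt in zip(AGENTS, AGENTS[1:]):
    graph.add_edge(current, nxt)
graph.add_edge(AGENTS[-1], END)

workflow = graph.compile()

# One round: the user's topic goes in, every agent appends a reply.
result = workflow.invoke({"messages": [{"speaker": "user", "text": "Is AI art really art?"}]})
```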
- `POST /api/start-conversation`: Start a new conversation
- `GET /api/continue-conversation/{id}`: Get agent responses
- `POST /api/add-message/{id}`: Add a user message
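A quick way to exercise these endpoints from Python — the base URL, JSON field names, and response shape below are assumptions, so adjust them to whatever `main.py` actually exposes:

```python
import requests

BASE_URL = "http://localhost:8000"  # assumption: wherever `python main.py --mode api` binds

# Start a conversation on a topic (the "topic" field name is an assumption).
resp = requests.post(f"{BASE_URL}/api/start-conversation",
                     json={"topic": "Are smartwatches worth it?"})
conversation_id = resp.json()["id"]  # assumption: the response carries a conversation id

# Fetch the agents' responses for this conversation.
replies = requests.get(f"{BASE_URL}/api/continue-conversation/{conversation_id}").json()

# Add your own message to keep the discussion going.
requests.post(f"{BASE_URL}/api/add-message/{conversation_id}",
              json={"message": "What about battery life?"})
```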
Key settings can be adjusted in `config/settings.py`:

- `DEFAULT_MODEL`: The Groq model to use (default: `llama3-8b-8192`)
- `MAX_ITERATIONS`: Maximum conversation turns per round
- `DEFAULT_TEMPERATURE`: Controls response creativity
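For orientation, the settings module might look roughly like this — only the model name is a documented default; the other values and the dotenv loading are illustrative assumptions:

```python
import os

from dotenv import load_dotenv  # assumes python-dotenv is installed

load_dotenv()  # pulls GROQ_API_KEY in from the .env file created during setup

GROQ_API_KEY = os.getenv("GROQ_API_KEY")

DEFAULT_MODEL = "llama3-8b-8192"   # documented default Groq model
MAX_ITERATIONS = 3                 # illustrative value: conversation turns per round
DEFAULT_TEMPERATURE = 0.7          # illustrative value: higher means more creative replies
```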
Contributions are welcome! Please feel free to submit a Pull Request.
- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
- CrewAI for the multi-agent framework
- LangGraph for conversation flow management
- Groq for fast LLM inference
Ibrahim Akhtar
- Medium: @Ibzie
- LinkedIn: Ibrahim Akhtar
- GitHub: @Ibzie
If you find this project helpful, please give it a ⭐️ on GitHub!
For questions or support, please open an issue in the repository.
Article count since unemployment: 2 🚀