A simple and powerful command-line chatbot built with the Groq API using the LLaMA 3.3–70B Versatile model. This project allows you to have real-time, intelligent conversations with a large language model directly from your terminal.
- ⚡ Powered by `llama-3.3-70b-versatile` from Groq
- 🧠 Context-aware assistant replies
- 🔐 Loads API key securely from `.env`
- 💬 Continuous chat loop via terminal
- 🛠 Minimal and extendable code structure
git clone https://github.com/yourusername/groq-cli-chatbot.git
cd groq-cli-chatbot
pip install groq python-dotenv
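If you prefer to track dependencies in the optional `requirements.txt` listed in the project structure below, a minimal unpinned version of that file could look like this (a sketch, not necessarily the project's actual file):

```text
groq
python-dotenv
```

Then install with `pip install -r requirements.txt`.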
Create a `.env` file in the root directory and add your Groq API key:
GROQ_API_KEY=your_groq_api_key_here
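For reference, here is a minimal sketch of how `chatbot.py` could load this key with `python-dotenv` and initialize the Groq client (variable names are illustrative, not necessarily those used in the actual script):

```python
import os

from dotenv import load_dotenv
from groq import Groq

# Read GROQ_API_KEY from .env into the process environment
load_dotenv()

# Pass the key explicitly; Groq() also falls back to the GROQ_API_KEY env var
client = Groq(api_key=os.environ["GROQ_API_KEY"])
```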
python chatbot.py
groq-cli-chatbot/
│
├── chatbot.py # Main chatbot script
├── .env # API key (not committed)
├── README.md # Project documentation
└── requirements.txt # Optional: dependencies list
Enter Your Query (or type 'exit' to quit): What is the capital of Japan?
Assistant: The capital of Japan is Tokyo.
Enter Your Query (or type 'exit' to quit): exit
Goodbye!
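A session like the one above can be produced with a loop along the following lines. This is a minimal sketch of the idea behind `chatbot.py`, not the exact script: it keeps the full message history so replies stay context-aware, and it exits when the user types `exit`.

```python
import os

from dotenv import load_dotenv
from groq import Groq

load_dotenv()
client = Groq(api_key=os.environ["GROQ_API_KEY"])

# Keeping the full history is what makes the assistant context-aware
messages = [{"role": "system", "content": "You are a helpful assistant."}]

while True:
    query = input("Enter Your Query (or type 'exit' to quit): ")
    if query.strip().lower() == "exit":
        print("Goodbye!")
        break

    messages.append({"role": "user", "content": query})

    # Send the whole conversation so earlier turns inform the reply
    response = client.chat.completions.create(
        model="llama-3.3-70b-versatile",
        messages=messages,
    )
    reply = response.choices[0].message.content
    print(f"Assistant: {reply}")

    messages.append({"role": "assistant", "content": reply})
```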
This project is proprietary and confidential. All rights reserved.
© 2025 HUSSAIN ALI. This code may not be copied, modified, distributed, or used without explicit permission.
For questions or collaboration requests:
- 📧 Email: choudaryhussainali@outlook.com
- 🌐 GitHub: choudaryhussainali
✨ Built using Groq and the blazing-fast LLaMA models