This project is a real-time hand gesture recognition system built with MediaPipe, OpenCV, and a TensorFlow/Keras deep learning model. It detects a variety of hand gestures live from a webcam feed as the user performs them.
- ✅ Real-time hand gesture detection using webcam
- 🧠 Custom-trained model using TensorFlow/Keras
- 📷 MediaPipe for accurate hand landmark tracking
- 🎯 Confidence threshold and cooldown logic for stable predictions (sketched in the snippet after this list)
- 🔁 Support for multiple gestures including:
  - Hello 👋
  - Fist ✊
  - Thumbs Up 👍
  - Thumbs Down 👎
  - Peace ✌️
  - Stop ✋
  - Okay 👌
  - Rock 🤘
  - Call Me 🤙
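The confidence-threshold and cooldown idea mentioned in the feature list can be sketched as follows. This is an illustrative snippet rather than code from `predict_gesture.py`; the constant values and the `accept_prediction` helper are hypothetical.

```python
# Illustrative sketch of confidence-threshold + cooldown filtering
# (names and values are hypothetical, not taken from predict_gesture.py).
import time

import numpy as np

CONF_THRESHOLD = 0.8     # accept a prediction only above this probability
COOLDOWN_SECONDS = 1.0   # minimum time between two accepted gestures

last_accept_time = 0.0

def accept_prediction(probabilities, labels):
    """Return a gesture label only when the model is confident and the cooldown has expired."""
    global last_accept_time
    idx = int(np.argmax(probabilities))
    now = time.time()
    if probabilities[idx] >= CONF_THRESHOLD and (now - last_accept_time) >= COOLDOWN_SECONDS:
        last_accept_time = now
        return labels[idx]
    return None
```

The cooldown keeps a single held gesture from re-triggering on every frame, while the threshold suppresses low-confidence flicker between classes.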
- Python
- OpenCV
- MediaPipe
- TensorFlow / Keras
- NumPy
MULTI-GESTURE-DETECTOR/
│
├── model.keras # Trained gesture classification model
├── gestures/ # Folder containing .csv data for each gesture
├── gestures.txt # List of gesture labels
├── collect_data.py # Script to record gesture data
├── train_model.py # Script to train the model
├── predict_gesture.py # Main script for real-time prediction
└── README.md # This file
- Install Dependencies: `pip install opencv-python mediapipe numpy tensorflow`
- Run Real-Time Prediction: `python predict_gesture.py`
- Press `q` to quit the live window
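For orientation, a stripped-down version of the real-time loop could look like the sketch below. It assumes a single hand, a 63-value feature vector (x, y, z for each of MediaPipe's 21 landmarks), and labels loaded from `gestures.txt`; the preprocessing in the actual `predict_gesture.py` may differ.

```python
# Minimal real-time prediction loop: webcam frame -> MediaPipe landmarks -> Keras model.
# This is a sketch; the real predict_gesture.py may normalize landmarks differently.
import cv2
import mediapipe as mp
import numpy as np
import tensorflow as tf

model = tf.keras.models.load_model("model.keras")
labels = open("gestures.txt").read().splitlines()

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
cap = cv2.VideoCapture(0)

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        lm = results.multi_hand_landmarks[0].landmark
        features = np.array([[p.x, p.y, p.z] for p in lm], dtype=np.float32).reshape(1, -1)
        probs = model.predict(features, verbose=0)[0]
        text = f"{labels[int(np.argmax(probs))]} ({probs.max():.2f})"
        cv2.putText(frame, text, (10, 30), cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
    cv2.imshow("Gesture", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```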
To train your own model:
- Record data using `collect_data.py`
- Train the model using `train_model.py`
- Ensure the model is saved as `model.keras`
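As a rough guide, the training step could look like the sketch below. It assumes each `.csv` in `gestures/` holds flattened landmark rows (63 values per row) for one gesture and that the file name is used as the label; the real `collect_data.py` and `train_model.py` may organize the data differently.

```python
# Illustrative training sketch (not the project's train_model.py).
# Assumes gestures/<label>.csv files of flattened landmark rows.
from pathlib import Path

import numpy as np
import tensorflow as tf

X, y, labels = [], [], []
for i, csv_path in enumerate(sorted(Path("gestures").glob("*.csv"))):
    data = np.atleast_2d(np.loadtxt(csv_path, delimiter=","))
    X.append(data)
    y.append(np.full(len(data), i))
    labels.append(csv_path.stem)

X = np.vstack(X).astype("float32")
y = np.concatenate(y)

model = tf.keras.Sequential([
    tf.keras.Input(shape=(X.shape[1],)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(len(labels), activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=30, batch_size=32, validation_split=0.2)

model.save("model.keras")                           # file name expected by predict_gesture.py
Path("gestures.txt").write_text("\n".join(labels))  # assumed label-file format
```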
- 🔥 Two-hand gesture detection
- 🌐 Web-based UI with Flask
- 💾 Save gesture history/log
- 🧠 Integrate voice feedback or system control
This project is licensed under the MIT License.
Vansh Agrawal