This project uses YOLOv7 implemented in TensorFlow to detect and classify emotions in facial images, providing a Flask web interface for easy interaction.
- YOLOv7 model implemented in TensorFlow for emotion detection
- Trains on custom emotion dataset
- Detects 7 emotions: angry, disgusted, fearful, happy, neutral, sad, surprised
- Web interface for uploading images and getting predictions
- Visual results with bounding boxes and emotion labels
```
yolov7 emotion/
├── Data/               # Original emotion dataset
│   ├── train/          # Training data
│   └── test/           # Testing data
├── emotion_model/      # Trained YOLOv7-TF model (after training)
├── static/             # Static files for Flask app
│   └── uploads/        # Uploaded and result images
├── templates/          # HTML templates
│   └── index.html      # Web interface
├── app.py              # Flask application using YOLOv7-TF for detection
├── train_emotion.py    # Script to train YOLOv7-TF on emotion dataset
└── requirements.txt    # Python dependencies
```
```bash
pip install -r requirements.txt
```
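The requirements.txt file in the repository is the source of truth for dependencies. As a rough guide only, a TensorFlow + Flask setup like this one typically pulls in packages along these lines; the exact names and versions here are assumptions, so defer to the actual file:

```
tensorflow
flask
opencv-python
numpy
pillow
```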
Train the YOLOv7 model implemented in TensorFlow on your emotion dataset:

```bash
python train_emotion.py
```

You can adjust training parameters:
```bash
python train_emotion.py --batch-size 8 --epochs 30 --img-size 416
```
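For reference, the sketch below shows one way these flags could be parsed with argparse. The flag names mirror the command above, but the defaults and the training call itself are assumptions about train_emotion.py, not its actual contents.

```python
# Hypothetical sketch of how train_emotion.py might expose its training flags.
# Flag names match the command above; defaults and the training call are assumptions.
import argparse


def parse_args():
    parser = argparse.ArgumentParser(description="Train YOLOv7-TF on the emotion dataset")
    parser.add_argument("--batch-size", type=int, default=16, help="Training batch size")
    parser.add_argument("--epochs", type=int, default=50, help="Number of training epochs")
    parser.add_argument("--img-size", type=int, default=640, help="Square input image size in pixels")
    return parser.parse_args()


if __name__ == "__main__":
    args = parse_args()
    print(f"Training with batch={args.batch_size}, epochs={args.epochs}, img_size={args.img_size}")
    # A real script would now build the YOLOv7-TF model and fit it on Data/train and Data/test.
```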
After training (or even without training, using the fallback detection), start the web app:

```bash
python app.py
```

Then open your browser and go to http://localhost:5000
- Upload an image containing faces
- Click "Detect Emotions"
- View the detected emotions with confidence scores
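Under the hood, the upload-and-detect flow in app.py presumably resembles the following minimal Flask sketch. The form field name, the helper functions (`detect_emotions`, `draw_results`), and the template variables are placeholders for illustration, not the project's actual code.

```python
# Illustrative sketch of a Flask upload/detect flow; helper names are assumptions.
import os
from flask import Flask, request, render_template

app = Flask(__name__)
UPLOAD_DIR = os.path.join("static", "uploads")
os.makedirs(UPLOAD_DIR, exist_ok=True)


def detect_emotions(image_path):
    """Placeholder for YOLOv7-TF inference; would return (box, label, confidence) tuples."""
    return []


def draw_results(image_path, detections):
    """Placeholder for the drawing step; would return the path of the annotated image."""
    return image_path


@app.route("/", methods=["GET", "POST"])
def index():
    if request.method == "POST":
        uploaded = request.files["image"]          # "image" is an assumed form field name
        image_path = os.path.join(UPLOAD_DIR, uploaded.filename)
        uploaded.save(image_path)

        detections = detect_emotions(image_path)
        result_path = draw_results(image_path, detections)
        return render_template("index.html", result_image=result_path, detections=detections)
    return render_template("index.html")


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000, debug=True)
```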
- Training: The system trains a YOLOv7 model using TensorFlow on the emotion dataset
- Detection: The trained model detects faces and classifies emotions in uploaded images
- Result Visualization: The detected emotions are displayed with bounding boxes and labels
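To make the visualization step concrete, here is a hedged sketch of drawing bounding boxes and emotion labels with OpenCV; the detection tuple format and function name are assumptions, not the repository's actual implementation.

```python
# Sketch of the result-visualization step using OpenCV.
# The detection format (x1, y1, x2, y2, label, confidence) is an assumption.
import cv2


def draw_detections(image_path, detections, output_path):
    image = cv2.imread(image_path)
    for (x1, y1, x2, y2, label, confidence) in detections:
        # Green bounding box around the detected face
        cv2.rectangle(image, (x1, y1), (x2, y2), (0, 255, 0), 2)
        # Emotion label with confidence score, placed just above the box
        text = f"{label} {confidence:.2f}"
        cv2.putText(image, text, (x1, max(y1 - 10, 0)),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    cv2.imwrite(output_path, image)
    return output_path
```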
- The model is trained using the YOLOv7 architecture implemented in TensorFlow
- If the trained model is not available, the system falls back to a simpler detection method (a possible approach is sketched after these notes)
- For best results, use clear images with visible faces
- The system can detect multiple faces and emotions in a single image
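The fallback method is not described in detail here; one common approach it could resemble is OpenCV's bundled Haar-cascade face detector, sketched below purely as an assumption rather than the project's actual fallback.

```python
# Hedged sketch of a simple fallback: Haar-cascade face detection via OpenCV.
# This stands in for "a simpler detection method" and is not the project's actual code.
import cv2


def fallback_detect_faces(image_path):
    cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    face_cascade = cv2.CascadeClassifier(cascade_path)

    image = cv2.imread(image_path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    # Returns (x, y, w, h) boxes for each detected face; there is no emotion
    # classifier here, so a fallback would typically report a placeholder label.
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return [(x, y, x + w, y + h) for (x, y, w, h) in faces]
```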