A deep learning-based system to detect human emotions from facial expressions using CNNs, trained on the FER-2013 dataset.
This project implements a Convolutional Neural Network (CNN) in TensorFlow/Keras to recognize facial expressions in real time or from static images. It classifies 7 emotions:
- 😄 Happy
- 😠 Angry
- 😢 Sad
- 😮 Surprise
- 😐 Neutral
- 😨 Fear
- 🤢 Disgust
Features:
- Real-time facial emotion detection using OpenCV
- Model trained on the FER-2013 dataset
- Simple and clean GUI for image-based emotion detection
- CNN with high accuracy on validation/test sets
- Easy to extend and integrate into other applications
The CNN is built using the following structure:
- 3 Convolutional layers
- 2 MaxPooling layers
- Dropout layers for regularization
- Fully connected Dense layers
- Output layer with softmax activation for 7 classes
Input -> Conv2D -> MaxPooling -> Conv2D -> MaxPooling -> Conv2D -> Flatten -> Dense -> Output
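The layer stack above can be sketched in Keras as follows. The filter counts, kernel sizes, and dense width below are illustrative assumptions — check `trainmodel.py` for the exact values used in this project:

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_model(num_classes=7):
    """Sketch of the described CNN: 3 conv layers, 2 max-pooling layers,
    dropout, dense layers, softmax output. Hyperparameters are assumptions."""
    model = models.Sequential([
        layers.Input(shape=(48, 48, 1)),            # FER-2013: 48x48 grayscale
        layers.Conv2D(32, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(128, (3, 3), activation="relu"),
        layers.Flatten(),
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),                        # regularization
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

`sparse_categorical_crossentropy` assumes integer emotion labels (0–6), which is how FER-2013 ships its `emotion` column.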
FER-2013 (Facial Expression Recognition 2013)
- Source: Kaggle
- 35,887 grayscale images (48x48)
- 7 emotions labeled
- Split: 28,709 training / 3,589 validation / 3,589 test
Install the dependencies from requirements.txt:

```bash
pip install -r requirements.txt
```

Or install them manually:

```bash
pip install tensorflow keras opencv-python matplotlib numpy pandas
```
🧪 Training the Model

```bash
python trainmodel.py
```

This trains the CNN model on the FER-2013 dataset.
🎥 Running the Detector

```bash
python emotiondetector.py
```

This runs the trained model on a live webcam feed or on test images.