
🎭 Emotion Detection via Facial Expressions & Body Gestures

A real-time emotion recognition system built with MediaPipe and machine learning, capable of detecting and classifying human emotions from facial expressions, hand gestures, and body posture.


📌 Project Overview

This project combines three main components, wired together as sketched after this list:

  • MediaPipe Holistic for extracting body, face, and hand landmarks
  • A trained ML model for emotion classification
  • OpenCV for real-time webcam processing and display
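
A minimal sketch of that capture-and-process loop, assuming the standard `mediapipe` and `opencv-python` APIs (an illustration, not the repository's actual script; the classification step is covered under Model Details below):

```python
import cv2
import mediapipe as mp

mp_holistic = mp.solutions.holistic
mp_drawing = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # default webcam
with mp_holistic.Holistic(min_detection_confidence=0.5,
                          min_tracking_confidence=0.5) as holistic:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames as BGR
        results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

        # Overlay the face mesh and pose skeleton (skipped if not detected)
        mp_drawing.draw_landmarks(frame, results.face_landmarks,
                                  mp.solutions.face_mesh.FACEMESH_TESSELATION)
        mp_drawing.draw_landmarks(frame, results.pose_landmarks,
                                  mp_holistic.POSE_CONNECTIONS)

        cv2.imshow('Emotion Detection', frame)
        if cv2.waitKey(10) & 0xFF == ord('q'):  # press q to quit
            break

cap.release()
cv2.destroyAllWindows()
```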

🎯 Goals

  • Detect facial and body landmarks in real time
  • Extract 2,130 keypoint features per frame
  • Predict human emotions like happy, sad, angry, surprised, etc.
  • Visualize both prediction and facial/pose mesh

🧠 Model Details

The model uses:

  • Pose Landmarks: 33 points × 4 features = 132
  • Face Landmarks: 468 points × 4 features = 1,872
  • Left & Right Hand: 21 × 3 × 2 = 126
    ➡️ Total: 2,130 features
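
A hedged sketch of how a feature row with this layout can be assembled from the Holistic results (the exact ordering and zero-padding used when training the model may differ; this only reproduces the arithmetic above):

```python
import numpy as np

def extract_features(results):
    """Flatten MediaPipe Holistic results into one 2,130-value row.

    Missing detections are zero-padded so the row length stays constant.
    """
    def flatten(landmarks, n_points, n_vals):
        if landmarks is None:
            return [0.0] * (n_points * n_vals)
        row = []
        for lm in landmarks.landmark:
            row.extend([lm.x, lm.y, lm.z, lm.visibility][:n_vals])
        return row

    pose  = flatten(results.pose_landmarks, 33, 4)        # 132
    face  = flatten(results.face_landmarks, 468, 4)       # 1,872
    lhand = flatten(results.left_hand_landmarks, 21, 3)   # 63
    rhand = flatten(results.right_hand_landmarks, 21, 3)  # 63
    return np.array(pose + face + lhand + rhand)          # 2,130
```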

The trained model is stored in body_language.pkl.
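
Loading the pickled model and classifying one frame might then look like this (assuming a scikit-learn-style estimator was pickled, which the file extension alone does not confirm):

```python
import pickle

with open('body_language.pkl', 'rb') as f:
    model = pickle.load(f)

row = extract_features(results)      # 2,130 features for the current frame
emotion = model.predict([row])[0]    # e.g. 'Happy'
# Most scikit-learn classifiers also expose class probabilities:
confidence = model.predict_proba([row]).max()
```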


📸 Results

Sample outputs from the emotion recognition system are included as annotated screenshots in the repository.

🛠️ Requirements

Make sure you have Python 3.7+. Install dependencies using:

```bash
pip install -r requirements.txt
```
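
The shipped requirements.txt is authoritative; given the stack described above, it will cover at least packages along these lines (an assumption, not a copy of the file):

```text
mediapipe
opencv-python
numpy
scikit-learn
```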

