# Automated User State Detection Project
Welcome to the MC Quality project! This repository contains the code and implementation of a real-time emotion detection tool based on video data, developed as part of our software quality project in cooperation with the DLR Institute of Software Technology.
- Project Overview
- Project Structure
- Getting Started
- Usage
- Documentation
- Contributing
- Contributors
- License
## Project Overview

The MC Quality project explores state-of-the-art tools and frameworks for user state detection using video data, with a focus on real-time emotion recognition. After systematically reviewing existing technologies, we selected a promising tool and built a prototype that demonstrates its potential for UX and user satisfaction in a specific use case.
- Prototype Implementation: Demonstration of the feasibility and potential of the selected solution.
- Selected Tool: The prototype leverages the DeepFace Python library, a lightweight face recognition and facial attribute analysis framework.
- Static Emotion Detection: Allows users to upload an image of a face and detect the emotions expressed in it.
- Real-Time Emotion Detection: A functional prototype that detects user emotions from video streams in real time.
- Distribution of Dominant Emotions: Bar chart showing the percentage of dominant emotions detected.
- Emotion Intensity over Time: Line chart showing the intensity of emotions over time.
- Supported Emotions: Detects the following emotions: neutral, happy, fear, surprise, angry, sad, and disgust.
## Project Structure

The codebase consists of three main Python scripts:
- `src/main.py`

  This script serves as the central graphical user interface (GUI) using Tkinter. It allows users to:
  - Select between static and real-time emotion detection
  - Execute the respective detection scripts dynamically
  - Visualize emotion detection results using a bar chart and a line chart
  - Manage the dataset by clearing and processing detected emotions
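The "execute dynamically" step above can be sketched with Python's standard `subprocess` module. This is an illustrative helper, not the project's actual launcher code:

```python
import subprocess
import sys

def run_detection_script(script_path: str) -> int:
    """Launch a detection script in its own Python process and wait for it.

    Illustrative sketch; in the real GUI this would be wired to button callbacks.
    """
    # sys.executable reuses the interpreter running the GUI, so the
    # virtual environment's installed packages remain available.
    result = subprocess.run([sys.executable, script_path])
    return result.returncode
```

Running a detection script in a separate process matches the behaviour described under Usage, where the main window closes and a new one opens.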
- `src/detection/emotion_detection_static.py`

  This script enables static image-based emotion detection. It:
  - Allows users to upload an image for analysis
  - Extracts emotions from the detected face in the image
  - Displays the dominant emotion and a probability distribution as a bar chart
- `src/detection/emotion_detection_realtime.py`

  This script implements a real-time emotion detection system using video input from a webcam. It:
  - Continuously detects faces and analyzes their emotions
  - Displays the detected emotions in real time over the video feed
  - Saves the detected emotions temporarily to a CSV file for further visualization in the GUI
- `src/results/emotions_results.csv`

  This file stores the detected emotions from the real-time analysis. It:
  - Logs timestamps, dominant emotions, and probabilities for each detected emotion
  - Provides the data used to generate the bar chart and the line chart in the GUI
  - Is reset when real-time detection starts, so it holds only the latest session data
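The reset-and-append behaviour of `emotions_results.csv` can be sketched with the standard `csv` module. The column names follow the structure described in this README; the helper functions themselves are hypothetical:

```python
import csv
from datetime import datetime

# Column layout of src/results/emotions_results.csv as described in this README
FIELDS = ["time", "dominant_emotion",
          "angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def reset_results(path: str) -> None:
    """Truncate the CSV so it only contains the upcoming session (sketch)."""
    with open(path, "w", newline="") as f:
        csv.DictWriter(f, fieldnames=FIELDS).writeheader()

def log_emotions(path: str, dominant: str, scores: dict) -> None:
    """Append one timestamped row of per-emotion intensities (sketch)."""
    row = {"time": datetime.now().isoformat(timespec="seconds"),
           "dominant_emotion": dominant, **scores}
    with open(path, "a", newline="") as f:
        csv.DictWriter(f, fieldnames=FIELDS).writerow(row)
```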
## Getting Started

Prerequisite: Make sure you have Python installed before proceeding. During the Python installation, check the box for "tcl/tk and IDLE" to ensure that Tkinter is installed; Tkinter is required for the graphical components of this prototype. If you encounter problems, try Python 3.11.3 (recommended) or 3.11.9, as these versions are confirmed to work.
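If you are unsure whether your installation includes Tkinter, a quick check:

```python
# If this import fails, reinstall Python with the "tcl/tk and IDLE" option enabled.
import tkinter
print("Tkinter is available, Tk version:", tkinter.TkVersion)
```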
To install the prototype, follow these steps:

- Clone this repository:

  ```shell
  git clone https://github.com/Marbru35/MCQuality.git
  cd MCQuality
  ```

- Set up a virtual environment:

  - On Windows:

    ```shell
    python -m venv env
    env\Scripts\activate
    ```

  - On macOS/Linux:

    ```shell
    python3 -m venv env
    source env/bin/activate
    ```

- Install the required dependencies:

  ```shell
  pip install -r requirements.txt
  ```
This step takes some time! Once everything has finished downloading, you can follow the steps in Usage to start the prototype.
## Usage

To use the application, start by running the `main.py` script. This launches a graphical user interface (GUI) that allows you to select between Static Emotion Analysis and Real-Time Emotion Detection modes.
- Run the GUI:
  - Open a terminal or command prompt.
  - Navigate to the project directory `src/`, where `main.py` is located.
  - Run the following command:

    ```shell
    python main.py
    ```

  - Alternatively, you can start `main.py` directly from the root directory:

    ```shell
    python src/main.py
    ```

- This will launch the GUI, where you can select either Static Emotion Analysis or Real-Time Emotion Detection.
Note: After selecting a mode, the main GUI will close and a new window will open. This process may take some time, especially on the first run!
The Static Emotion Analysis feature allows users to upload an image of a face and detect the emotions expressed in the image.
- Upload an Image: Users can upload an image file in formats like `.jpg`, `.jpeg`, or `.png` using the intuitive interface.
- Emotion Detection: The application processes the image using the DeepFace library to detect the dominant emotion and calculate the probabilities of the other emotions.
- Visualization:
  - Displays the uploaded image in the interface for easy reference.
  - Generates a bar chart showing the percentage likelihood of each detected emotion.
To analyze an image:

- Click the "Upload Image" button in the GUI.
- Select an image from your device.
- View the dominant emotion result and the corresponding bar chart in the application.
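Per detected face, DeepFace's emotion analysis yields a score for each emotion, from which the dominant emotion and the bar chart percentages follow. A small illustrative helper, independent of DeepFace itself, assuming that `{emotion: score}` shape:

```python
def summarize_emotions(scores: dict) -> tuple:
    """Return the dominant emotion and the scores normalised to percentages.

    `scores` maps emotion names to raw scores, as produced per face by
    DeepFace's emotion analysis. Helper name and example values are
    illustrative, not the project's actual code.
    """
    total = sum(scores.values())
    percentages = {name: 100.0 * value / total for name, value in scores.items()}
    dominant = max(percentages, key=percentages.get)
    return dominant, percentages

# Hypothetical scores for one analyzed image
example = {"happy": 80.0, "neutral": 15.0, "surprise": 5.0,
           "angry": 0.0, "disgust": 0.0, "fear": 0.0, "sad": 0.0}
dominant, pct = summarize_emotions(example)
print(dominant)  # happy
```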
The Real-Time Emotion Detection feature uses video input from the webcam to detect emotions in real time. The dominant emotion is displayed in the GUI, and both the dominant emotion and the intensities of all detected emotions are saved for further analysis.
- Real-Time Video Capture: The webcam feed is processed frame by frame.
- Face Detection: Uses a Haar Cascade classifier to detect faces within each frame.
- Emotion Analysis:
  - Applies the DeepFace library to analyze facial expressions and detect emotions.
  - Extracts both the dominant emotion and the intensity levels of all emotions.
- Visualization:
  - Highlights detected faces in the video feed with bounding boxes.
  - Displays the dominant emotion as a label on the video feed.
- Data Logging:
  - Captures emotion data along with timestamps.
  - Saves results into a CSV file (`src/results/emotions_results.csv`) for graphical analysis, with the following structure:
    - time: Timestamp of the detected emotion
    - dominant_emotion: The most prominent emotion detected in the frame
    - angry, disgust, fear, happy, sad, surprise, neutral: Intensity level of each emotion
- Select "Real-Time Emotion Detection" in the GUI.
- Allow the application to access your webcam.
- View the live video feed with the real-time dominant emotion result.
- Stop the detection using the "Exit" button to save the results to the CSV file.
- View the distribution of dominant emotions and the intensity of emotions over time, generated from the CSV file, in the corresponding bar chart and line chart.
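The data behind both charts can be derived from the CSV with standard-library tools alone. A sketch of the aggregation, using the column names documented above (the function names are illustrative):

```python
import csv
from collections import Counter

def dominant_emotion_distribution(csv_path: str) -> dict:
    """Percentage share of each dominant emotion over the session (bar chart data)."""
    with open(csv_path, newline="") as f:
        counts = Counter(row["dominant_emotion"] for row in csv.DictReader(f))
    total = sum(counts.values())
    return {emotion: 100.0 * n / total for emotion, n in counts.items()}

def emotion_intensity_series(csv_path: str, emotion: str) -> list:
    """(time, intensity) pairs for one emotion over the session (line chart data)."""
    with open(csv_path, newline="") as f:
        return [(row["time"], float(row[emotion])) for row in csv.DictReader(f)]
```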
## Documentation

- Report: Detailed analysis and outcomes of the project.
- Presentation: Overview of the project in slide format.
## Contributing

We warmly welcome contributions to the MC Quality project! Here are some ways to get involved:
- Report bugs or suggest features by opening an issue.
- Submit a pull request with your changes or improvements.
- Follow the REUSE Specification when contributing to ensure proper license compliance.
For detailed guidelines, please check our CONTRIBUTING.md.
If you have any questions or need guidance, feel free to contact us. We look forward to hearing from you!
## Contributors

This project is part of a university group project created for the course Software Quality (SQ) by the following contributors:

- Carlotta May (cmay4@smail.uni-koeln.de)
- Marlon Spiess (mspiess1@smail.uni-koeln.de)
## License

This project is licensed under the MIT License. You are free to use, modify, and distribute this software under the conditions stated in the LICENSE file.
Have fun exploring our prototype!