- Emotion Detection: Analyzes the user's facial expressions to identify emotions.
- Personalized Recommendations: Provides suggestions based on the user's detected emotional state.
- Professional Integration: Sends notifications prompting the user to contact a mental health professional when needed.
- Hash Function (djb2Hash)
The hash function is used to generate a unique FaceID based on the facial landmarks. The formula for the hash function is:
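A sketch of the standard djb2 recurrence, which this function presumably implements (initial value 5381, multiply by 33, then add the next byte):

$$h_0 = 5381, \qquad h_{i+1} = 33 \cdot h_i + c_i$$

where $c_i$ is the $i$-th byte of the serialized landmark data (the XOR variant $h_{i+1} = 33 \cdot h_i \oplus c_i$ is also common).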
- Lip Stretch Calculation (Happiness)
The lip stretch is calculated using the Euclidean distance between the left and right lip corners:
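Presumably the standard Euclidean distance, with $(x_L, y_L)$ and $(x_R, y_R)$ denoting the left and right lip-corner landmarks:

$$d_{\text{lip}} = \sqrt{(x_L - x_R)^2 + (y_L - y_R)^2}$$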
- Cheek Raise Calculation (Happiness)
The cheek raise is calculated as the vertical distance between the cheek and eye landmarks:
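A plausible form, with $y_{\text{cheek}}$ and $y_{\text{eye}}$ the vertical coordinates of the cheek and eye landmarks:

$$d_{\text{cheek}} = \lvert y_{\text{cheek}} - y_{\text{eye}} \rvert$$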
- Lip Depression Calculation (Sadness)
The lip depression is calculated as the vertical distance between the lip corner and the bottom lip:
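Likely, with $y_{\text{corner}}$ and $y_{\text{bottom}}$ the vertical coordinates of the lip corner and bottom lip:

$$d_{\text{depress}} = \lvert y_{\text{corner}} - y_{\text{bottom}} \rvert$$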
- Brow Lowering Calculation (Anger)
The brow lowering is calculated as the vertical distance between the inner and outer brow landmarks:
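A plausible form, with $y_{\text{inner}}$ and $y_{\text{outer}}$ the vertical coordinates of the inner and outer brow landmarks:

$$d_{\text{brow}} = \lvert y_{\text{inner}} - y_{\text{outer}} \rvert$$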
- Eye Openness Calculation (Surprise)
The eye openness is calculated as the vertical distance between the eyelid and eye landmarks:
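Presumably, with $y_{\text{lid}}$ and $y_{\text{eye}}$ the vertical coordinates of the eyelid and eye landmarks:

$$d_{\text{open}} = \lvert y_{\text{lid}} - y_{\text{eye}} \rvert$$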
- Jaw Drop Calculation (Surprise)
The jaw drop is calculated as the vertical distance between the chin and nose landmarks:
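A plausible form, with $y_{\text{chin}}$ and $y_{\text{nose}}$ the vertical coordinates of the chin and nose landmarks:

$$d_{\text{jaw}} = \lvert y_{\text{chin}} - y_{\text{nose}} \rvert$$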
- Deviation from Neutral (Neutral Emotion)
The total deviation from neutral is calculated as the sum of Euclidean distances between key facial landmarks and their neutral positions:
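A hedged reconstruction, with $(x_i^{0}, y_i^{0})$ the stored neutral-pose position of landmark $(x_i, y_i)$:

$$D = \sum_{i} \sqrt{(x_i - x_i^{0})^2 + (y_i - y_i^{0})^2}$$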
- Face Recognition
- Emotion Detection
- Voice Analysis
- IoT with Health Sensors (e.g., GSR Sensor, MAX30102 Sensor, BH1750 Sensor, and ESP32 Microcontroller)
- Machine Learning
- Open the application and allow camera and microphone access.
- Let the application analyze your facial expressions.
- Receive tailored recommendations based on your condition.
You can install the package with the command below:

```bash
npm install -g @galihridhoutomo/mentalhealth
```
If using CommonJS:

```js
const EmotionDetection = require('@galihridhoutomo/mentalhealth');
```

Or, if using ES Modules (ESM):

```js
import EmotionDetection from '@galihridhoutomo/mentalhealth';
```
Use the `detectEmotion(imagePath)` function to detect emotions from facial images:

```js
EmotionDetection.detectEmotion('path/to/image.jpg')
  .then(result => {
    console.log('Emotion Detection Result:', result);
  })
  .catch(error => {
    console.error('Error:', error);
  });
```
Sample Output:

```json
{
  "emotion": "happy",
  "confidence": 0.92
}
```
If you want to detect emotions from the camera, use the `detectEmotionLive()` function:

```js
EmotionDetection.detectEmotionLive()
  .then(result => {
    console.log('Detected Emotion:', result);
  })
  .catch(error => {
    console.error('Error:', error);
  });
```
You can customize the detection model with the following options:

```js
const options = {
  model: 'advanced', // Can be 'basic' or 'advanced'
  threshold: 0.8     // Minimum confidence threshold
};

EmotionDetection.detectEmotion('path/to/image.jpg', options)
  .then(result => console.log(result))
  .catch(error => console.error(error));
```
If you use this GitHub repository, please cite it in the following format:

```bibtex
@misc{mentalhealth-app,
  author = {Utomo, Galih Ridho and Maulida, Ana},
  title = {Mental Health Application with Face Recognition and Emotion Detection},
  year = {2025},
  howpublished = {\url{https://github.com/4211421036/MentalHealth}},
  note = {GitHub repository},
}
```
- Galih Ridho Utomo
- Ana Maulida