
Releases: galihru/MentalHealth

Mental Health

18 Apr 06:05
86597c9

Mental Health Application Based on Face Recognition

Key Features:

  • Emotion Detection: Analyzes the user's facial expressions to identify emotions.
  • Personalized Recommendations: Provides suggestions based on the user's emotional state.
  • Professional Integration: Sends notifications to contact a mental health professional when needed.

Formulation

  1. Hash Function (djb2Hash)

The djb2 hash function generates a unique FaceID from a string encoding of the facial landmarks:

$$ h_0 = 5381, \qquad h_{i+1} = 33\,h_i + \text{charCodeAt}(i), \qquad \text{hash} = h_n \bmod 2^{32} $$

where $n$ is the length of the string, and the final reduction returns the result as an unsigned 32-bit integer (typically written hash >>> 0 in JavaScript).
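A minimal JavaScript sketch of this recurrence (an assumed implementation, not necessarily the package's own source; Math.imul keeps the multiplication within 32 bits so the arithmetic matches the formula exactly):

// djb2 hash: h = h * 33 + charCode, truncated to an unsigned 32-bit integer
function djb2Hash(str) {
  let hash = 5381;
  for (let i = 0; i < str.length; i++) {
    // Math.imul performs the 32-bit multiply; | 0 keeps the sum in 32 bits
    hash = (Math.imul(hash, 33) + str.charCodeAt(i)) | 0;
  }
  return hash >>> 0; // reinterpret as unsigned 32-bit
}

// Example: derive a FaceID from serialized landmark coordinates (hypothetical data)
const faceId = djb2Hash(JSON.stringify([[120, 85], [152, 88], [136, 120]]));
console.log(faceId);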

  2. Lip Stretch Calculation (Happiness)

The lip stretch is calculated using the Euclidean distance between the left and right lip corners:

$$ \text{lipStretch} = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2} $$
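For illustration, the same distance can be computed with Math.hypot; the corner coordinates below are hypothetical values, not output of the package:

// Euclidean distance between two landmark points
const dist = (a, b) => Math.hypot(b.x - a.x, b.y - a.y);

const leftCorner  = { x: 120, y: 150 };  // hypothetical left lip corner
const rightCorner = { x: 168, y: 152 };  // hypothetical right lip corner

const lipStretch = dist(leftCorner, rightCorner); // ≈ 48; larger values suggest a smile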

  3. Cheek Raise Calculation (Happiness)

The cheek raise is calculated as the vertical distance between the cheek and eye landmarks:

$$ \text{cheekRaise} = y_{\text{eye}} - y_{\text{cheek}} $$

  4. Lip Depression Calculation (Sadness)

The lip depression is calculated as the vertical distance between the lip corner and the bottom lip:

$$ \text{lipDepression} = y_{\text{bottomLip}} - y_{\text{lipCorner}} $$

  5. Brow Lowering Calculation (Anger)

The brow lowering is calculated as the vertical distance between the inner and outer brow landmarks:

$$ \text{browLower} = y_{\text{innerBrow}} - y_{\text{outerBrow}} $$

  6. Eye Openness Calculation (Surprise)

The eye openness is calculated as the vertical distance between the eyelid and eye landmarks:

$$ \text{eyeOpenness} = y_{\text{eye}} - y_{\text{eyelid}} $$

  7. Jaw Drop Calculation (Surprise)

The jaw drop is calculated as the vertical distance between the chin and nose landmarks:

$$ \text{jawDrop} = y_{\text{chin}} - y_{\text{nose}} $$
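Features 3 through 7 are all simple vertical offsets, so a single sketch covers them. The named fields on the landmarks object are assumptions for illustration; real landmark models typically expose indexed point arrays instead:

// Compute the vertical-offset features from a landmarks object (assumed shape)
function verticalFeatures(landmarks) {
  return {
    cheekRaise:    landmarks.eye.y       - landmarks.cheek.y,
    lipDepression: landmarks.bottomLip.y - landmarks.lipCorner.y,
    browLower:     landmarks.innerBrow.y - landmarks.outerBrow.y,
    eyeOpenness:   landmarks.eye.y       - landmarks.eyelid.y,
    jawDrop:       landmarks.chin.y      - landmarks.nose.y,
  };
}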

  8. Deviation from Neutral (Neutral Emotion)

For each key landmark pair, the deviation is the Euclidean distance between the two points:

$$ \text{deviation}_i = \sqrt{(x_{2,i} - x_{1,i})^2 + (y_{2,i} - y_{1,i})^2} $$

The total deviation from neutral sums these distances over all $n$ key landmark pairs:

$$ \text{deviation} = \sum_{i=1}^{n} \sqrt{(x_{2,i} - x_{1,i})^2 + (y_{2,i} - y_{1,i})^2} $$
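A short sketch of this summation, assuming each entry pairs a current landmark with its neutral reference (the pair shape is hypothetical):

// pairs: array of [current, neutral] landmark pairs
function deviationFromNeutral(pairs) {
  return pairs.reduce(
    (sum, [p1, p2]) => sum + Math.hypot(p2.x - p1.x, p2.y - p1.y),
    0
  );
}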

Technologies:

  • Face Recognition
  • Emotion Detection
  • Voice Analysis
  • IoT with health sensors (e.g., a GSR sensor, a MAX30102 pulse-oximetry sensor, a BH1750 ambient-light sensor, and an ESP32 microcontroller)
  • Machine Learning

Usage:

  1. Open the application and allow camera and microphone access.
  2. Let the application analyze your facial expressions.
  3. Receive tailored recommendations based on your condition.

How to Use This Package

Install the package globally with npm:

npm install -g @galihridhoutomo/mentalhealth

Import Modules into Project

If using CommonJS:

const EmotionDetection = require('@galihridhoutomo/mentalhealth');

or, if using ES Modules (ESM):

import EmotionDetection from '@galihridhoutomo/mentalhealth';

Detecting Emotion from Face Images

Use the detectEmotion(imagePath) function to detect emotions from facial images:

EmotionDetection.detectEmotion('path/to/image.jpg')
  .then(result => {
    console.log('Emotion Detection Result:', result);
  })
  .catch(error => {
    console.error('Error:', error);
  });

Sample Output:

{
  "emotion": "happy",
  "confidence": 0.92
}

Detecting Emotion from Camera in Real-Time

If you want to detect emotions from the camera, use the detectEmotionLive() function:

EmotionDetection.detectEmotionLive()
  .then(result => {
    console.log('Detected Emotion:', result);
  })
  .catch(error => {
    console.error('Error:', error);
  });
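Both functions return Promises, so async/await works as well; a brief equivalent sketch:

async function run() {
  try {
    const result = await EmotionDetection.detectEmotionLive();
    console.log('Detected Emotion:', result);
  } catch (error) {
    console.error('Error:', error);
  }
}

run();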

Additional Configuration (Optional)

You can customize the detection model with the following options:

const options = {
  model: 'advanced',  // can be 'basic' or 'advanced'
  threshold: 0.8      // minimum confidence threshold
};

EmotionDetection.detectEmotion('path/to/image.jpg', options)
  .then(result => console.log(result))
  .catch(error => console.error(error));

Cite

If you use this repository, please cite it in the following format:

@misc{mentalhealth-app,
  author = {Utomo, Galih Ridho and Maulida, Ana},
  title = {Mental Health Application with Face Recognition and Emotion Detection},
  year = {2025},
  howpublished = {\url{https://github.com/4211421036/MentalHealth}},
  note = {GitHub repository},
}

Authors

  1. Galih Ridho Utomo
  2. Ana Maulida

Mental Health

18 Apr 05:55
81a0f2b

Mental Health

18 Apr 05:45
288a51d

Mental Health

18 Apr 05:40
7980de4

Mental Health

18 Apr 05:37
5412467

Mental Health

18 Apr 05:34
bbdc12e

Mental Health

18 Apr 05:29
61a9e86

Mental Health

18 Apr 05:25
e0e32d0

Mental Health

18 Apr 05:23
71ac61d

Mental Health

18 Apr 05:19
805f5c7