Facial Expression Analysis for Distress Detection

Abstract
Emotions are a fundamental aspect of human life, and basic research on emotion over the past few decades has produced discoveries that underpin important real-world applications. Facial expressions project our true emotions to others and convey the real intent behind the words we say. Interpreting the facial expressions a subject exhibits in response to a situation is useful in medicine, e-learning, entertainment, monitoring, marketing, law, and many other fields. This project focuses on determining a person's distress level by analyzing their facial expressions. The subject's reaction to a particular communication scenario is recorded with a video or still camera under predefined lighting conditions, and this input is then processed to detect the underlying emotion. Face and facial-landmark detection are performed using the Viola-Jones algorithm. Facial patches that are active during emotion elicitation are then extracted for texture analysis. Features are extracted with the Gray Level Difference Method (GLDM), in which texture features are derived from the GLDM probability density functions. Classification is then performed with a Naïve Bayes classifier. With reference to the trained information, the person's emotion is recognized and used to determine the distress level. The proposed system is evaluated on the Extended Cohn-Kanade (CK+) and Japanese Female Facial Expression (JAFFE) datasets.
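To make the GLDM step concrete, the sketch below shows one common formulation of the method: absolute gray-level differences are taken between pixel pairs at a fixed displacement, their normalized histogram serves as the GLDM probability density function, and standard texture features (mean, contrast, angular second moment, entropy) are derived from it. The function name, the displacement `(dx, dy)`, and this particular feature set are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def gldm_features(image, dx=1, dy=0, levels=256):
    """Gray Level Difference Method texture features for a 2-D grayscale image.

    Illustrative sketch: displacement (dx, dy) and the feature set are
    assumptions; the paper may use different displacements or features.
    """
    img = np.asarray(image, dtype=np.int32)
    h, w = img.shape
    # Pixel pairs separated by the displacement vector (dx, dy)
    a = img[0:h - dy, 0:w - dx]
    b = img[dy:h, dx:w]
    diff = np.abs(a - b)

    # Normalized histogram of differences = GLDM probability density function
    hist = np.bincount(diff.ravel(), minlength=levels).astype(float)
    p = hist / hist.sum()

    d = np.arange(levels)
    nz = p > 0  # avoid log(0) in the entropy term
    return {
        "mean": float((d * p).sum()),
        "contrast": float((d ** 2 * p).sum()),
        "asm": float((p ** 2).sum()),  # angular second moment (uniformity)
        "entropy": float(-(p[nz] * np.log2(p[nz])).sum()),
    }
```

A feature vector like this, computed per facial patch, could then be fed to a Naïve Bayes classifier for emotion recognition. On a perfectly uniform patch all differences are zero, so the density collapses to a single bin: mean and entropy are 0 and the angular second moment is 1.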
