Abstract

Over the past two decades, automatic facial emotion recognition has received enormous attention, driven by the growing need for behavioral biometric systems and human–machine interaction, where both the recognized facial emotion and its intensity play vital roles. Existing works usually do not encode the intensity of the observed facial emotion, and even fewer model multi-class facial behavior data jointly. Our work recognizes emotions along with their respective intensities. The algorithms used in this comparative study are Gabor filters, Histogram of Oriented Gradients (HOG), and Local Binary Pattern (LBP) for feature extraction, and Support Vector Machine (SVM), Random Forest (RF), and k-Nearest Neighbor (kNN) for classification. Together, these attain emotion recognition and intensity estimation for each recognized emotion on the databases considered. The results suggest that this comparative study could be extended to real-time recognition of facial emotions and their intensities.
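To make the feature-extraction stage concrete, the following is a minimal sketch of the Local Binary Pattern descriptor mentioned above: each pixel is encoded by thresholding its eight neighbors against it, and the resulting codes are pooled into a 256-bin histogram. This is an illustrative NumPy implementation of the basic 8-neighbor LBP, not the paper's own code (practical systems often use uniform patterns and per-region histograms).

```python
import numpy as np

def lbp_histogram(image):
    """Basic 8-neighbour Local Binary Pattern histogram.

    image: 2-D array (grayscale face crop).
    Returns a 256-bin histogram normalised to sum to 1.
    """
    img = np.asarray(image, dtype=np.float64)
    center = img[1:-1, 1:-1]
    # Offsets of the 8 neighbours, clockwise from the top-left corner.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    codes = np.zeros_like(center, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy: img.shape[0] - 1 + dy,
                        1 + dx: img.shape[1] - 1 + dx]
        # Set bit if the neighbour is at least as bright as the centre.
        codes |= (neighbour >= center).astype(np.uint8) << bit
    hist = np.bincount(codes.ravel(), minlength=256).astype(np.float64)
    return hist / hist.sum()
```

The resulting 256-dimensional histogram is the feature vector that would be fed to a classifier such as SVM, RF, or kNN.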

Highlights

  • The dual fears of identity theft and password hacking are becoming a reality, where the only hope of a secure method for preserving data is behavioral systems

  • Biometrics at any level cannot be performed without good sensors; for facial emotion intensity recognition, high-quality sensors must also be paired with efficient algorithms that recognize emotional intensity in real time

  • Although not shown in the table, accuracy increased from 68.32% to 71.95% for Random Forest when the Action Unit (AU) dependency relationship was used while extracting Histogram of Oriented Gradients (HOG) features
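The HOG features referred to in the highlights can be sketched as a per-cell histogram of gradient orientations weighted by gradient magnitude. The snippet below shows this single building block in NumPy; it is a simplified illustration (a full HOG descriptor adds cell tiling and block normalisation), not the authors' implementation.

```python
import numpy as np

def hog_cell_histogram(patch, n_bins=9):
    """Orientation histogram for one HOG cell.

    patch: 2-D grayscale array for a single cell.
    Returns an n_bins histogram over unsigned orientations [0, 180).
    """
    patch = np.asarray(patch, dtype=np.float64)
    gy, gx = np.gradient(patch)               # vertical, horizontal gradients
    mag = np.hypot(gx, gy)                    # gradient magnitude per pixel
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0  # unsigned orientation
    bins = (ang / (180.0 / n_bins)).astype(int) % n_bins
    hist = np.zeros(n_bins)
    for b, m in zip(bins.ravel(), mag.ravel()):
        hist[b] += m                          # magnitude-weighted vote
    return hist
```

For example, a patch that is a pure horizontal intensity ramp produces gradients pointing along the x-axis, so all the mass falls into the first (0°) bin.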


Introduction

The dual fears of identity theft and password hacking are becoming a reality, where the only hope of a secure method for preserving data is behavioral systems. The criterion for the accuracy of intensity detection of the five observed basic emotions (and a neutral expression) is based on the analysis of the facial behavior components that are relevant to emotional intensity communication [8]. This involves detecting the face and recognizing the intensity of the emotion depicted, both of which can be achieved using classifiers assisted by a training set. A comparative study and implementation of algorithms for measuring facial emotions and their intensities based on the different Action Units (AUs) are presented.
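As a concrete illustration of a classifier "assisted by a training set", the sketch below implements the simplest of the three classifiers compared in this study, k-Nearest Neighbor, in pure NumPy: a query feature vector is assigned the majority label among its k closest training samples. The two-dimensional toy features and the "happy"/"sad" labels are hypothetical stand-ins for the real Gabor/HOG/LBP feature vectors and emotion classes.

```python
import numpy as np
from collections import Counter

def knn_predict(train_X, train_y, x, k=3):
    """Predict the label of feature vector x by majority vote
    among its k nearest training samples (Euclidean distance)."""
    dists = np.linalg.norm(train_X - x, axis=1)   # distance to every sample
    nearest = np.argsort(dists)[:k]               # indices of k closest
    votes = Counter(train_y[i] for i in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical toy training set: two well-separated clusters.
train_X = np.array([[0., 0.], [0., 1.], [1., 0.],
                    [5., 5.], [5., 6.], [6., 5.]])
train_y = np.array(["sad", "sad", "sad", "happy", "happy", "happy"])
```

In the study itself, the same voting idea operates on high-dimensional Gabor, HOG, or LBP feature vectors rather than these 2-D points.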

Literature Review
10 Volunteers
Methodology
AU Intensity Feature Extraction and Correlation Analysis
Face Registration and Representation
Feature Extraction through Gabor Features
Local Binary Pattern Method
Histogram of Oriented Gradient Features
Dimensionality Reduction
Classification
Databases Considered
Recognition and Reliability Measures
Result Analysis Based on Intensity of Emotions
Conclusions and Future Work