Abstract
We present a new method for recognizing human facial emotions. First, we detect faces in the images using the well-known cascade classifiers. We then extract a localized regional descriptor (LRD), which represents the features of a face through regional appearance encoding. The LRD models various spatial regional patterns based on the relationships between local areas themselves, rather than relying only on the raw, unprocessed intensity features of an image. To classify facial emotions, we train a multiclass support vector machine (M-SVM) classifier, which recognizes these emotions during the testing stage. Our proposed method uses robust features and is independent of gender and facial skin color; it is also invariant to illumination and orientation. We evaluated our method on two benchmark datasets and compared it with four reference methods, outperforming all of them on both datasets.
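The pipeline described above (face detection, regional descriptor extraction, M-SVM classification) can be illustrated with a minimal sketch. Note the descriptor below is a hypothetical simplification: the paper does not give the exact LRD formulation, so we stand in a grid-based encoding of relationships between neighbouring regions' mean intensities, and we use synthetic face crops in place of cascade-detected faces.

```python
import numpy as np
from sklearn.svm import SVC

def regional_descriptor(face, grid=(4, 4)):
    """Hypothetical stand-in for the paper's LRD: encode each grid cell
    relative to its neighbours instead of using raw pixel intensities."""
    h, w = face.shape
    gh, gw = grid
    ch, cw = h // gh, w // gw
    # mean intensity of each grid cell
    means = np.array([[face[i*ch:(i+1)*ch, j*cw:(j+1)*cw].mean()
                       for j in range(gw)] for i in range(gh)])
    dx = np.diff(means, axis=1).ravel()  # horizontal region relationships
    dy = np.diff(means, axis=0).ravel()  # vertical region relationships
    return np.concatenate([dx, dy])

rng = np.random.default_rng(0)
# Synthetic 48x48 "face crops" with class-dependent spatial gradients.
# In the actual method these would be faces found by a cascade detector.
ramp = np.linspace(0.0, 1.0, 48)[None, :]
X = np.stack([rng.normal(0.0, 1.0, (48, 48)) + c * 5.0 * ramp
              for c in range(3) for _ in range(20)])
y = np.repeat(np.arange(3), 20)

feats = np.array([regional_descriptor(f) for f in X])
clf = SVC(kernel="rbf", decision_function_shape="ovr")  # multiclass SVM
clf.fit(feats, y)
```

The one-vs-rest `SVC` here plays the role of the M-SVM classifier; any multiclass SVM formulation could be substituted.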
Highlights
Classifying emotions into different classes is a field attracting significant attention nowadays
The most important task in this field is human facial emotion classification, a chained procedure that recognizes various human emotions from facial expressions, verbal expressions, gestures and body movements, and measurements of physiological signals
The static facial expressions in the wild (SFEW) dataset was collected by selecting frames from the AFEW collection, which is popular in the facial emotion recognition community
Summary
Classifying emotions into different classes is a field attracting significant attention nowadays. The most important task in this field is human facial emotion classification, a chained procedure that recognizes various human emotions from facial expressions (shown in Fig. 1), verbal expressions, gestures and body movements, and measurements of physiological signals. In addition, facial emotion recognition plays a very important role in helping doctors, psychiatrists, and psychologists identify various mental health conditions. In the past few decades, scientists and researchers from multiple disciplines have proposed different approaches to identify emotions from facial features, speech signals, and many other sources.
More From: International Journal of Advanced Computer Science and Applications