Abstract

Machine analysis of facial emotions is a challenging and active research topic in Human-Computer Intelligent Interaction (HCII). The eye and mouth regions are the most informative components for facial emotion recognition, yet most existing approaches do not exploit their temporal features, which limits recognition accuracy. This paper proposes a novel approach that recognizes facial emotions from eye and mouth temporal features. Local features are extracted in each frame using Gabor wavelets at selected scales and orientations and passed to an ensemble classifier to locate the face region. From the face-region signature, the eye and mouth regions are then detected, again with an ensemble classifier. Blocks of temporal features are extracted from the eye- and mouth-region signatures across consecutive frames. Within each block, the temporal features are normalized by the Z-score technique and encoded into binary pattern features. The encoded eye and mouth temporal features are concatenated to form an enhanced temporal feature. Multi-class AdaBoost then selects the discriminative temporal features and classifies the facial emotion. The proposed method is evaluated on the RML and CK databases, where its temporal features yield a significant performance improvement over existing techniques.
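The per-block encoding step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the block shapes, the threshold-at-zero binarization rule, and the helper name `encode_block` are all assumptions, since the abstract does not specify them.

```python
import numpy as np

def encode_block(block):
    """Z-score normalize a temporal feature block, then binarize it.

    `block` is a (frames, features) array of eye or mouth features taken
    from consecutive frames. Thresholding the z-scores at zero to obtain
    binary pattern features is an assumption; the abstract does not state
    the binarization rule.
    """
    mu = block.mean(axis=0)
    sigma = block.std(axis=0) + 1e-8   # guard against zero variance
    z = (block - mu) / sigma           # Z-score normalization
    return (z > 0).astype(np.uint8)    # binary pattern features

# Hypothetical eye and mouth feature blocks over 5 consecutive frames,
# 8 features per frame (both numbers are illustrative only).
rng = np.random.default_rng(0)
eye_block = rng.normal(size=(5, 8))
mouth_block = rng.normal(size=(5, 8))

# Concatenate the encoded eye and mouth features into the
# enhanced temporal feature vector.
enhanced = np.concatenate(
    [encode_block(eye_block).ravel(), encode_block(mouth_block).ravel()]
)
```

The resulting binary vector is what a feature-selecting classifier such as multi-class AdaBoost would consume, one vector per temporal block.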
