Abstract

Facial emotion recognition is an emerging field used in many present-day applications, including social robots, neuromarketing, and games. Non-verbal communication channels such as facial expressions, eye movement, and gestures are exploited in many human-computer interaction applications; among them, facial expression is the most widely used because it conveys a person's emotional states and feelings. Emotion recognition is not an easy task, because there is no clear landmark-based distinction between emotions on the face, and facial appearance exhibits considerable complexity and variability. Traditional machine learning algorithms model the face with a set of extracted features and therefore cannot achieve a high recognition accuracy, because these features are hand-engineered and depend on prior knowledge. In this work, a convolutional neural network (CNN) is developed to recognize facial emotion expressions and classify them into seven basic categories. Instead of relying on hand-engineered features, the CNN learns its features automatically. The novelty of the proposed method lies in the use of facial action units (AUs): these units are first recognized by the CNN and are then incorporated into the recognition of the seven basic emotion states. The proposed model is evaluated on the Cohn-Kanade database, where incorporating AUs yields the best accuracy of 97.01%, whereas other works in the literature that use a direct CNN achieve an accuracy of 95.75%.
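
The sketch below is only a rough illustration of the two-stage idea described in the abstract, not the authors' exact architecture: a hypothetical small CNN first predicts a vector of AU activations from a face crop, and a second classifier maps those activations to the seven basic emotions. The layer sizes, input resolution, and the assumed number of AUs (`NUM_AUS`) are illustrative assumptions.

```python
# Illustrative two-stage sketch (assumed setup, not the paper's exact model):
# stage 1 predicts facial action unit (AU) activations from a face image,
# stage 2 classifies the seven basic emotions from those AU activations.
import torch
import torch.nn as nn

NUM_AUS = 17        # assumed number of detected action units
NUM_EMOTIONS = 7    # seven basic emotion categories

class AUDetector(nn.Module):
    """Small CNN mapping a 48x48 grayscale face crop to AU activations."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 48 -> 24
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 24 -> 12
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2), # 12 -> 6
        )
        self.head = nn.Linear(128 * 6 * 6, NUM_AUS)

    def forward(self, x):
        x = self.features(x).flatten(1)
        return torch.sigmoid(self.head(x))  # each AU treated as independently active

class EmotionFromAUs(nn.Module):
    """Classifier mapping AU activations to the seven emotion classes."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NUM_AUS, 64), nn.ReLU(),
            nn.Linear(64, NUM_EMOTIONS),
        )

    def forward(self, au_activations):
        return self.net(au_activations)  # raw logits; apply softmax for probabilities

if __name__ == "__main__":
    au_model, emotion_model = AUDetector(), EmotionFromAUs()
    face_batch = torch.randn(4, 1, 48, 48)      # dummy batch of face crops
    aus = au_model(face_batch)                  # stage 1: AU activations
    emotion_logits = emotion_model(aus)         # stage 2: emotion scores
    print(aus.shape, emotion_logits.shape)      # (4, 17) (4, 7)
```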
