Abstract

A person's emotions can be observed in their facial expressions. When a person feels a certain emotion, the face usually changes: wrinkles form on the forehead, the eyes blink, or the color of the facial skin shifts. This paper describes how a Convolutional Neural Network (CNN) can be used to identify a person's emotions from facial expressions observed in face images. The CNN is a deep learning method that is widely used to analyze image data and often gives good results. Three datasets, Facial Expression Recognition 2013 (FER2013), the Cohn-Kanade dataset (CK+), and the Karolinska Directed Emotional Faces (KDEF), were used to build and test the CNN model. Each dataset contains grayscale images in JPEG format labeled with seven emotion classes (Anger, Disgust, Fear, Happy, Sad, Surprise, and Neutral). To detect these seven classes, the CNN architecture was built with four convolutional layers and two fully connected layers. In testing, the highest performance was achieved on the KDEF dataset, with an accuracy of 0.82, precision of 0.84, recall of 0.82, and F1-score of 0.81. The emotion classes easiest to recognize are Disgust and Happy, while the class most difficult to recognize is Sad.
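As a rough illustration of the architecture described above, the following PyTorch sketch stacks four convolutional layers followed by two fully connected layers that output seven emotion logits. Only the layer counts and the seven-class output come from the abstract; the kernel sizes, channel widths, pooling scheme, and the 48x48 grayscale input resolution (the native size of FER2013 images) are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class EmotionCNN(nn.Module):
    """CNN with four convolutional layers and two fully connected layers,
    matching the layer counts in the abstract. Filter sizes, channel
    counts, and pooling are assumptions made for this sketch."""

    def __init__(self, num_classes: int = 7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),    # conv 1 (grayscale input)
            nn.ReLU(),
            nn.MaxPool2d(2),                               # 48x48 -> 24x24
            nn.Conv2d(32, 64, kernel_size=3, padding=1),   # conv 2
            nn.ReLU(),
            nn.MaxPool2d(2),                               # 24x24 -> 12x12
            nn.Conv2d(64, 128, kernel_size=3, padding=1),  # conv 3
            nn.ReLU(),
            nn.MaxPool2d(2),                               # 12x12 -> 6x6
            nn.Conv2d(128, 256, kernel_size=3, padding=1), # conv 4
            nn.ReLU(),
            nn.MaxPool2d(2),                               # 6x6 -> 3x3
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(256 * 3 * 3, 256),   # fully connected 1
            nn.ReLU(),
            nn.Linear(256, num_classes),   # fully connected 2 -> 7 emotion logits
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Example: a batch of four 48x48 grayscale face crops.
logits = EmotionCNN()(torch.randn(4, 1, 48, 48))
print(logits.shape)  # torch.Size([4, 7])
```

In a sketch like this, the class predicted for an image is the argmax of the logits, and training would typically minimize cross-entropy loss over the seven labels.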
