Abstract

Automatic emotion detection is a key task in human–machine interaction, where it makes systems more natural to interact with. In this paper, we propose an emotion detection method based on a deep learning algorithm. The proposed algorithm uses an end-to-end convolutional neural network (CNN). To increase the computational efficiency of the deep network, we use the trained weight parameters of MobileNet to initialize the weight parameters of our system. To make the system independent of the input image size, we place a global average pooling layer on top of its last convolution layer. The proposed system is validated for emotion detection on two benchmark datasets, viz. Cohn–Kanade+ (CK+) and the Japanese Female Facial Expression (JAFFE) dataset. The experimental results show that the proposed method outperforms existing methods for emotion detection.
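The abstract's claim of input-size independence rests on global average pooling: averaging each feature map over its spatial dimensions yields one value per channel, so the classifier's input length depends only on the channel count, not on the image size. A minimal NumPy sketch of this (illustrative only; the channel count 1024 is an assumption matching MobileNet's last convolutional layer):

```python
import numpy as np

def global_average_pool(feature_maps):
    # feature_maps: (height, width, channels) output of the last conv layer.
    # Averaging over the two spatial axes yields one value per channel,
    # so the resulting vector length is the same for any input image size.
    return feature_maps.mean(axis=(0, 1))

# Feature maps produced from two different hypothetical input image sizes:
small = np.random.rand(4, 4, 1024)   # e.g. from a smaller input image
large = np.random.rand(7, 7, 1024)   # e.g. from a larger input image
assert global_average_pool(small).shape == (1024,)
assert global_average_pool(large).shape == (1024,)
```

Because both pooled vectors have the same length, a single fully connected classification layer can sit on top regardless of the input resolution.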

Highlights

  • Through outward channels such as speech, gestures, and facial expressions, individuals communicate their true intent and emotions

  • We propose an end-to-end convolutional neural network, named ENet, for emotion detection from images



Introduction

Through outward channels such as speech, gestures, and facial expressions, individuals communicate their true intent and emotions. Automated Facial Expression Recognition (FER) is a non-intrusive approach to analyzing human affective behavior. FER systems play a key role in human–computer interaction, monitoring, deception or lie detection, behavioral profiling, and healthcare applications. Studies have observed that the characterization of emotions is approximately the same across the globe, categorizing human emotions into anger, sadness, fear, happiness, disgust, and surprise. A traditional emotion detection approach includes (1) image acquisition, (2) image pre-processing, (3) feature extraction, and (4) classification (emotion detection). The accuracy of such a traditional system depends on the robustness of the feature extraction and classification stages.
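The four-stage traditional pipeline above can be sketched as a chain of functions. This is an illustrative stub, not the paper's method: every function name and body here is a hypothetical placeholder (a real system would use a camera or dataset loader, a face detector, handcrafted features such as LBP or HOG, and a trained classifier).

```python
import numpy as np

EMOTIONS = ["anger", "sadness", "fear", "happiness", "disgust", "surprise"]

def acquire_image():
    # Stage 1: image acquisition (stubbed with a random grayscale image).
    return np.random.rand(48, 48)

def preprocess(image):
    # Stage 2: pre-processing, here simple mean/variance normalization.
    return (image - image.mean()) / (image.std() + 1e-8)

def extract_features(image):
    # Stage 3: feature extraction (stubbed as raw pixel flattening).
    return image.flatten()

def classify(features):
    # Stage 4: classification (stubbed with random scores per emotion).
    scores = np.random.rand(len(EMOTIONS))
    return EMOTIONS[int(np.argmax(scores))]

label = classify(extract_features(preprocess(acquire_image())))
assert label in EMOTIONS
```

The point of the paper's end-to-end CNN is precisely to replace the handcrafted stages 3 and 4 with learned layers, so the overall accuracy no longer hinges on manually designed features.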


