Abstract

Man-machine interaction is an interdisciplinary field of research that provides natural and multimodal ways of interaction between humans and computers. For this purpose, the computer must understand the emotional state of the person with whom it interacts. This article proposes a novel method for detecting and classifying the basic emotions introduced in previous works: sadness, joy, anger, fear, disgust, surprise, and interest. As with most emotion recognition systems, the approach follows the usual steps of face detection and facial feature extraction. In these steps, the contribution lies in using strategic face points and interpreting their motions as action units defined by the Facial Action Coding System (FACS). The second contribution is at the classification step, where two classifiers, Kohonen self-organizing maps (KSOM) and a multilayer perceptron (MLP), were used in order to obtain the best results. The results show that the recognition rate of basic emotions improved, while running time was reduced through lower resource use.
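To illustrate the pipeline structure the abstract describes (face detection, feature extraction, then classification), the sketch below shows a minimal Python version using OpenCV's Haar cascade detector and scikit-learn's MLP. It is an assumption-laden stand-in, not the paper's method: the flattened-pixel features replace the strategic face points and FACS action units, the emotion label list is taken from the abstract, and the KSOM variant is omitted.

```python
# Minimal sketch of the generic pipeline: face detection -> feature
# extraction -> MLP classification. The feature function and label set
# are illustrative assumptions, not the paper's FACS-based features.
import cv2
import numpy as np
from sklearn.neural_network import MLPClassifier

EMOTIONS = ["sadness", "joy", "anger", "fear", "disgust", "surprise", "interest"]

# Face detection with OpenCV's bundled frontal-face Haar cascade.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def extract_features(gray_image):
    """Detect the largest face and return a fixed-length feature vector.

    Here the face crop is simply resized and flattened; in the paper,
    features come from strategic face points and FACS action units.
    """
    faces = face_detector.detectMultiScale(gray_image, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # keep largest detection
    face = cv2.resize(gray_image[y:y + h, x:x + w], (48, 48))
    return face.astype(np.float32).ravel() / 255.0

# Classification step: a multilayer perceptron, one of the two classifiers
# mentioned in the abstract (the KSOM classifier is not shown here).
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)

def train(images, labels):
    """Fit the MLP on grayscale face images with integer emotion labels."""
    feats = [extract_features(img) for img in images]
    X = np.array([f for f in feats if f is not None])
    y = np.array([l for f, l in zip(feats, labels) if f is not None])
    clf.fit(X, y)

def predict(image):
    """Return the predicted emotion name, or None if no face is found."""
    f = extract_features(image)
    return EMOTIONS[clf.predict([f])[0]] if f is not None else None
```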
