Abstract

The remarkable development of human–computer interaction has created an urgent need for machines to be able to recognise human emotions. Human motions play a key role in emphasising and conveying emotions in everyday application scenarios such as medical rehabilitation and social education. Therefore, this paper aims to uncover hidden emotional states from human motions. Accordingly, we proposed a novel approach for emotion recognition using multiple inertial measurement unit (IMU) sensors worn on different body parts. First, the mapping relationship between emotion and human motion was established through fuzzy comprehensive evaluation, and data were collected for six emotional states: sleepy, bored, excited, tense, angry, and distressed. Second, the preprocessed data were used as input to a lightweight convolutional neural network to extract discriminative features. Third, an attention-based sensor fusion module was developed to obtain an importance score for each IMU sensor and generate a fused feature representation. In the recognition phase, we constructed a weighted kernel support vector machine (SVM) model with an auxiliary fuzzy function to improve the weight calculation method for the kernel functions in a multiple kernel SVM. Finally, the results were compared with those of similar state-of-the-art studies; the proposed method achieved higher accuracy (99.02%) for the six emotional states mentioned above. These findings may promote the development of social robots with non-verbal emotional communication capabilities.
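The attention-based sensor fusion step described above can be sketched as follows. The abstract does not specify the paper's exact scoring network, so this is a minimal illustration assuming a generic additive-attention form over per-sensor feature vectors; the function name `attention_fuse` and the parameters `w`, `b`, and `v` are hypothetical.

```python
import numpy as np


def attention_fuse(sensor_feats, w, b, v):
    """Fuse per-sensor feature vectors via softmax attention scores.

    sensor_feats: (n_sensors, d) matrix, one feature row per IMU sensor.
    w, b, v: parameters of a small scoring network (illustrative only).
    Returns the fused (d,) feature vector and the per-sensor scores.
    """
    # Score each sensor's feature vector with a one-hidden-layer network.
    scores = np.tanh(sensor_feats @ w + b) @ v        # shape (n_sensors,)
    # Softmax turns raw scores into importance weights summing to 1.
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()
    # Fused representation: importance-weighted sum of sensor features.
    fused = alpha @ sensor_feats                      # shape (d,)
    return fused, alpha


# Toy usage with random features standing in for CNN outputs.
rng = np.random.default_rng(0)
n_sensors, d, h = 5, 16, 8                            # 5 IMUs, 16-dim features
feats = rng.standard_normal((n_sensors, d))
w = rng.standard_normal((d, h))
b = np.zeros(h)
v = rng.standard_normal(h)
fused, alpha = attention_fuse(feats, w, b, v)
```

The importance weights `alpha` can be inspected directly to see which worn sensor contributes most to the fused representation for a given motion sample.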
