Abstract

This study introduces an automatic emotion recognition (AER) system based on skeletal kinematic data for enhanced human–computer interaction. Departing from conventional approaches, it targets real-time emotion recognition in real-life situations. The dataset covers seven emotions and is evaluated with eight machine learning and deep learning algorithms. A thorough investigation varies window sizes and data states, including raw joint positions and feature-extracted data. The findings indicate that combining joint-related feature extraction with robust classifier models yields promising results, and augmenting the dataset with varying window sizes provides additional insight into real-world scenarios. Classification accuracy exceeds 99% for small windows, 94% for medium windows, and 88% for large windows, confirming the robustness of the approach. We further highlight the impact of window size on emotion detection and the benefits of combining coordinate axes for efficiency and accuracy. The analysis examines feature contributions at both the joint and axis levels, supporting well-informed feature selection. The study's contributions include carefully curated datasets, transparent code, and trained models, ensuring reproducibility. The paper establishes a benchmark that bridges theory and practice and shows that the proposed approach balances accuracy and efficiency. By advancing AER through kinematic data, it supports seamless human–computer interaction grounded in rigorous analysis and careful design.
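The abstract itself contains no code; the sketch below (Python with NumPy and scikit-learn) only illustrates the kind of pipeline it describes: segmenting skeletal joint-coordinate sequences into fixed-size windows, extracting simple per-joint, per-axis features, and training a classifier. The array shapes, feature choices, window sizes, and the RandomForestClassifier are illustrative assumptions, not the authors' actual method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Assumption: each recording is a (frames, joints, 3) array of x/y/z joint
# positions, with one label (0..6 for seven emotions) per recording.
rng = np.random.default_rng(0)
recordings = [rng.normal(size=(300, 25, 3)) for _ in range(40)]  # synthetic stand-in data
labels = rng.integers(0, 7, size=len(recordings))

def sliding_windows(seq, window, step):
    """Split a (frames, joints, 3) sequence into fixed-size windows."""
    return [seq[s:s + window] for s in range(0, len(seq) - window + 1, step)]

def window_features(win):
    """Simple per-joint, per-axis statistics: mean/std of position and velocity."""
    vel = np.diff(win, axis=0)                   # frame-to-frame joint velocity
    stats = [win.mean(axis=0), win.std(axis=0),  # position statistics
             vel.mean(axis=0), vel.std(axis=0)]  # velocity statistics
    return np.concatenate([s.ravel() for s in stats])

# Build a window-level dataset; window size is a key experimental variable in the study.
WINDOW, STEP = 60, 30
X, y = [], []
for seq, label in zip(recordings, labels):
    for win in sliding_windows(seq, WINDOW, STEP):
        X.append(window_features(win))
        y.append(label)
X, y = np.array(X), np.array(y)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("window-level accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

Varying WINDOW (and the set of coordinate axes fed into window_features) mirrors the study's comparison of small, medium, and large windows and of axis combinations.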
