Abstract

Emotion recognition from gait has gained significant interest due to its applicability in fields such as healthcare, social interaction analysis, surveillance, and smart applications. Gait, as a biometric trait, offers unique advantages, allowing remote identification and robust recognition even in uncontrolled scenarios. Moreover, gait analysis can provide valuable insights into an individual’s emotional state. This work presents the “Walk-as-you-Feel” (WayF) framework, a novel approach for gait-based emotion recognition that does not rely on facial cues, thereby preserving user privacy. To address the challenges posed by small and unbalanced datasets, a balancing procedure suitable for deep learning architectures is also developed. Adapted Inception-v3 and EfficientNet models are employed for the feature extraction phase. Classification is performed using a Gated Recurrent Unit (GRU) network and a Transformer encoder. Experimental results demonstrate that the proposed approach is competitive with state-of-the-art works that also integrate facial cues. WayF reaches an average recognition rate of approximately 77% in its best configuration. Moreover, when excluding the neutral emotion, the proposed method achieves an overall accuracy of 83.3%.
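The abstract outlines a two-stage pipeline: a CNN (Inception-v3 or EfficientNet) extracts a feature vector per gait frame, and a recurrent classifier maps the resulting sequence to an emotion label. As an illustration only, the sketch below implements a minimal single-layer GRU over a sequence of frame features in NumPy; the dimensions (2048-d features, 4 emotion classes) and all parameter names are assumptions for the example, not details taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class MinimalGRU:
    """Single-layer GRU that consumes a sequence of per-frame feature
    vectors (e.g. CNN embeddings of gait frames) and maps the final
    hidden state to emotion logits. Illustrative sketch only."""

    def __init__(self, input_dim, hidden_dim, num_classes, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_dim)
        # Update gate (z), reset gate (r), and candidate-state weights,
        # each acting on the concatenated [input, hidden] vector.
        self.Wz = rng.uniform(-s, s, (hidden_dim, input_dim + hidden_dim))
        self.Wr = rng.uniform(-s, s, (hidden_dim, input_dim + hidden_dim))
        self.Wh = rng.uniform(-s, s, (hidden_dim, input_dim + hidden_dim))
        self.Wo = rng.uniform(-s, s, (num_classes, hidden_dim))
        self.hidden_dim = hidden_dim

    def forward(self, seq):
        h = np.zeros(self.hidden_dim)
        for x in seq:                      # one feature vector per frame
            xh = np.concatenate([x, h])
            z = sigmoid(self.Wz @ xh)      # update gate
            r = sigmoid(self.Wr @ xh)      # reset gate
            hc = np.tanh(self.Wh @ np.concatenate([x, r * h]))
            h = (1 - z) * h + z * hc       # interpolate old/new state
        return self.Wo @ h                 # emotion logits

# Toy run: 30 frames of 2048-d features (Inception-v3-like), 4 emotions.
feats = np.random.default_rng(1).standard_normal((30, 2048))
logits = MinimalGRU(input_dim=2048, hidden_dim=64, num_classes=4).forward(feats)
print(logits.shape)  # (4,)
```

In practice the classifier would be trained end-to-end in a deep learning framework; this sketch only shows how a GRU aggregates per-frame CNN features into a single sequence-level prediction.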

