Abstract
Automatic emotion recognition is of great value in many applications; however, to fully realize that value, more portable, non-intrusive, and inexpensive technologies need to be developed. Human gait can reflect the walker's emotional state and can serve as an information source for emotion recognition. This paper proposes a novel method for recognizing emotional states from human gait using Microsoft Kinect, a low-cost, portable, camera-based sensor. Fifty-nine participants' gaits under a neutral state, induced anger, and induced happiness were recorded by two Kinect cameras, and the original data were processed through joint selection, coordinate-system transformation, sliding-window Gaussian filtering, differential operation, and data segmentation. Gait-pattern features were extracted from the 3-dimensional coordinates of 14 main body joints using the Fourier transform and Principal Component Analysis (PCA). The classifiers Naive Bayes, Random Forests, LibSVM, and SMO (Sequential Minimal Optimization) were trained and evaluated, and the accuracies for recognizing anger and happiness against the neutral state reached 80.5% and 75.4%, respectively. Although the results for distinguishing anger from happiness were not ideal in the current study, the work demonstrates the feasibility of automatically recognizing emotional states from gait, with characteristics that meet the application requirements.
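The processing pipeline described in the abstract — smoothing the joint coordinates, taking frame-to-frame differences, computing frequency features with a Fourier transform, and reducing them with PCA — can be sketched roughly as follows. The synthetic data, window size, Gaussian sigma, and number of components here are illustrative assumptions, not the study's actual parameters.

```python
import numpy as np

# Synthetic stand-in for Kinect output: T frames x 14 joints x (x, y, z).
rng = np.random.default_rng(0)
frames = rng.normal(size=(300, 14, 3))

# 1) Sliding-window Gaussian filter along the time axis (assumed sigma = 2).
t = np.arange(-4, 5)
kernel = np.exp(-t**2 / (2 * 2.0**2))
kernel /= kernel.sum()
smoothed = np.apply_along_axis(
    lambda s: np.convolve(s, kernel, mode="same"), 0, frames
)

# 2) Differential operation: frame-to-frame differences approximate velocity.
velocity = np.diff(smoothed, axis=0)

# 3) Fourier transform: magnitude spectrum of each joint-coordinate series.
spectrum = np.abs(np.fft.rfft(velocity, axis=0))      # (freq_bins, 14, 3)
features = spectrum.reshape(spectrum.shape[0], -1).T  # one row per series

# 4) PCA via SVD: project onto the first k principal components.
k = 5
centered = features - features.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
reduced = centered @ vt[:k].T
print(reduced.shape)  # 42 coordinate series (14 joints x 3 axes), k components
```

The reduced feature vectors would then be fed to a classifier such as SVM or Naive Bayes, as in the paper's evaluation.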
Highlights
Emotion is a mental experience with high intensity and high hedonic content (Cabanac, 2002), which deeply affects our daily behaviors by regulating the individual's motivation (Lang, Bradley & Cuthbert, 1998), social interaction (Lopes et al., 2005), and cognitive processes (Forgas, 1995).
We hypothesize that walkers' emotional states are reflected in their gait information, recorded by Kinect as the coordinates of the main body joints, and that these states can be recognized through machine learning methods.
A paired-samples t test showed that, for anger priming, anger ratings before priming were significantly lower than After priming I (API) (t(58) = 18.98, p < .001) and After priming II (APII) (t(58) = 14.52, p < .001); for happiness priming, happiness ratings before priming were significantly lower than API (t(58) = 10.31, p < .001) and APII (t(58) = 7.99, p < .001). These results indicate that both anger and happiness priming successfully elicited changes of emotional state on the corresponding dimension.
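The manipulation check above uses a paired-samples t test, which compares each participant's rating before and after priming. A minimal worked example with hypothetical ratings (the values below are made up, not the study's data; with 59 participants the degrees of freedom come out to 58, as reported):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical emotion ratings for 59 participants, before vs. after priming.
before = rng.integers(1, 5, size=59).astype(float)
after = before + rng.integers(1, 5, size=59)  # priming raises the rating

# Paired t statistic: mean difference over its standard error.
diff = after - before
n = diff.size
t_stat = diff.mean() / (diff.std(ddof=1) / np.sqrt(n))
print(f"t({n - 1}) = {t_stat:.2f}")
```

In practice `scipy.stats.ttest_rel(after, before)` computes the same statistic and also returns the p-value.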
Summary
Emotion is a mental experience with high intensity and high hedonic content (pleasure/displeasure) (Cabanac, 2002), which deeply affects our daily behaviors by regulating the individual's motivation (Lang, Bradley & Cuthbert, 1998), social interaction (Lopes et al., 2005), and cognitive processes (Forgas, 1995). Recognizing others' emotions and responding adaptively to them is a basis of effective social interaction (Salovey & Mayer, 1990), and since users tend to regard computers as social agents (Pantic & Rothkrantz, 2003), they expect their affective states to be sensed and taken into account while interacting with computers. Given the importance of emotional intelligence for successful interpersonal interaction, a computer's capability to automatically recognize and appropriately respond to the user's affective feedback has been recognized as crucial. The possible applications of such an emotion-sensitive system are numerous, including automatic customer services (Fragopanagos & Taylor, 2005), interactive games (Barakova & Lourens, 2010), and smart homes (Silva, Morikawa & Petra, 2012). Automated emotion recognition is a very challenging task, and the development of this technology would be of great value.