Abstract

Background: Research in psychology has shown that the way a person walks reflects that person’s current mood (or emotional state). Recent studies have used mobile phones to detect emotional states from movement data.

Objective: The objective of our study was to investigate the use of movement sensor data from a smart watch to infer an individual’s emotional state. We present the findings of a user study with 50 participants.

Methods: The experiment used a mixed design: within-subjects (emotions: happy, sad, and neutral) and between-subjects (stimulus type: audiovisual “movie clips” and audio “music clips”). Each participant experienced both emotions in a single stimulus type. All participants walked 250 m while wearing a smart watch on one wrist and a heart rate monitor strap on the chest. They also answered a short questionnaire (20 items; Positive Affect and Negative Affect Schedule, PANAS) before and after experiencing each emotion. The data obtained from the heart rate monitor served as supplementary information. We performed time series analysis on data from the smart watch and a t test on questionnaire items to measure the change in emotional state. Heart rate data were analyzed using one-way analysis of variance. We extracted features from the time series using sliding windows and used these features to train and validate classifiers that determined an individual’s emotion.

Results: Overall, 50 young adults participated in our study; of them, 49 were included in the analysis of the affective PANAS questionnaire and 44 in the feature extraction and building of personal models. Participants reported feeling less negative affect after watching sad videos or after listening to sad music (P<.006). For the task of emotion recognition using classifiers, personal models outperformed personal baselines and achieved median accuracies higher than 78% for binary classification of happiness versus sadness in all conditions of the study design.

Conclusions: Our findings show that we are able to detect changes in emotional state as well as in behavioral responses with data obtained from the smart watch. Together with the high accuracies achieved across all users for classification of happy versus sad emotional states, this is further evidence for the hypothesis that movement sensor data can be used for emotion recognition.
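The sliding-window feature extraction described in the Methods can be sketched as follows. The paper does not specify the sampling rate, window length, overlap, or feature set, so the values below (100-sample windows with 50% overlap; mean, standard deviation, minimum, and maximum per window) are illustrative assumptions only:

```python
import numpy as np

def sliding_windows(signal, window_size, step):
    """Split a 1-D time series into overlapping fixed-length windows."""
    starts = range(0, len(signal) - window_size + 1, step)
    return np.array([signal[s:s + window_size] for s in starts])

def extract_features(windows):
    """Per-window summary statistics commonly used for accelerometer data."""
    return np.column_stack([
        windows.mean(axis=1),
        windows.std(axis=1),
        windows.min(axis=1),
        windows.max(axis=1),
    ])

# Synthetic accelerometer magnitude trace standing in for real smart watch data.
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 20 * np.pi, 1000)) + 0.1 * rng.standard_normal(1000)

windows = sliding_windows(signal, window_size=100, step=50)
features = extract_features(windows)
print(windows.shape, features.shape)  # (19, 100) (19, 4)
```

Each row of `features` would then be one training example for a per-participant ("personal") classifier, labeled with the emotion condition during which the window was recorded.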

Highlights

  • Our emotional state is often expressed in a variety of means, such as face, voice, body posture, and walking gait [1,2]

  • For the task of emotion recognition using classifiers, our results showed that personal models outperformed personal baselines and achieved median accuracies higher than 78% for binary classification of happiness versus sadness in all conditions of the study design

  • Our findings show that we are able to detect changes in emotional state as well as in behavioral responses with data obtained from the smart watch

Introduction

Our emotional state is often expressed in a variety of means, such as face, voice, body posture, and walking gait [1,2]. Video and physiological data have been analyzed to determine the emotional state of a person [3,4], but these analyses usually rely on recordings obtained in laboratory environments with limited ecological validity. Mobile phones include sensors, such as accelerometers, that have the potential to be sensitive to changes in people’s affective states and could provide rich and accessible information in this respect; for example, we know that the way we walk reflects whether we feel happy or sad [2]. Research in psychology has shown that the way a person walks reflects that person’s current mood (or emotional state), and recent studies have used mobile phones to detect emotional states from movement data. This paper analyzes movement sensor data recorded via a smart watch in relation to changes in emotions.

