Abstract

Affect recognition is an interdisciplinary research field bringing together researchers from the natural and social sciences. Affect recognition research aims to detect the affective state of a person based on observables, with the goal of, for example, explaining the person’s decision making or supporting mental wellbeing (e.g., stress monitoring). Recently, besides approaches based on audio, visual or textual information, solutions relying on wearable sensors as observables, recording mainly physiological and inertial parameters, have received increasing attention. Thanks to their rich functionality and form factor, wearable systems provide an ideal platform for long-term affect recognition applications, and their integrated sensors yield valuable insights during everyday life. However, existing literature surveys lack a comprehensive overview of state-of-the-art research in wearable-based affect recognition. Therefore, the aim of this paper is to provide a broad overview and in-depth understanding of the theoretical background, methods and best practices of wearable affect and stress recognition. Following a summary of different psychological models, we detail the influence of affective states on human physiology and the sensors commonly employed to measure physiological changes. Then, we outline lab protocols eliciting affective states and provide guidelines for ground-truth generation in field studies. We also describe the standard data processing chain and review common approaches to the preprocessing, feature extraction and classification steps. By providing a comprehensive summary of the state of the art and guidelines on various aspects, we aim to enable other researchers in the field to conduct and evaluate user studies and to develop wearable systems.
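
The minimal sketch below (not taken from the paper) illustrates such a processing chain on a single wearable channel: segmenting the signal into sliding windows, extracting simple statistical features, and evaluating a classifier. The sampling rate, window length, synthetic electrodermal-activity signal and random labels are assumptions made purely for demonstration.

```python
"""Illustrative sketch of a wearable-signal processing chain:
segmentation -> feature extraction -> classification.
All signal names, window sizes and labels are assumptions for
demonstration only; they do not reproduce the survey's methods."""
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score


def sliding_windows(signal, fs, win_s=60.0, step_s=30.0):
    """Segment a 1-D signal sampled at fs Hz into overlapping windows."""
    win, step = int(win_s * fs), int(step_s * fs)
    starts = range(0, len(signal) - win + 1, step)
    return np.stack([signal[i:i + win] for i in starts])


def statistical_features(windows):
    """Per-window mean, std, min and max -- a common baseline feature set."""
    return np.column_stack([windows.mean(axis=1), windows.std(axis=1),
                            windows.min(axis=1), windows.max(axis=1)])


if __name__ == "__main__":
    fs = 4                                   # assumed EDA sampling rate (Hz)
    eda = np.random.rand(fs * 3600)          # one hour of synthetic EDA data
    X = statistical_features(sliding_windows(eda, fs))
    y = np.random.randint(0, 2, size=len(X))  # synthetic stress/no-stress labels

    # 5-fold cross-validation of a standard classifier on the window features
    scores = cross_val_score(RandomForestClassifier(n_estimators=100), X, y, cv=5)
    print(f"Mean accuracy on synthetic data: {scores.mean():.2f}")
```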

Highlights

  • Affect recognition aspires to detect the affective state of a person based on observables

  • We focus on approaches utilising wearable sensors

  • We present datasets which meet the following criteria: (a) being publicly available; (b) including data recorded from study participants subjected either to emotional stimuli or a stressor; and (c) including at least a few sensor modalities that can be integrated into consumer-grade wearables applicable in everyday life


Summary

Introduction

Affect recognition aspires to detect the affective state (e.g., emotion or stress) of a person based on observables. We focus on approaches utilising wearable sensors (recording mainly physiological and inertial parameters). A clear goal of affect recognition systems is to be applicable in everyday life. Such wearable-based affect recognition could, for instance, provide users with data-driven insights into their affective spectrum by linking certain states (e.g., stress) to locations (e.g., the office). Due to their computational power and integrated sensors, wearable devices are ideal platforms for many applications, e.g., counting steps or estimating burned calories, and recently a first generation of affect (e.g., stress) recognition systems has entered this sector [15].

Working Definitions of Affective Phenomena
Emotion Models
Stress Models
Affective States and Their Physiological Indicators
Frequently Employed Sensors
Cardiac Activity
Electrodermal Activity
Electromyogram
Respiration
Skin Temperature
Electroencephalogram and Electrooculography
Inertial Sensors
Context
Affect-Related User Studies
Affect-Related User Studies in Laboratory Settings
Affect-Related User Studies in The Field
Guidelines for Ecological Momentary Assessment
Publicly Available Datasets
Data Processing and Classification
Preprocessing and Segmentation
Physiological Feature Extraction
ACC-Based Features
ECG- and PPG-Based Features
EDA-Based Features
EMG-Based Features
Respiration-Based Features
Temperature-Based Features
Classification
Findings
Discussion and Outlook
