The rapid advancement of wearable physiological measurement technology in recent years has brought affective computing closer to everyday life. Recognizing affective states in daily contexts holds significant potential for applications in human–computer interaction and psychiatry. To address the challenge of long-term, multi-modal physiological data collected in everyday settings, this study introduces a Transformer-based algorithm for affective state recognition, designed to fully exploit the temporal characteristics of the signals and the interrelationships between modalities. On the DAPPER dataset, which comprises continuous 5-day wrist-worn recordings of heart rate, skin conductance, and tri-axial acceleration from 88 subjects, our Transformer-based model achieved an average accuracy of 71.5% in the binary classification of self-reported positive versus negative affective states sampled at random moments during daily data collection, and accuracies of 60.29% and 61.55% in the five-class classification of valence and arousal scores, respectively. These results demonstrate the feasibility of affective state recognition from wearable multi-modal physiological signals in everyday contexts.