Abstract

Emotions play a significant role in human-computer interaction and in entertainment consumption, an activity common among young adults. The main challenge is the lack of a publicly available dataset of physiological signals with emotion labels for young adults. This article presents a multi-modal dataset of Electrocardiogram (ECG) and Galvanic Skin Response (GSR) signals for emotion classification in young adults. Signal acquisition was performed with Shimmer3 ECG and Shimmer3 GSR wearable units worn on the chest and palm of the participants. ECG signals were acquired from 25 participants and GSR signals from 12 participants while they watched 21 emotional stimulus videos divided into three sessions. The data were self-annotated for seven emotions: happy, sad, fear, surprise, anger, disgust, and neutral. Each emotional state was further self-annotated with one of five intensity levels of felt emotion: very low, low, moderate, high, and very high. Participants also reported valence, arousal, and dominance scores for each stimulus through a Google Form. Baseline results for classifying four classes, high valence high arousal (HVHA), high valence low arousal (HVLA), low valence high arousal (LVHA), and low valence low arousal (LVLA), on the ECG data are reported with an accuracy of 69.66%. Our baseline method for the proposed dataset achieved 66.64% accuracy for the eight-class classification of categorical emotions. The significance of the dataset lies in its larger number of emotion classes and its less intrusive sensors, which mimic real-world applications. The Young Adults' Affective Data (YAAD) is made publicly available and is valuable for researchers developing behavioral assessments based on physiological signals.
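As a minimal illustration of the four-class labeling scheme mentioned above, the sketch below maps self-reported valence and arousal ratings to the HVHA/HVLA/LVHA/LVLA quadrants. The rating scale and the midpoint threshold of 5 are assumptions for illustration only and are not taken from the paper.

```python
def quadrant_label(valence: float, arousal: float, midpoint: float = 5.0) -> str:
    """Map a (valence, arousal) self-report pair to one of the four quadrant labels.

    The midpoint threshold is an assumed value; the actual split used in the
    baseline experiments is defined by the dataset authors.
    """
    high_v = valence >= midpoint
    high_a = arousal >= midpoint
    if high_v and high_a:
        return "HVHA"   # high valence, high arousal
    if high_v:
        return "HVLA"   # high valence, low arousal
    if high_a:
        return "LVHA"   # low valence, high arousal
    return "LVLA"       # low valence, low arousal


# Example usage: a rating of valence=7, arousal=3 falls in the HVLA quadrant.
print(quadrant_label(7, 3))  # -> HVLA
```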
