Abstract

A technical companion system should be able to detect its user's emotions and model the user's emotional state in order to react accordingly. We have developed a novel method to determine a user's most significant emotional change in the two emotion dimensions of pleasure and arousal, based on paired physiological data features compared across two events. In an experiment, participants first viewed blocked IAPS picture presentations and then took part in a mental-training Wizard-of-Oz scenario. Six meaningful features were extracted from four physiological channels of the IAPS picture-presentation data: two electromyography channels (corrugator supercilii and zygomaticus major), skin conductance, and peripheral blood volume. Three pairs of features were found to contain valuable information about emotional changes when comparing two situations with different emotional content. The method was then tested on a new blocked IAPS dataset and on the Wizard-of-Oz interaction dataset to verify its performance. In 75% of the subjects, the detected emotion matched one of the two induced emotions. This information could be used by future companion technologies to model the user's current emotional state in the two-dimensional emotion space of pleasure and arousal.
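The core idea of comparing paired features across two events can be illustrated with a minimal sketch. The feature names, the summary statistic (per-event means), and the sign conventions below are assumptions for illustration only; the paper's actual six features and three feature pairs are not reproduced here. The sketch follows the common psychophysiological reading that corrugator activity tends to accompany displeasure, zygomaticus activity pleasure, and skin conductance arousal:

```python
def sign(x):
    """Return -1, 0, or 1 for the sign of x."""
    return (x > 0) - (x < 0)

def emotion_change(event_a, event_b):
    """Infer the direction of emotional change from event A to event B.

    event_a, event_b: dicts of hypothetical per-event mean features with
    keys 'corrugator' (EMG), 'zygomaticus' (EMG), and 'scl'
    (skin conductance level). Returns (d_pleasure, d_arousal) as signs.
    """
    # Assumed mapping: pleasure rises with zygomaticus activity and
    # falls with corrugator activity; arousal rises with skin conductance.
    d_pleasure = ((event_b['zygomaticus'] - event_a['zygomaticus'])
                  - (event_b['corrugator'] - event_a['corrugator']))
    d_arousal = event_b['scl'] - event_a['scl']
    return sign(d_pleasure), sign(d_arousal)

# Two hypothetical events: B shows less corrugator activity, more
# zygomaticus activity, and higher skin conductance than A.
event_a = {'corrugator': 0.8, 'zygomaticus': 0.2, 'scl': 4.1}
event_b = {'corrugator': 0.3, 'zygomaticus': 0.6, 'scl': 5.0}
print(emotion_change(event_a, event_b))  # (1, 1): more pleasant, more aroused
```

A companion system could use such signed differences to move its estimate of the user's state within the two-dimensional pleasure-arousal space after each interaction event.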
