Abstract

Information about a user's emotional state is an important aspect of affective interaction with embodied conversational agents. Most research aims to identify emotions through speech or facial expressions. However, facial expressions and speech are not continuously available, and in some cases bio-signal data are also required to fully assess a user's emotional state. We aimed to recognize the six basic emotions proposed by Ekman using a widely available, low-cost brain-computer interface (BCI) and a biofeedback sensor that measures heart rate. We exposed participants to sets of 10 IAPS images that had been partially validated through a subjective rating protocol. Results showed that the collected signals allowed us to identify the user's emotional state. In addition, a partial correlation between objective and subjective data was observed.
