Abstract

Emotion recognition has recently attracted considerable interest from researchers in various fields, including the study of human-computer interaction (HCI). One interesting issue in HCI emotion studies is the use of physiological signals, such as the electrocardiograph (ECG), blood volume pulse (BVP), electroencephalograph (EEG), and other signals, to recognize emotions. Among physiological signals, EEG is known to be the most reliable modality for understanding emotion processing and perception. Therefore, this study examined emotion recognition from EEG signals by investigating emotion cues in time-domain features to differentiate two classes of emotions, namely happy and sad. We developed an EEG-based emotion dataset from 12 participants using 4 recording channels of an EEG cap, i.e., AF3, AF4, O1, and O2. The time-domain features of mean, standard deviation, and number of peaks were extracted from the alpha and beta frequency bands. For recognition, we trained a Naive Bayes classifier on the feature set. The results showed that the mean feature contributed most to the classification. Moreover, observation of the frequency bands showed that combining the alpha and beta bands tends to yield better recognition accuracy than using either band alone. The highest Naive Bayes classification result reached 87.5% accuracy with a 66% split testing option.
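The three time-domain features named in the abstract (mean, standard deviation, and number of peaks of a band-limited EEG segment) can be sketched as below. This is a minimal illustration, not the paper's implementation: the peak criterion (a strict local maximum) and the synthetic 10 Hz alpha-band test signal are assumptions for demonstration.

```python
# Minimal sketch of time-domain EEG feature extraction: mean, standard
# deviation, and peak count for one band-filtered segment. The peak
# definition (strict local maximum) is an assumption; the paper may use
# a different criterion.
import math

def extract_features(signal):
    """Return (mean, std, peak_count) for one EEG segment."""
    n = len(signal)
    mean = sum(signal) / n
    std = math.sqrt(sum((x - mean) ** 2 for x in signal) / n)
    # Count samples strictly greater than both immediate neighbours.
    peaks = sum(
        1 for i in range(1, n - 1)
        if signal[i - 1] < signal[i] > signal[i + 1]
    )
    return mean, std, peaks

# Illustrative use: one second of a synthetic 10 Hz (alpha-band) sine
# sampled at 128 Hz, standing in for a filtered EEG channel such as AF3.
segment = [math.sin(2 * math.pi * 10 * t / 128) for t in range(128)]
features = extract_features(segment)
```

In practice one feature vector of this form would be built per channel and per frequency band, then concatenated and passed to a Naive Bayes classifier as described in the abstract.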
