Abstract

Emotion plays a predominant role in how we respond to external situations and events in daily life, and different emotions engage different brain connectivity patterns during the associated information processing. Electroencephalography (EEG)-based emotion recognition is a challenging topic in the field of affective computing: EEG recordings are mixtures of signals at the scalp and do not directly reveal the active cortical sources underlying different emotional states. In this paper, we propose a method for emotion discrimination based on source-space connectivity. Connectivity patterns in different frequency bands are extracted as features from EEG sources reconstructed with sLORETA. To identify the brain regions most related to emotions, we derive data-driven, spatially compact regions of interest (ROIs) from the reconstructed neural activity. In addition, an iterative dynamic approach is used to estimate the ROI time series while accounting for the intrinsic non-stationarity of neural activity; the method explicitly incorporates a dynamic constraint that models the evolution of neural activity for more advanced connectivity analysis. Throughout this study, we consider three connectivity measures widely applied to emotion recordings: imaginary coherence (iCoh), phase-locking value (PLV), and weighted phase-lag index (WPLI). The connectivity patterns are then used as features to discriminate emotional states by training a support vector machine (SVM) classifier. The performance of the proposed method is assessed on a real, high-resolution emotional EEG database. This study reveals that the proposed method can identify meaningful connectivity features between the main emotion-related brain regions, leading to higher interpretability and accuracy. Our results demonstrate that ROI-wise iCoh features improve the average accuracy to 83.84%, compared with 71.70% for raw EEG features.
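To make the feature-extraction and classification pipeline described above concrete, the following is a minimal illustrative sketch (not the authors' implementation): band-limited phase-based connectivity measures (PLV and WPLI; iCoh can be computed analogously from the cross-spectra) are computed on ROI time series that are assumed to have already been reconstructed (e.g., with sLORETA), flattened into a feature vector, and passed to an SVM. All array shapes, the sampling rate, the frequency band, and the placeholder data are assumptions for illustration only.

```python
# Illustrative sketch only: phase-based connectivity features + SVM.
# Assumes ROI time series were already reconstructed (e.g., via sLORETA).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def band_filter(x, fs, lo, hi, order=4):
    """Zero-phase band-pass filter along the last (time) axis."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x, axis=-1)

def connectivity_features(roi_ts, fs, band=(8.0, 13.0)):
    """PLV and WPLI for all ROI pairs of a single trial.

    roi_ts : array of shape (n_rois, n_samples) with ROI time series.
    Returns a 1-D feature vector (upper-triangular PLV values followed
    by upper-triangular WPLI values).
    """
    analytic = hilbert(band_filter(roi_ts, fs, *band), axis=-1)
    i, j = np.triu_indices(analytic.shape[0], k=1)
    cross = analytic[i] * np.conj(analytic[j])            # pairwise cross-spectra
    plv = np.abs(np.mean(np.exp(1j * np.angle(cross)), axis=-1))
    im = np.imag(cross)
    wpli = np.abs(np.mean(im, axis=-1)) / (np.mean(np.abs(im), axis=-1) + 1e-12)
    return np.concatenate([plv, wpli])

# Hypothetical usage with placeholder data and labels (not real recordings).
fs, n_trials, n_rois, n_samples = 256, 40, 10, 4 * 256
rng = np.random.default_rng(0)
trials = rng.standard_normal((n_trials, n_rois, n_samples))   # fake ROI time series
labels = rng.integers(0, 2, size=n_trials)                    # fake emotion labels
X = np.array([connectivity_features(t, fs) for t in trials])
print(cross_val_score(SVC(kernel="rbf", C=1.0), X, labels, cv=5).mean())
```

The same feature matrix could be built per frequency band and concatenated before classification; the band edges, kernel, and regularization constant above are arbitrary choices, not those reported in the paper.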
