Abstract

The main characteristic of depression is emotional dysfunction, manifested by increased levels of negative emotions and decreased levels of positive emotions. Accurate emotion recognition is therefore an effective way to assess depression. Among the various signals used for emotion recognition, the electroencephalogram (EEG) has attracted widespread attention because of advantages such as the rich spatiotemporal information carried by multi-channel recordings. In this work, we first preprocess the data with filtering and Euclidean alignment. For feature extraction, we use the short-time Fourier transform and the Hilbert–Huang transform to extract time-frequency features and convolutional neural networks to extract spatial features; before the convolution operation, the EEG features are arranged into 3D tensors according to the unique topology of the EEG channels. Finally, a bi-directional long short-term memory network models the temporal relationships. The method achieves good results on two emotion databases: SEED and the Emotional BCI dataset of the 2020 WORLD ROBOT COMPETITION. Applied to EEG-based depression recognition, it reaches a recognition rate of more than 70% under five-fold cross-validation. In addition, under the subject-independent protocol on the SEED data, it achieves a state-of-the-art recognition rate that exceeds existing methods. We propose a novel EEG emotion recognition framework for depression detection, which provides a robust algorithm for real-time, EEG-based clinical depression detection.
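
As a rough illustration of the preprocessing and time-frequency steps described above, the sketch below applies Euclidean alignment to a set of EEG trials and extracts short-time Fourier and Hilbert–Huang features per channel. It is a minimal sketch rather than the authors' implementation: the sampling rate, window length, number of IMFs, and the use of scipy and PyEMD are assumptions.

    import numpy as np
    from scipy.linalg import fractional_matrix_power
    from scipy.signal import stft, hilbert
    from PyEMD import EMD  # assumed dependency for the Hilbert-Huang step

    def euclidean_alignment(trials):
        """Whiten trials of shape (n_trials, n_channels, n_samples) so that the
        mean spatial covariance across trials becomes the identity matrix."""
        ref = np.mean([x @ x.T / x.shape[1] for x in trials], axis=0)
        ref_inv_sqrt = fractional_matrix_power(ref, -0.5)
        return np.stack([ref_inv_sqrt @ x for x in trials])

    def stft_features(trial, fs=200, nperseg=128):
        """Log power spectrogram per channel: (n_channels, n_freqs, n_frames)."""
        _, _, Z = stft(trial, fs=fs, nperseg=nperseg, axis=-1)
        return np.log(np.abs(Z) ** 2 + 1e-12)

    def hht_features(channel_signal, fs=200, n_imfs=4):
        """Instantaneous amplitude and frequency of the first IMFs (Hilbert-Huang)."""
        imfs = EMD()(channel_signal)[:n_imfs]
        analytic = hilbert(imfs, axis=-1)
        amplitude = np.abs(analytic)
        phase = np.unwrap(np.angle(analytic), axis=-1)
        inst_freq = np.diff(phase, axis=-1) * fs / (2.0 * np.pi)
        return amplitude, inst_freq

In the pipeline described above, features of this kind would then be mapped onto a 2D grid reflecting the electrode layout to form the 3D tensors fed to the convolutional network.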

Highlights

  • Emotion recognition is a major research direction of affective computing and has been widely used to detect depression [1, 2]

  • Compared with support vector machine (SVM) [56], kernel principal component analysis (KPCA) [57], transfer component analysis (TCA) [58], transductive parameter transfer (TPT) [59], domain adversarial neural network (DANN) [60], dynamical graph convolutional neural network (DGCNN) [39], bi-hemispheres domain adversarial neural network (BiDANN) [61], BiDANN-S [41], R2G-STNN [62], and instance-adaptive graph (IAG) [63], our method achieves the highest accuracy and the smallest standard deviation

  • This study designed a complete pipeline, from preprocessing to classification, for EEG-based emotion recognition, which achieved an accuracy of more than 80% (see the sketch after this list)
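
The pipeline named in the last highlight is described only at a high level in the abstract (a CNN over topology-preserving 3D tensors followed by a bi-directional LSTM). The sketch below shows one plausible arrangement of these ingredients; the 9x9 channel grid, the layer sizes, and the use of PyTorch are assumptions, not the authors' published architecture.

    import torch
    import torch.nn as nn

    class CNNBiLSTM(nn.Module):
        """Frames of a (bands, 9, 9) channel-grid tensor -> CNN -> BiLSTM -> logits."""
        def __init__(self, in_bands=5, n_classes=3, hidden=64):
            super().__init__()
            self.cnn = nn.Sequential(
                nn.Conv2d(in_bands, 32, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),   # -> (batch * frames, 64)
            )
            self.lstm = nn.LSTM(64, hidden, batch_first=True, bidirectional=True)
            self.head = nn.Linear(2 * hidden, n_classes)

        def forward(self, x):                  # x: (batch, frames, bands, 9, 9)
            b, t = x.shape[:2]
            feats = self.cnn(x.flatten(0, 1))  # fold time into the batch for the CNN
            out, _ = self.lstm(feats.view(b, t, -1))
            return self.head(out[:, -1])       # classify from the last time step

    logits = CNNBiLSTM()(torch.randn(8, 10, 5, 9, 9))  # e.g. 8 trials of 10 frames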

Introduction

The recognition of emotion is a major research direction of affective computing and has been widely used to detect depression [1, 2]. EEG signals contain a large amount of emotion-related information, offer high temporal resolution, and are difficult to disguise [4–6]. Activation of the amygdala appears to be more related to negative emotions, and relative activation of the right frontal lobe is correlated with negative emotions such as fear or disgust [10]. The power of the alpha band and the asymmetry between the cerebral hemispheres relate to emotions [15–17], changes in the gamma band are associated with happiness and sadness, and the reduction of alpha waves on different sides of the temporal lobe correlates with joy and sorrow (sadness on the left, happiness on the right) [18, 19].
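
To make the band-power and hemispheric-asymmetry measures mentioned above concrete, the sketch below computes alpha-band (8-13 Hz) power with Welch's method and a simple log-ratio asymmetry index for a pair of homologous channels. The channel indices, sampling rate, and the use of scipy are illustrative assumptions, not details from the cited studies.

    import numpy as np
    from scipy.signal import welch

    def band_power(signal, fs=200, band=(8.0, 13.0)):
        """Mean power of a 1-D signal within `band` (Hz), via Welch's method."""
        freqs, psd = welch(signal, fs=fs, nperseg=int(min(len(signal), 2 * fs)))
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].mean()

    def alpha_asymmetry(eeg, fs, left_idx, right_idx):
        """log(right) - log(left) alpha power for a homologous channel pair."""
        p_left = band_power(eeg[left_idx], fs)
        p_right = band_power(eeg[right_idx], fs)
        return np.log(p_right + 1e-12) - np.log(p_left + 1e-12)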
