Abstract

Studying brain activity and deciphering the information in electroencephalogram (EEG) signals has become an emerging research field, and substantial advances have been made in EEG-based emotion classification. However, exploiting different EEG features and their complementarity to discriminate between emotions remains challenging. Most existing models extract a single type of feature from the EEG signal while ignoring crucial temporal dynamic information, which constrains the model's classification capability to a certain extent. To address this issue, we propose an Attention-Based Depthwise Parameterized Convolutional Gated Recurrent Unit (AB-DPCGRU) model and validate it with mixed experiments on the SEED and SEED-IV datasets. The experimental results show that the model's accuracy surpasses existing state-of-the-art methods, confirming the superiority of our approach over currently popular emotion recognition models.
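To make the described architecture concrete, the following is a minimal NumPy sketch of the three ingredients named in the abstract: a depthwise convolution (one kernel per EEG channel), a GRU cell over the resulting time steps, and attention pooling over the hidden states, followed by a softmax over the three SEED emotion classes. All names, shapes, and random parameters here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def depthwise_conv1d(x, kernels):
    """Depthwise 1-D convolution: one separate kernel per EEG channel.
    x: (channels, time), kernels: (channels, k) -> (channels, time-k+1)."""
    c, t = x.shape
    k = kernels.shape[1]
    out = np.empty((c, t - k + 1))
    for ch in range(c):
        # reversed kernel turns np.convolve into cross-correlation
        out[ch] = np.convolve(x[ch], kernels[ch][::-1], mode="valid")
    return out

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gru_step(h, x, W, U, b):
    """One GRU time step; W, U, b each stack update/reset/candidate params."""
    Wz, Wr, Wh = W
    Uz, Ur, Uh = U
    bz, br, bh = b
    z = sigmoid(Wz @ x + Uz @ h + bz)            # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)            # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)
    return (1 - z) * h + z * h_tilde

def attention_pool(H, v):
    """Score each hidden state, softmax the scores, return weighted sum."""
    scores = H @ v                               # (T,)
    a = np.exp(scores - scores.max())
    a /= a.sum()
    return a @ H                                 # (hidden,)

# Toy forward pass: 4 channels, 32 samples, kernel 5, hidden size 8.
C, T, K, Hd = 4, 32, 5, 8
eeg = rng.standard_normal((C, T))
feat = depthwise_conv1d(eeg, rng.standard_normal((C, K)))   # (4, 28)

W = [rng.standard_normal((Hd, C)) * 0.1 for _ in range(3)]
U = [rng.standard_normal((Hd, Hd)) * 0.1 for _ in range(3)]
b = [np.zeros(Hd) for _ in range(3)]

h = np.zeros(Hd)
states = []
for t in range(feat.shape[1]):
    h = gru_step(h, feat[:, t], W, U, b)
    states.append(h)

context = attention_pool(np.stack(states), rng.standard_normal(Hd))
logits = rng.standard_normal((3, Hd)) @ context  # SEED: 3 emotion classes
probs = np.exp(logits - logits.max())
probs /= probs.sum()
print(probs)
```

The depthwise stage keeps per-channel features separate (matching the "depthwise" in the model name), the GRU captures the temporal dynamics the abstract says single-feature models miss, and attention pooling lets the classifier weight informative time steps.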
