Abstract

Convolutional Neural Networks (CNNs) have recently enabled considerable advances in biomedical signal processing, and these methods can support emotion recognition for affective brain-computer interfaces. In this paper, a novel emotion recognition system based on effective connectivity and fine-tuned CNNs applied to multichannel electroencephalogram (EEG) signals is presented. After preprocessing the EEG signals, effective brain connectivity, which represents the information flow between brain regions, is computed among the 32 EEG channels using the direct Directed Transfer Function (dDTF) method, yielding a 32×32 image. For each subject, the constructed images are fed as input to four pre-trained CNN models, AlexNet, ResNet-50, Inception-v3, and VGG-19, and the parameters of each model are fine-tuned independently. The proposed deep learning architectures automatically learn patterns in the connectivity images across frequency bands. The proposed approach is evaluated on the MAHNOB-HCI and DEAP databases. Experiments on classifying five emotional states show that ResNet-50 applied to dDTF images in the alpha band achieves the best results, owing to an architecture that efficiently captures the brain connectivity patterns. The accuracy and F1-score obtained are 99.41 and 99.42 for MAHNOB-HCI, and 98.17 and 98.23 for DEAP, respectively. The newly proposed model can effectively analyze brain function from the information flow in multichannel EEG signals by combining the dDTF effective connectivity measure with ResNet-50.
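
The abstract describes fine-tuning pre-trained CNNs on 32×32 dDTF connectivity images for five-class emotion recognition. The following is a minimal, hypothetical sketch of that fine-tuning step using PyTorch and an ImageNet-pretrained ResNet-50; the framework, the resizing and channel-replication of the connectivity image, the optimizer, and the learning rate are illustrative assumptions not specified in the abstract.

```python
# Hypothetical sketch: fine-tuning a pretrained ResNet-50 on 32x32 dDTF
# connectivity images for 5-class emotion recognition (illustrative only).
import torch
import torch.nn as nn
from torchvision import models, transforms

NUM_CLASSES = 5  # five emotional states, as described in the abstract

# Load an ImageNet-pretrained ResNet-50 and replace the classifier head.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

# Assumed preprocessing: upsample the 32x32 dDTF matrix to the 224x224 input
# size expected by ResNet-50 and replicate it across 3 channels.
preprocess = transforms.Compose([
    transforms.ToTensor(),                       # (32, 32) -> (1, 32, 32)
    transforms.Resize((224, 224), antialias=True),
    transforms.Lambda(lambda x: x.repeat(3, 1, 1) if x.shape[0] == 1 else x),
])

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)  # assumed settings
criterion = nn.CrossEntropyLoss()

def fine_tune_step(batch_images: torch.Tensor, batch_labels: torch.Tensor) -> float:
    """One fine-tuning step on a batch of preprocessed dDTF images."""
    model.train()
    optimizer.zero_grad()
    logits = model(batch_images)           # (B, 5) class scores
    loss = criterion(logits, batch_labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch only the final fully connected layer is replaced, while all backbone weights remain trainable, which is one common reading of "fine-tuning the parameters"; freezing earlier layers would be an equally plausible variant.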
