Abstract

Electroencephalogram (EEG) signals have recently shown great potential for recognizing human emotions. The goal of affective computing is to help computers understand various types of emotions through human–computer interaction (HCI). Multichannel EEG signals measure the electrical activity of the brain in space and time, and automated emotion recognition from these signals is an active area of cognitive neuroscience and affective computing research. This study proposes multichannel EEG rhythmic features combined with ensemble machine learning (EML) classifiers, evaluated with leave-one-subject-out cross-validation (LOSOCV), for automatic emotion classification from multichannel EEG recordings. Multivariate fast iterative filtering (MvFIF) is used to decompose the EEG into rhythm sequences, and the delta (δ), theta (θ), alpha (α), beta (β), and gamma (γ) rhythms are separated based on the mean frequency of each sequence. Three Hjorth parameters and nine entropy features were extracted from the multichannel EEG rhythms, and the extracted features were selected using the minimum redundancy maximum relevance (mRMR) approach. Experiments were performed on two emotion datasets (GAMEEMO and DREAMER). Validation showed that gamma-rhythm multichannel features with an EML-based subspace K-nearest neighbor (SS-KNN) classifier achieved high classification accuracy, ranging from 93.5% to 99.8%. The δ, θ, α, β, and γ rhythms were compared using EML, support vector machine (SVM), and artificial neural network (ANN) classifiers. We also analyzed multi-class emotions (HVHA, HVLA, LVHA, LVLA) with an ensemble-based bagging tree on the gamma rhythm. This work provides a novel solution for multichannel rhythm-specific feature extraction in EEG data analysis.
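The three Hjorth parameters mentioned in the abstract (activity, mobility, and complexity) are standard time-domain EEG descriptors. The sketch below is an illustrative NumPy implementation of their textbook definitions, not the authors' code; the sinusoid test signal and sampling parameters are assumptions for demonstration only.

```python
import numpy as np

def hjorth_parameters(signal):
    """Compute the three Hjorth parameters of a 1-D signal.

    Activity   = var(x)                     (signal power)
    Mobility   = sqrt(var(dx) / var(x))     (mean frequency proxy)
    Complexity = mobility(dx) / mobility(x) (deviation from a pure sine)
    """
    x = np.asarray(signal, dtype=float)
    dx = np.diff(x)          # first derivative (finite difference)
    ddx = np.diff(dx)        # second derivative
    var_x, var_dx, var_ddx = np.var(x), np.var(dx), np.var(ddx)
    activity = var_x
    mobility = np.sqrt(var_dx / var_x)
    complexity = np.sqrt(var_ddx / var_dx) / mobility
    return activity, mobility, complexity

# Hypothetical example: 10 Hz sinusoid, 256 samples over 1 s.
# A pure sine has activity 0.5 (unit amplitude) and complexity near 1.
t = np.linspace(0, 1, 256, endpoint=False)
act, mob, comp = hjorth_parameters(np.sin(2 * np.pi * 10 * t))
```

In a multichannel setting such as the one described, these three values would be computed per channel and per rhythm (δ, θ, α, β, γ), yielding a feature vector that is then pruned by mRMR before classification.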
