Abstract

Emotion recognition is a rapidly growing field that enables more natural interactions between people and machines. Electroencephalography (EEG) has emerged as a convenient way to measure and track a user’s emotional state. The non-linear characteristics of the EEG signal produce high-dimensional feature vectors, resulting in high computational cost. In this paper, features from multiple neural networks are combined using Deep Feature Clustering (DFC) to select high-quality attributes, in contrast to traditional feature selection methods. The DFC method shortens network training time by omitting uninformative attributes. First, Empirical Mode Decomposition (EMD) is applied to decompose the raw EEG signal into a series of frequency components. The spatiotemporal content of the decomposed EEG signal is then expressed as a two-dimensional spectrogram using the Analytic Wavelet Transform (AWT) before feature extraction. Four pre-trained Deep Neural Networks (DNNs) are used to extract deep features. Dimensionality reduction and feature selection are achieved utilising differential entropy-based EEG channel selection and the DFC technique, which computes a set of visual vocabularies using k-means clustering. A histogram feature is then derived from the series of visual-vocabulary items. The classification performance on the SEED, DEAP and MAHNOB datasets, combined with the capabilities of DFC, shows that the proposed method improves emotion-recognition performance with short processing time and is more competitive than the latest emotion recognition methods.
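
As a minimal, hedged illustration of the front end described above, the sketch below decomposes a toy single-channel signal with EMD and converts one mode into a two-dimensional spectrogram. It assumes the PyEMD (pip package EMD-signal) and PyWavelets libraries, and uses a complex Morlet continuous wavelet transform as a stand-in for the paper's Analytic Wavelet Transform; the sampling rate and signal are illustrative.

```python
# Sketch of the signal front end: EMD followed by a time-frequency spectrogram.
# Assumes PyEMD (pip: EMD-signal) and PyWavelets; the complex Morlet CWT
# stands in for the Analytic Wavelet Transform used in the paper.
import numpy as np
import pywt
from PyEMD import EMD

fs = 128                                   # sampling rate (Hz), illustrative
t = np.arange(0, 4, 1 / fs)                # 4 s of a single toy EEG channel
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

# 1) Decompose the raw signal into intrinsic mode functions (IMFs).
imfs = EMD().emd(eeg)                      # shape: (n_imfs, n_samples)

# 2) Turn the first IMF into a 2-D spectrogram via a complex-wavelet CWT.
scales = np.arange(1, 64)
coeffs, freqs = pywt.cwt(imfs[0], scales, 'cmor1.5-1.0', sampling_period=1 / fs)
spectrogram = np.abs(coeffs) ** 2          # 2-D image fed to pre-trained DNNs
print(spectrogram.shape)                   # (n_scales, n_samples)
```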

Highlights

  • In recent years, much importance has been given to the recognition of human emotions using Electroencephalographic (EEG) signals

  • The classification performance on the SJTU Emotion EEG dataset (SEED), DEAP and MAHNOB datasets, combined with the capabilities of Deep Feature Clustering (DFC), shows that the proposed method improves emotion-recognition performance with short processing time and is more competitive than the latest emotion recognition methods

  • The Deep Neural Networks (DNNs) used in this paper can classify up to 1000 classes effectively, while a Support Vector Machine (SVM) classifies efficiently only for a small number of classes (see the sketch after this list)
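
As referenced in the last highlight, the following minimal sketch shows the SVM side of that comparison: classifying bag-of-features histograms with scikit-learn. The vocabulary size, class count, and synthetic data are assumptions for illustration, not the paper's settings.

```python
# Minimal SVM sketch on synthetic histogram-of-features vectors.
# Assumes scikit-learn; n_words and the 3-class setup are illustrative.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_words = 200                              # vocabulary size (assumed)
X = rng.random((300, n_words))             # histogram-of-features vectors
y = rng.integers(0, 3, size=300)           # 3 emotion classes (e.g. SEED-like)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
svm = SVC(kernel='rbf').fit(X_tr, y_tr)    # SVMs suit small class counts well
print('SVM accuracy:', svm.score(X_te, y_te))
```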

Summary

Introduction

Much importance has been given to the recognition of human emotions using Electroencephalographic (EEG) signals. To avoid the dimensionality issue and its computational overhead, one should select only the features that best represent the user’s current emotional state. To this end, much attention has been paid to recognizing emotions by converting one-dimensional EEG signals into two-dimensional spectral images. In emotion recognition and EEG-based classification, it is imperative to choose high-quality features because of the computational overhead. We propose Channel Selection (CS) and Deep Feature Clustering (DFC) algorithms that perform feature selection on the combined features of multiple Deep Neural Networks (DNNs), reducing the feature vector without degrading overall classification performance, in order to take full advantage of the Bag of Deep Features representation. In summary, we have presented techniques for reducing and selecting high-quality combined features of multiple neural networks using Differential Entropy-based Channel Selection and Deep Feature Clustering.
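
The sketch below illustrates the two selection steps just described: differential-entropy-based channel ranking, using the standard Gaussian form DE = ½ ln(2πeσ²), and a k-means visual vocabulary with a histogram descriptor in the spirit of DFC. Channel counts, vocabulary size, and all data are illustrative assumptions; only NumPy and scikit-learn are required.

```python
# Sketch of differential-entropy channel selection and DFC-style clustering.
# Shapes, counts, and data are toy assumptions, not the paper's settings.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
eeg = rng.standard_normal((32, 512))       # (channels, samples), toy data

# 1) Rank channels by differential entropy (Gaussian assumption:
#    DE = 0.5 * ln(2*pi*e*sigma^2)) and keep the top k.
de = 0.5 * np.log(2 * np.pi * np.e * eeg.var(axis=1))
top_channels = np.argsort(de)[::-1][:8]    # keep the 8 highest-DE channels

# 2) DFC-style step: build a visual vocabulary with k-means over deep
#    features, then describe a trial by a histogram of vocabulary words.
deep_features = rng.standard_normal((1000, 64))   # pooled DNN features (toy)
n_words = 50
kmeans = KMeans(n_clusters=n_words, n_init=10, random_state=0).fit(deep_features)

trial_features = rng.standard_normal((40, 64))    # one trial's feature vectors
words = kmeans.predict(trial_features)
histogram = np.bincount(words, minlength=n_words).astype(float)
histogram /= histogram.sum()               # normalized descriptor for the classifier
print(top_channels, histogram.shape)       # selected channels, (n_words,)
```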

Literature Review
Electrode-Channel Positioning
Dataset I
Dataset II
Dataset III
Methodology
Empirical Mode Decomposition
Analytic Wavelet Transform
Model 1
Model 2
Model 3
Model 4
Differential Entropy-Based Channel Selection
Deep Feature Clustering
Clustering
Histogram of Features
Classification
Results and Discussion
Support Vector Machine Classifier
Random Forest Classifier
Training a Network
Cross-Entropy Cost Function
Traditional Methods
Mutual Information and Pearson Correlation
Conclusions