Abstract
As one of the most important research topics in the brain-computer interface (BCI) field, electroencephalogram (EEG) classification has a wide range of applications. However, traditional neural networks struggle to capture the characteristics of the EEG signal comprehensively across the time and space dimensions, which limits the accuracy of EEG classification. To address this problem, we improve classification accuracy through end-to-end learning over the time and space dimensions of the EEG. In this paper, a new EEG classification network, the separable EEGNet (S-EEGNet), is proposed based on the Hilbert-Huang transform (HHT) and a separable convolutional neural network (CNN) with bilinear interpolation. The EEG signal is first transformed into a time-frequency representation by the HHT, which describes the signal better in the frequency domain. The depthwise and pointwise components of the network are then combined to extract feature maps. A displacement variable is added to the convolution layer of the separable CNN via bilinear interpolation, allowing free deformation of the sampling grid; the deformation depends on the local, dense, and adaptive characteristics of the input EEG data. The network thus learns end to end from the time and space dimensions of the EEG signal to extract features and improve classification accuracy. To demonstrate the effectiveness of S-EEGNet, we evaluated it on two public EEG datasets of different types (motor imagery classification and emotion classification). The accuracy of motor imagery classification is 77.9%, and the accuracies of emotion classification are 89.91% and 88.31%, respectively. The experimental results show that the classification accuracy of S-EEGNet improved by 3.6%, 1.15%, and 1.33%, respectively.
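To make the HHT preprocessing step concrete, the following is a minimal sketch of how a single EEG channel could be turned into a Hilbert-spectrum time-frequency image, assuming the PyEMD package and SciPy are available. The sampling rate, frequency range, and function names are illustrative assumptions, not the paper's exact settings.

```python
# Minimal HHT sketch: empirical mode decomposition followed by the Hilbert
# transform of each IMF, accumulated into a time-frequency image.
import numpy as np
from PyEMD import EMD                 # assumption: PyEMD (EMD-signal) package
from scipy.signal import hilbert

def hht_spectrum(x, fs, n_freq_bins=64, f_max=40.0):
    """Return a (n_freq_bins, len(x)) Hilbert-spectrum image for signal x."""
    imfs = EMD().emd(x)                              # empirical mode decomposition
    tf = np.zeros((n_freq_bins, len(x)))
    freq_edges = np.linspace(0.0, f_max, n_freq_bins + 1)
    time_idx = np.arange(len(x))
    for imf in imfs:
        analytic = hilbert(imf)                      # analytic signal of the IMF
        amp = np.abs(analytic)                       # instantaneous amplitude
        phase = np.unwrap(np.angle(analytic))
        inst_freq = np.diff(phase) * fs / (2.0 * np.pi)   # instantaneous frequency (Hz)
        inst_freq = np.concatenate([inst_freq, inst_freq[-1:]])
        bins = np.digitize(inst_freq, freq_edges) - 1
        valid = (bins >= 0) & (bins < n_freq_bins)
        tf[bins[valid], time_idx[valid]] += amp[valid]
    return tf

# Example: one simulated 4-second EEG channel sampled at 128 Hz (illustrative).
fs = 128
t = np.arange(0, 4, 1.0 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
tf_image = hht_spectrum(x, fs)                       # (64, 512) time-frequency map
```

In practice each EEG channel would be transformed this way, and the stacked time-frequency maps would form the input to the separable CNN.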
Highlights
With the development of human–computer interaction (HCI) [1] and the brain–computer interface (BCI) [2], the application potential of these technologies has begun to emerge
We compared the multiscale filter bank CNN (MSFBCNN) with the latest methods on the open dataset, and the results showed that this method's classification accuracy was higher than that of the baseline
We introduce the Hilbert–Huang transform (HHT) and depthwise separable convolution, add a displacement variable via bilinear interpolation, describe the architecture of the proposed S-EEGNet model, and provide detailed formulas for the model
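The sketch below illustrates, under stated assumptions, how a depthwise separable convolution block could be combined with a learned displacement applied through bilinear interpolation (here via PyTorch's F.grid_sample). Layer sizes and names are illustrative; this is not the paper's exact S-EEGNet block.

```python
# Sketch: depthwise-separable convolution whose input is resampled on a freely
# deformed grid produced by a learned displacement field (bilinear interpolation).
import torch
import torch.nn as nn
import torch.nn.functional as F

class DeformableSeparableConv(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size=3):
        super().__init__()
        # Predicts a 2-channel (dx, dy) displacement for every spatial location.
        self.offset = nn.Conv2d(in_ch, 2, kernel_size=3, padding=1)
        nn.init.zeros_(self.offset.weight)        # start from the identity grid
        nn.init.zeros_(self.offset.bias)
        # Depthwise convolution: one filter per input channel (groups=in_ch).
        self.depthwise = nn.Conv2d(in_ch, in_ch, kernel_size,
                                   padding=kernel_size // 2, groups=in_ch)
        # Pointwise 1x1 convolution mixes the channels.
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1)

    def forward(self, x):
        n, _, h, w = x.shape
        # Base sampling grid in [-1, 1], shape (N, H, W, 2) as grid_sample expects.
        ys, xs = torch.meshgrid(
            torch.linspace(-1, 1, h, device=x.device),
            torch.linspace(-1, 1, w, device=x.device), indexing="ij")
        base_grid = torch.stack((xs, ys), dim=-1).expand(n, h, w, 2)
        # Learned displacement in normalized grid coordinates.
        disp = self.offset(x).permute(0, 2, 3, 1)        # (N, H, W, 2)
        grid = base_grid + disp
        # Bilinear interpolation resamples x on the freely deformed grid.
        x = F.grid_sample(x, grid, mode="bilinear", align_corners=True)
        return self.pointwise(self.depthwise(x))

# Example: a batch of 8 HHT time-frequency maps (22 EEG channels, 64 x 128 bins).
x = torch.randn(8, 22, 64, 128)
y = DeformableSeparableConv(22, 32)(x)               # -> (8, 32, 64, 128)
```

Initializing the offset branch to zero means the block starts as an ordinary separable convolution and learns the grid deformation from the data.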
Summary
With the development of human–computer interaction (HCI) [1] and the brain–computer interface (BCI) [2], the application potential of these technologies has begun to emerge. EEG is a recording obtained by amplifying the spontaneous biopotentials of the brain from the scalp with precise electronic instruments. It represents the spontaneous, rhythmic electrical activity of groups of brain cells recorded by the electrodes. Because the EEG signal is dynamic time-series data, each observation in an EEG sequence is the combined result of various factors acting simultaneously. The real change of the EEG signal is the superposition or combination of several changes, which leads to correlation and mutual constraint between EEG sequences. To address the problem that traditional neural networks analyze a single EEG sequence unilaterally, which makes it difficult to capture the characteristics of the EEG signal comprehensively across the time and space dimensions, this paper proposes S-EEGNet, which learns end to end from the time and space dimensions of the EEG signal.