Abstract

The classification of steady-state visual evoked potentials (SSVEPs) from short time windows is important in practical applications because a shorter time window generally means a faster response. By combining the local feature learning ability of convolutional neural networks (CNNs) with the feature-importance weighting ability of the attention mechanism, a novel network called AttentCNN is proposed to further improve classification performance for short time-window SSVEPs. Considering that the frequency-domain features extracted from short time-window signals are weak, the network starts with a time-domain feature extraction module based on a filter bank (FB), which consists of four sixth-order Butterworth filters with different passbands. The extracted sub-band features are then aggregated. The second major module is a set of residual squeeze-and-excitation blocks (RSEs), which improve the quality of the extracted features by learning the interdependencies between them. The final major module is a time-domain CNN (tCNN) that consists of four CNNs for further feature extraction, followed by a fully connected (FC) layer for output. The proposed network is validated on two large public datasets, and comparisons with existing methods verify its effectiveness and superiority. Finally, to demonstrate the application potential of the proposed strategy in the field of medical rehabilitation, we design a novel five-finger bionic hand and connect it to the trained network, enabling direct control of the bionic hand by human brain signals. Our source code is available on GitHub: https://github.com/JiannanChen/AggtCNN.git.
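
To make the three-stage pipeline (FB, RSE, tCNN) concrete, the sketch below shows one possible realization in Python with SciPy and PyTorch. It is a minimal illustration under stated assumptions, not the authors' released implementation: the band edges, channel counts, kernel sizes, and reduction ratio are all hypothetical values introduced here for demonstration; the actual configuration is in the linked GitHub repository.

```python
# Hedged sketch of the AttentCNN pipeline described above.
# Band edges, channel counts, and kernel sizes are illustrative
# assumptions, not the paper's values.
import numpy as np
import torch
import torch.nn as nn
from scipy.signal import butter, filtfilt


def filter_bank(x, fs=250, bands=((6, 18), (14, 34), (22, 50), (30, 66))):
    """FB module: four sixth-order Butterworth band-pass filters.

    x: (channels, samples) raw EEG; returns (len(bands), channels, samples).
    The band edges here are assumptions.
    """
    out = []
    for lo, hi in bands:
        b, a = butter(6, [lo, hi], btype="bandpass", fs=fs)
        out.append(filtfilt(b, a, x, axis=-1))
    return np.ascontiguousarray(np.stack(out))


class SEBlock1d(nn.Module):
    """Squeeze-and-excitation over feature channels, with a residual
    connection (the core of the RSE block)."""

    def __init__(self, channels, reduction=4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool1d(1)  # squeeze: global average per channel
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),  # excitation: per-channel importance weights
        )

    def forward(self, x):  # x: (batch, channels, time)
        w = self.fc(self.pool(x).squeeze(-1)).unsqueeze(-1)
        return x + x * w  # residual reweighting


class TCNN(nn.Module):
    """tCNN head: stacked 1-D convolutions plus an FC classifier.
    Layer widths and kernel sizes are illustrative assumptions."""

    def __init__(self, in_channels, n_classes, n_samples):
        super().__init__()
        self.convs = nn.Sequential(
            nn.Conv1d(in_channels, 16, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv1d(16, 16, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=9, padding=4), nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=9, padding=4), nn.ReLU(),
        )
        self.fc = nn.Linear(32 * n_samples, n_classes)

    def forward(self, x):
        return self.fc(self.convs(x).flatten(1))


if __name__ == "__main__":
    # Hypothetical shapes: 9-channel EEG, 0.5 s window at 250 Hz, 40 targets.
    fs, n_ch, n_cls = 250, 9, 40
    x = np.random.randn(n_ch, fs // 2)
    sub = filter_bank(x, fs=fs)                                   # (4, 9, 125)
    feats = torch.tensor(sub, dtype=torch.float32).reshape(1, 4 * n_ch, -1)
    feats = SEBlock1d(4 * n_ch)(feats)                            # attention reweighting
    logits = TCNN(4 * n_ch, n_cls, feats.shape[-1])(feats)
    print(logits.shape)                                           # torch.Size([1, 40])
```

In this sketch the four sub-band outputs are aggregated by stacking them along the channel dimension before attention; other aggregation choices (e.g., a learned fusion layer) are equally plausible readings of the abstract.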
