Abstract

The electroencephalogram (EEG) plays an important role in studying brain function and human cognitive performance, and the recognition of EEG signals is vital for developing an automatic sleep staging system. However, because of the complex nonstationary characteristics of EEG and the individual differences between subjects, obtaining effective signal features from EEG for practical applications remains a challenging task. In this article, we investigate the EEG feature learning problem and propose a novel temporal feature learning method based on amplitude-time dual-view fusion for automatic sleep staging. First, we explore the feature extraction ability of convolutional neural networks (CNNs) for EEG signals from the perspective of interpretability and construct two new representation signals for the raw EEG from the views of amplitude and time. Then, we extract the amplitude-time signal features that reflect the transitions between different sleep stages from the obtained representation signals using conventional 1-D CNNs. Furthermore, a hybrid dilated convolution module is used to learn the long-term temporal dependency features of EEG signals, overcoming the limitation that small-scale convolution kernels capture only local signal variations. Finally, we apply attention-based feature fusion to the learned dual-view signal features to further improve sleep staging performance. To evaluate the proposed method, we test 30-s-epoch EEG signal samples from healthy subjects and subjects with mild sleep disorders. Experimental results on the most commonly used datasets show that the proposed method achieves better sleep staging performance and has potential for the development and application of an EEG-based automatic sleep staging system.
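The abstract's key architectural idea, dilated convolution for long-term temporal dependencies, can be illustrated with a minimal sketch. The code below is not the authors' implementation; it is a plain NumPy illustration (all function names and the kernel values are hypothetical) of how stacking 3-tap convolutions with dilations 1, 2, and 4 widens the receptive field to 15 input samples without enlarging the kernel:

```python
import numpy as np

def dilated_conv1d(x, kernel, dilation=1):
    """1-D convolution with a dilated kernel (valid padding).

    Placing (dilation - 1) gaps between kernel taps widens the
    receptive field without adding parameters.
    """
    k = len(kernel)
    span = (k - 1) * dilation + 1          # effective kernel width
    out_len = len(x) - span + 1
    out = np.empty(out_len)
    for i in range(out_len):
        out[i] = sum(kernel[j] * x[i + j * dilation] for j in range(k))
    return out

# Receptive field of a stack of 3-tap convs with dilations 1, 2, 4:
# 1 + sum((k - 1) * d for d in (1, 2, 4)) = 1 + 2 * 7 = 15 samples.
x = np.arange(100, dtype=float)            # toy stand-in for one EEG epoch
kernel = np.array([0.25, 0.5, 0.25])       # hypothetical smoothing taps
y = x
for d in (1, 2, 4):
    y = dilated_conv1d(y, kernel, dilation=d)
print(len(y))  # each stage trims (k - 1) * d samples -> 100 - 14 = 86
```

In an actual sleep staging network the kernels are learned and the dilated stack is applied to feature maps rather than the raw signal, but the receptive-field arithmetic is the same.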
