Abstract

Sleep staging is the basis of sleep evaluation and a key step in diagnosing sleep-related diseases. Despite their usefulness, existing sleep staging methods have several disadvantages: they rely on hand-crafted feature extraction, fail to recognize temporal patterns in long-range dependent data, and have reached an accuracy ceiling. Hence, this paper proposes an automatic electroencephalogram (EEG) sleep staging model based on a Multi-scale Attention Residual Network (MAResnet) and a Bidirectional Gated Recurrent Unit (BiGRU). The proposed model builds on the residual neural network in deep learning. Compared with the traditional residual learning module, it additionally applies improved channel and spatial attention units and uses convolution kernels of different sizes in parallel at the same position. Multi-scale feature extraction of the EEG sleep signals and residual learning are thus performed jointly, avoiding network degradation. Finally, a BiGRU captures the dependence between sleep stages, enabling automatic learning of staging features and sleep cycle extraction. In experiments, the classification accuracy and kappa coefficient of the proposed method on the Sleep-EDF dataset are 84.24% and 0.78, which are respectively 0.24% and 0.21 higher than those of the traditional residual network. The method was also verified on the UCD and SHHS datasets, achieving classification accuracies of 79.34% and 81.6%, respectively. Compared with related existing studies, recognition accuracy is significantly improved, validating the effectiveness and generalization performance of the proposed method.
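The core idea of the abstract, parallel convolution kernels of different sizes followed by a channel-attention reweighting, can be sketched in plain NumPy. The kernel sizes (3, 5, 7), the random weights, and the softmax-style attention below are illustrative assumptions, not the paper's actual configuration:

```python
import numpy as np

def conv1d_same(x, kernel):
    """1-D convolution with 'same' padding for an odd-length kernel."""
    pad = len(kernel) // 2
    xp = np.pad(x, pad)
    return np.array([np.dot(xp[i:i + len(kernel)], kernel)
                     for i in range(len(x))])

def multiscale_features(x, kernel_sizes=(3, 5, 7), seed=0):
    """Apply kernels of different sizes in parallel at the same position
    (hypothetical sizes) and stack the resulting feature channels."""
    rng = np.random.default_rng(seed)
    feats = []
    for k in kernel_sizes:
        w = rng.standard_normal(k) / np.sqrt(k)   # random demo weights
        feats.append(np.maximum(conv1d_same(x, w), 0.0))  # ReLU
    return np.stack(feats)                        # (n_scales, length)

def channel_attention(feats):
    """Squeeze-and-excite style channel attention (a simplification):
    global-average each channel, softmax the scores, reweight channels."""
    s = feats.mean(axis=1)
    w = np.exp(s - s.max())
    w /= w.sum()
    return feats * w[:, None]

# Toy EEG epoch: 100 samples of a noisy sine.
x = np.sin(np.linspace(0, 8 * np.pi, 100)) + 0.1 * np.random.default_rng(1).standard_normal(100)
feats = multiscale_features(x)       # three parallel scales
attended = channel_attention(feats)  # channels reweighted by attention
```

In the full model these attended multi-scale features would feed a residual connection and then a BiGRU over the sequence of epochs; the sketch only covers the feature-extraction step the abstract describes.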
