Abstract
Sleep staging is the basis for assessing sleep quality. When scoring sleep stages, many automatic staging models fail to capture the long-range correlations between the input sleep EEG signal and the output sleep stage, so the extracted features cannot reliably distinguish the different stages. We propose an end-to-end automatic sleep stage classification method that operates on raw single-channel sleep EEG, learns features from the critical parts of the signal, and addresses the long-sequence modeling problem. The method uses a convolutional neural network (CNN) to extract time–frequency features and adds a squeeze-and-excitation block (SE-Block) to strengthen the CNN's feature representation. A bidirectional gated recurrent unit (Bi-GRU) then learns the transition rules between sleep stages, and an attention mechanism in the Bi-GRU decoding part enhances its long-term memory and highlights the influence of essential features. Tailored to the particularities of sleep signals, the method combines these models and techniques to improve automatic staging performance. To validate the accuracy and stability of the model, 10-fold cross-validation was performed on the Fpz-Cz and Pz-Oz channel EEG signals of the Sleep-EDF dataset, yielding classification accuracies of 88.48% and 87.56%, respectively. The results show that, under the same model architecture and dataset, our model extracts essential features more effectively, represents them better, performs more stably, and has a relatively simple structure.
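The pipeline described above (CNN feature extractor with SE-Blocks, followed by a Bi-GRU with attention over time) can be sketched in PyTorch. This is a minimal illustrative sketch, not the authors' implementation: the layer sizes, kernel widths, and sampling rate (100 Hz, 30-second epochs) are assumptions chosen for a runnable example.

```python
import torch
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-excitation: reweight CNN channels by global context."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):              # x: (batch, channels, time)
        w = x.mean(dim=2)              # squeeze: global average pool
        w = self.fc(w).unsqueeze(2)    # excitation: per-channel weights
        return x * w

class SleepStager(nn.Module):
    """CNN + SE-Block feature extractor, then a Bi-GRU with additive
    attention over time, ending in a 5-class sleep-stage output.
    All hyperparameters here are illustrative assumptions."""
    def __init__(self, n_classes=5, hidden=64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=50, stride=6), nn.ReLU(),
            SEBlock(32),
            nn.MaxPool1d(8),
            nn.Conv1d(32, 64, kernel_size=8), nn.ReLU(),
            SEBlock(64),
            nn.MaxPool1d(4),
        )
        self.gru = nn.GRU(64, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)
        self.out = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):                       # x: (batch, 1, samples)
        h = self.cnn(x).transpose(1, 2)         # (batch, time, 64)
        h, _ = self.gru(h)                      # (batch, time, 2*hidden)
        a = torch.softmax(self.attn(h), dim=1)  # attention weights over time
        ctx = (a * h).sum(dim=1)                # weighted context vector
        return self.out(ctx)                    # (batch, n_classes) logits

# One 30-second EEG epoch at an assumed 100 Hz -> 3000 samples
model = SleepStager()
logits = model(torch.randn(2, 1, 3000))
print(logits.shape)  # torch.Size([2, 5])
```

The SE-Block supplies channel-wise recalibration of the CNN features, while the attention layer lets the classifier weight the most informative Bi-GRU time steps instead of relying only on the final hidden state.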