Abstract

Portable sleep monitoring devices with fewer attached sensors, combined with high-accuracy sleep staging methods, can expedite the diagnosis of sleep disorders. The aim of this study was to propose a single-channel EEG sleep staging model, SleepStageNet, which extracts sleep EEG features with multi-scale convolutional neural networks (CNNs) and then infers sleep stages by capturing the contextual information between adjacent epochs with recurrent neural networks (RNNs) and a conditional random field (CRF). To verify the feasibility of our model, two datasets were examined: one comprising two different single-channel EEGs (Fpz-Cz and Pz-Oz) from 20 healthy subjects, and one comprising a single-channel EEG (F4-M1) from 104 obstructive sleep apnea (OSA) patients of varying severity. The corresponding sleep stages were scored as four states (wake, REM, light sleep, and deep sleep). Accuracy was obtained from epoch-by-epoch comparison between the model and the PSG scorer, and their agreement was quantified with Cohen's kappa (κ). Our model achieved superior performance, with average accuracy of 0.88 (Fpz-Cz) and 0.85 (Pz-Oz) and κ of 0.82 (Fpz-Cz) and 0.77 (Pz-Oz) on the healthy subjects. Furthermore, we validated the model on the OSA patients, obtaining an average accuracy of 0.80 and κ of 0.67 (F4-M1). Our model significantly improved accuracy and κ compared with previous methods. The proposed SleepStageNet has proved feasible for assessing sleep architecture in OSA patients using single-channel EEG. We suggest that this technological advancement could augment the current use of home sleep apnea testing.
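The abstract reports agreement between the model and the PSG scorer via Cohen's kappa, which corrects the raw epoch-by-epoch accuracy for agreement expected by chance. A minimal sketch of that computation is below; the stage labels and the example sequences are illustrative, not taken from the study's data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two label sequences.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed epoch-by-epoch
    agreement and p_e is the agreement expected if both raters labeled epochs
    independently according to their own marginal stage frequencies.
    """
    assert len(rater_a) == len(rater_b) and len(rater_a) > 0
    n = len(rater_a)
    # Observed agreement: fraction of epochs where both assign the same stage.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from the marginal stage frequencies of each rater.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical 4-state hypnograms (W = wake, R = REM, L = light, D = deep),
# one epoch per element; raters disagree on the final epoch only.
scorer = ["W", "W", "L", "L", "D", "D", "R", "R"]
model  = ["W", "W", "L", "L", "D", "D", "R", "W"]
print(round(cohens_kappa(scorer, model), 4))  # 7/8 observed agreement -> 0.8333
```

Note that kappa is lower than the raw accuracy (0.875 here) whenever chance agreement is non-negligible, which is why the abstract reports both measures.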
