Abstract

Sleep screening is an important tool for both healthcare and neuroscientific research. Automatic sleep scoring is a promising alternative to the time-consuming gold-standard manual scoring procedure. Recently, promising results have been reported for automatic stage scoring by extracting spatio-temporal features from the electroencephalogram (EEG) via deep neural networks. However, such methods fail to consistently yield good performance because a key element is missing from the data representation: the medical criteria of the sleep scoring task on top of EEG features. We argue that capturing stage-specific features that satisfy the criteria of sleep medicine is non-trivial for automatic sleep scoring. This paper considers two such criteria, the transient stage markers and the overall profile of EEG features, and proposes a physiologically meaningful framework for sleep stage scoring via mixed deep neural networks. The framework consists of two sub-networks: feature extraction networks, constructed in consideration of the physiological characteristics of sleep, and an attention-based scoring decision network. Moreover, we quantize the framework for potential use in IoT settings. As a proof of concept, the performance of the proposed framework is demonstrated on multiple sleep datasets, the largest comprising 42,560 h of recordings from 5,793 subjects. In the experiments, the proposed method achieves competitive stage scoring performance, especially for Wake, N2, and N3, with F1 scores of 0.92, 0.86, and 0.88, respectively. Moreover, a feasibility analysis of framework quantization demonstrates its potential for future deployment in edge computing and clinical settings.
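The two-branch design described above (one feature stream for transient stage markers such as spindles and K-complexes, one for the overall EEG profile, fused by an attention-based scoring head) can be sketched as follows. This is a minimal NumPy illustration of the general pattern, not the authors' actual model: all shapes, weight names, and the single-vector attention scheme are assumptions made for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)


def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)


def score_epoch(eeg_epoch, W_marker, W_profile, w_attn, W_out):
    """Score one EEG epoch into 5 stage probabilities (W, N1, N2, N3, REM).

    Illustrative only: each "branch" is a single tanh layer standing in
    for a full feature-extraction sub-network.
    """
    # Branch 1: transient-marker features (e.g. spindles, K-complexes).
    f_marker = np.tanh(eeg_epoch @ W_marker)           # (d,)
    # Branch 2: overall-profile features (e.g. band-power distribution).
    f_profile = np.tanh(eeg_epoch @ W_profile)         # (d,)
    # Attention over the two feature streams: score each stream, then
    # fuse them with the resulting weights.
    feats = np.stack([f_marker, f_profile])            # (2, d)
    attn = softmax(feats @ w_attn)                     # (2,)
    fused = attn @ feats                               # (d,)
    # Scoring decision: project fused features to stage probabilities.
    return softmax(fused @ W_out)                      # (5,)


# Toy inputs: a 30 s epoch at 100 Hz, flattened to one vector.
n_samples, d = 3000, 16
epoch = rng.standard_normal(n_samples)
W_marker = rng.standard_normal((n_samples, d)) * 0.01
W_profile = rng.standard_normal((n_samples, d)) * 0.01
w_attn = rng.standard_normal(d)
W_out = rng.standard_normal((d, 5))

probs = score_epoch(epoch, W_marker, W_profile, w_attn, W_out)
stages = ["W", "N1", "N2", "N3", "REM"]
print(stages[probs.argmax()], probs.round(3))
```

The key design point mirrored here is that the two physiologically motivated feature streams are kept separate until the decision stage, where attention weights determine how much each stream contributes to the final stage score.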
