Abstract

Deep learning is widely used in the most recent automatic sleep scoring algorithms. Its popularity stems from its excellent performance and from its ability to process raw signals and to learn features directly from the data. Most of the existing scoring algorithms rely on computationally demanding architectures, due to their large number of trainable parameters, and process long input time sequences (up to 12 minutes). Only a few of these architectures provide an estimate of the model uncertainty. In this study we propose DeepSleepNet-Lite, a simplified and lightweight scoring architecture that processes only 90-second EEG input sequences. We exploit, for the first time in sleep scoring, the Monte Carlo dropout technique to enhance the performance of the architecture and to detect the uncertain instances. The evaluation is performed on the single-channel Fpz-Cz EEG from the open-source Sleep-EDF expanded database. DeepSleepNet-Lite achieves performance slightly lower than, if not on par with, the existing state-of-the-art architectures in overall accuracy, macro F1-score and Cohen's kappa (on Sleep-EDF v1-2013 ±30mins: 84.0%, 78.0%, 0.78; on Sleep-EDF v2-2018 ±30mins: 80.3%, 75.2%, 0.73). Monte Carlo dropout enables the estimation of uncertain predictions. By rejecting the uncertain instances, the model achieves higher performance on both versions of the database (on Sleep-EDF v1-2013 ±30mins: 86.1%, 79.6%, 0.81; on Sleep-EDF v2-2018 ±30mins: 82.3%, 76.7%, 0.76). Our lighter sleep scoring approach paves the way for the application of scoring algorithms to real-time sleep analysis.
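The Monte Carlo dropout procedure referred to in the abstract amounts to keeping dropout active at test time, averaging the class probabilities over several stochastic forward passes, and flagging low-confidence epochs for rejection. The snippet below is a minimal PyTorch sketch of that idea, not the authors' implementation; the model, the number of passes and the rejection threshold are illustrative assumptions.

```python
# Illustrative sketch of Monte Carlo dropout at inference time (not the
# paper's code): dropout stays active, several stochastic forward passes
# are averaged, and epochs whose averaged prediction is too uncertain are
# flagged for rejection. n_passes and reject_threshold are placeholders.
import torch
import torch.nn.functional as F


def mc_dropout_predict(model, eeg_batch, n_passes=30, reject_threshold=0.7):
    """Return predicted stages, averaged probabilities and a rejection mask."""
    model.train()  # keep dropout layers stochastic at test time
    with torch.no_grad():
        probs = torch.stack(
            [F.softmax(model(eeg_batch), dim=-1) for _ in range(n_passes)]
        ).mean(dim=0)                       # (batch, n_stages)
    confidence, stage = probs.max(dim=-1)   # most likely sleep stage per epoch
    reject = confidence < reject_threshold  # flag uncertain epochs
    return stage, probs, reject
```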

Highlights

  • We demonstrate the effectiveness of the label smoothing and Monte Carlo dropout techniques in both calibrating our model and enhancing its performance (a minimal sketch of label smoothing follows this list)

  • We report the overall performance and the calibration measure of three different models, with and without Monte Carlo dropout at inference time, to which we refer as w/o MC and w/ MC, respectively

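Label smoothing, mentioned in the highlights above, replaces the one-hot scoring target with a mixture of the one-hot vector and a uniform distribution over the five sleep stages. The snippet below shows one common formulation of the smoothed cross-entropy loss; the smoothing factor is an illustrative value, not necessarily the one used in the paper.

```python
# Minimal sketch of label smoothing for a 5-stage scoring problem
# (W, N1, N2, N3, REM); epsilon = 0.1 is an illustrative value.
import torch
import torch.nn.functional as F


def smoothed_cross_entropy(logits, targets, epsilon=0.1, n_stages=5):
    """Cross-entropy against (1 - eps) * one_hot + eps / K soft targets.

    `targets` are integer stage indices in [0, n_stages).
    """
    log_probs = F.log_softmax(logits, dim=-1)
    one_hot = F.one_hot(targets, n_stages).float()
    soft_targets = (1.0 - epsilon) * one_hot + epsilon / n_stages
    return -(soft_targets * log_probs).sum(dim=-1).mean()
```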

Summary

INTRODUCTION

Good sleep plays a crucial role in human well-being, and sleep disorders represent a significant and increasing public health problem [1]. We propose DeepSleepNet-Lite, a simplified and lightweight automatic sleep scoring architecture. It provides the predicted sleep stages along with an estimate of their uncertainty. The two main contributions of this paper are: 1) the optimization of a simple feed-forward sleep scoring architecture that processes only 90 seconds of single-channel EEG as input; 2) the application of the Monte Carlo dropout sampling technique, using dropout at test time to capture the model uncertainty and to enhance the performance of the scoring system. We show that DeepSleepNet-Lite achieves performance on par with the most up-to-date scoring systems.
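To make the input convention concrete, the sketch below shows a toy lightweight feed-forward scorer that maps a 90-second single-channel EEG window to stage logits. It is not the DeepSleepNet-Lite architecture described in the sections that follow; the 100 Hz sampling rate, the layer sizes and the single label per window are illustrative assumptions.

```python
# Hypothetical interface sketch: a small feed-forward CNN scoring a
# 90-second single-channel EEG window, assumed here to be sampled at
# 100 Hz (9000 samples). NOT the actual DeepSleepNet-Lite architecture.
import torch
import torch.nn as nn


class TinySleepScorer(nn.Module):
    def __init__(self, n_stages=5, dropout=0.5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=50, stride=6), nn.ReLU(),
            nn.MaxPool1d(8),
            nn.Dropout(dropout),
            nn.Conv1d(32, 64, kernel_size=8), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Sequential(nn.Dropout(dropout), nn.Linear(64, n_stages))

    def forward(self, x):                   # x: (batch, 1, 9000)
        z = self.features(x).squeeze(-1)    # (batch, 64)
        return self.classifier(z)           # unnormalised stage logits


logits = TinySleepScorer()(torch.randn(4, 1, 9000))  # -> shape (4, 5)
```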

DEEPSLEEPNET-LITE
The Architecture
Training Algorithm
Regularization Techniques and Training Parameters
MODEL CALIBRATION
Conditional Probability Distribution in Label Smoothing
ESTIMATING UNCERTAINTY
Experiment Design
Analysis of Experiments
Comparison With State-of-the-Art
Comparison Among Our Methods
DISCUSSION
CONCLUSION AND FUTURE WORKS