Abstract

Objective: This study aimed to improve the performance of sleep-stage classifiers by incorporating not only the electroencephalogram (EEG) patterns that appear in each sleep stage but also the flow of information between brain regions. Methods: A continuous wavelet transform was used to characterize brain activity, and a directed transfer function (DTF) was used to quantify the flow of information between brain regions. In addition, a multimodal architecture combining a residual unit, which mitigates the vanishing-gradient problem in deep learning, with a residual attention network, which localizes the salient information in the input, was used to improve the efficiency and performance of the sleep-stage classifier. Results: The classification accuracy was 87.34%, the F1-score was 87.42, and Cohen's kappa was 0.83. Further, the DTF values were analyzed using various approaches to identify brain connectivity across the sleep stages. Conclusion: The accuracy, F1-score, and Cohen's kappa values confirmed a significant performance improvement over previously proposed sleep-stage classifiers. The results indicated that information flow differs significantly across sleep stages and that the DTF improved the performance of the classifier by accurately reflecting these stage-dependent differences. Significance: In previous studies, deep- and machine-learning classifiers were constructed using only the changes in brain waves measured during sleep. This study instead classified sleep stages by additionally incorporating the flow of information between brain regions, and the proposed classifier overcame the limitations of the earlier approaches and achieved superior results.
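For orientation, the sketch below illustrates one common way to compute a directed transfer function from multichannel EEG: fit a multivariate autoregressive (MVAR) model and derive the row-normalized DTF from its frequency-domain transfer matrix. This is not the authors' implementation; the channel count, model order, sampling rate, least-squares fit, and frequency grid are illustrative assumptions.

```python
import numpy as np

def fit_mvar(x, p):
    """Fit an order-p MVAR model by least squares (illustrative estimator).

    x : array of shape (n_channels, n_samples)
    Returns A of shape (p, n_channels, n_channels) such that
    x[:, t] is approximated by sum_k A[k] @ x[:, t - k - 1].
    """
    n_ch, n = x.shape
    Y = x[:, p:]                                                   # targets
    Z = np.vstack([x[:, p - k - 1:n - k - 1] for k in range(p)])   # lagged regressors
    coefs, *_ = np.linalg.lstsq(Z.T, Y.T, rcond=None)              # (p*n_ch, n_ch)
    return coefs.T.reshape(n_ch, p, n_ch).transpose(1, 0, 2)

def dtf(A, freqs, fs):
    """Normalized DTF gamma[f, i, j]: relative inflow from channel j to channel i."""
    p, n_ch, _ = A.shape
    gamma = np.empty((len(freqs), n_ch, n_ch))
    for fi, f in enumerate(freqs):
        # A(f) = I - sum_k A_k * exp(-2*pi*i*f*k / fs);  H(f) = A(f)^-1
        Af = np.eye(n_ch, dtype=complex)
        for k in range(p):
            Af -= A[k] * np.exp(-2j * np.pi * f * (k + 1) / fs)
        H = np.linalg.inv(Af)
        # Row normalization: total inflow to each channel sums to 1 at each frequency.
        mag2 = np.abs(H) ** 2
        gamma[fi] = mag2 / mag2.sum(axis=1, keepdims=True)
    return gamma

# Toy usage: random data standing in for a 30-s, 6-channel EEG epoch at 100 Hz.
rng = np.random.default_rng(0)
eeg = rng.standard_normal((6, 3000))
A = fit_mvar(eeg, p=5)
gamma = dtf(A, freqs=np.arange(1, 30), fs=100)
print(gamma.shape)   # (29, 6, 6): frequency x target channel x source channel
```

In a pipeline like the one described here, per-band DTF values of this kind could serve as one input modality alongside time-frequency (wavelet) features of each epoch.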
