Abstract

Many automatic sleep staging algorithms have seen little practical use because their generalization beyond the datasets they were developed on remains unconvincing. To improve generalization, we select seven highly heterogeneous datasets for training, validation, and evaluation, covering 9970 records with over 20k hours from 7226 subjects spanning 950 days. In this paper, we propose an automatic sleep staging architecture, TinyUStaging, that uses single-lead EEG and EOG. TinyUStaging is a lightweight U-Net with multiple attention modules that adaptively recalibrate features, including a Channel and Spatial Joint Attention (CSJA) block and a Squeeze-and-Excitation (SE) block. Notably, to address class imbalance, we design sampling strategies with probability compensation and propose a class-aware Sparse Weighted Dice and Focal (SWDF) loss function that improves the recognition rate for minority classes (N1) and hard-to-classify samples (N3), especially for OSA patients. Additionally, two hold-out sets containing healthy and sleep-disordered subjects are used to verify generalization. Against this background of large-scale, imbalanced, heterogeneous data, we perform subject-wise 5-fold cross-validation on each dataset. The results demonstrate that our model outperforms many existing methods, especially on N1, achieving an average overall accuracy, macro F1-score (MF1), and kappa of 84.62%, 79.6%, and 0.764 on heterogeneous datasets under optimal partitioning, providing a solid foundation for out-of-hospital sleep monitoring. Moreover, the overall standard deviation of MF1 across folds remains within 0.175, indicating that the model is relatively stable.
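The abstract does not give the exact SWDF formulation. As an illustration only, a minimal NumPy sketch of the general idea, a class-weighted soft Dice term combined with a class-weighted Focal term, might look like the following (the function name, the `gamma` default, and the weighting scheme are assumptions, not the paper's definition):

```python
import numpy as np

def weighted_dice_focal(probs, labels, class_weights, gamma=2.0, eps=1e-7):
    """Illustrative class-weighted Dice + Focal loss for K-class sleep staging.

    probs:         (N, K) softmax probabilities per epoch
    labels:        (N,) integer stage labels in [0, K)
    class_weights: (K,) per-class weights (e.g. inverse frequency) that
                   up-weight minority stages such as N1
    """
    n, k = probs.shape
    onehot = np.eye(k)[labels]                      # (N, K) one-hot targets

    # Per-class soft Dice, weighted to emphasize minority classes.
    inter = (probs * onehot).sum(axis=0)            # (K,) soft intersections
    denom = probs.sum(axis=0) + onehot.sum(axis=0)  # (K,) soft cardinalities
    dice = (2.0 * inter + eps) / (denom + eps)
    dice_loss = np.sum(class_weights * (1.0 - dice)) / np.sum(class_weights)

    # Focal term: (1 - p_t)^gamma down-weights easy, confident samples.
    p_t = probs[np.arange(n), labels]               # probability of true class
    focal = -class_weights[labels] * (1.0 - p_t) ** gamma * np.log(p_t + eps)
    focal_loss = focal.mean()

    return dice_loss + focal_loss
```

Confident correct predictions yield a low loss, while misclassified or minority-class epochs are penalized more heavily, which is the behavior the abstract attributes to the SWDF loss.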
