Abstract
Sleep staging is a vital process for evaluating sleep quality and diagnosing sleep-related diseases. Most existing automatic sleep staging methods focus on time-domain information and often ignore the transition relationships between sleep stages. To address these problems, we propose a Temporal-Spectral fused and Attention-based deep neural Network model (TSA-Net) for automatic sleep staging using a single-channel electroencephalogram (EEG) signal. The TSA-Net is composed of a two-stream feature extractor, a feature context learning module, and a conditional random field (CRF). Specifically, the two-stream feature extractor automatically extracts and fuses EEG features from the time and frequency domains, since both temporal and spectral features provide rich discriminative information for sleep staging. Subsequently, the feature context learning module learns the dependencies between features using a multi-head self-attention mechanism and outputs a preliminary sleep stage. Finally, the CRF module applies transition rules to further improve classification performance. We evaluate our model on two public datasets, Sleep-EDF-20 and Sleep-EDF-78, achieving accuracies of 86.64% and 82.21%, respectively, on the Fpz-Cz channel. The experimental results demonstrate that TSA-Net improves sleep staging performance and outperforms state-of-the-art methods.
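To make the pipeline described in the abstract concrete, the following is a minimal sketch of the three-stage architecture (two-stream feature extraction, self-attention context learning, CRF-style transition decoding), assuming a PyTorch implementation. All module names, layer widths, kernel sizes, and the simple Viterbi decoder are illustrative assumptions, not the authors' released code.

```python
# Illustrative sketch of the TSA-Net pipeline from the abstract.
# Assumptions: 30 s EEG epochs at 100 Hz (3000 samples), 5 sleep stages.
import torch
import torch.nn as nn


class TwoStreamExtractor(nn.Module):
    """Extracts and fuses temporal and spectral features from one EEG epoch."""
    def __init__(self, feat_dim=128):
        super().__init__()
        # Time-domain branch: 1-D convolution over the raw signal.
        self.time_branch = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=50, stride=6), nn.ReLU(),
            nn.AdaptiveAvgPool1d(16), nn.Flatten(),
            nn.Linear(32 * 16, feat_dim),
        )
        # Frequency-domain branch: magnitude spectrum followed by a linear map.
        self.freq_branch = nn.Sequential(
            nn.AdaptiveAvgPool1d(256), nn.Flatten(),
            nn.Linear(256, feat_dim),
        )
        self.fuse = nn.Linear(2 * feat_dim, feat_dim)

    def forward(self, x):                       # x: (batch, 1, samples)
        t = self.time_branch(x)
        spec = torch.fft.rfft(x, dim=-1).abs()  # simple spectral representation
        f = self.freq_branch(spec)
        return self.fuse(torch.cat([t, f], dim=-1))


class TSANet(nn.Module):
    """Two-stream features -> multi-head self-attention -> emission scores."""
    def __init__(self, n_stages=5, feat_dim=128, n_heads=4):
        super().__init__()
        self.extractor = TwoStreamExtractor(feat_dim)
        self.attn = nn.MultiheadAttention(feat_dim, n_heads, batch_first=True)
        self.emit = nn.Linear(feat_dim, n_stages)
        # Learnable CRF transition scores between consecutive sleep stages.
        self.transitions = nn.Parameter(torch.zeros(n_stages, n_stages))

    def forward(self, epochs):                  # epochs: (batch, seq, samples)
        b, s, n = epochs.shape
        feats = self.extractor(epochs.reshape(b * s, 1, n)).reshape(b, s, -1)
        ctx, _ = self.attn(feats, feats, feats)  # context over the epoch sequence
        return self.emit(ctx)                    # (batch, seq, n_stages)

    @torch.no_grad()
    def viterbi_decode(self, emissions):        # emissions: (seq, n_stages)
        """Applies the learned transition rules to pick the best stage path."""
        score = emissions[0]
        backpointers = []
        for e in emissions[1:]:
            total = score.unsqueeze(1) + self.transitions  # (from, to)
            best, idx = total.max(dim=0)
            backpointers.append(idx)
            score = best + e
        path = [int(score.argmax())]
        for idx in reversed(backpointers):
            path.append(int(idx[path[-1]]))
        return list(reversed(path))


if __name__ == "__main__":
    model = TSANet()
    seq = torch.randn(2, 10, 3000)   # 2 subjects, 10 thirty-second epochs each
    emissions = model(seq)
    print(emissions.shape)           # torch.Size([2, 10, 5])
    print(model.viterbi_decode(emissions[0]))
```

In this sketch the CRF is reduced to a learnable transition matrix with Viterbi decoding at inference; a full implementation would also train the transition scores with a CRF negative log-likelihood over the whole stage sequence rather than a per-epoch loss.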