Abstract
Sleep apnea is one of the most common sleep disorders. The consequences of undiagnosed sleep apnea can be very serious, increasing the risk of high blood pressure, heart disease, stroke, and Alzheimer's disease over the long term. However, many people are unaware of their condition. The gold standard for diagnosing sleep apnea is overnight polysomnography in a specialized sleep laboratory. However, polysomnography is expensive, laboratory beds are limited, and monitoring covers only a short span of time. Existing automated detection methods use no more than three physiological signals, even though the remaining signals also carry information about the patient's sleep. In addition, the limited amount of annotated clinical data, especially abnormal samples, leads to weak model generalization. A gap therefore remains between model generalization and the needs of clinical practice. In this paper, we propose a method for integrating medical interpretation rules into a self-attention-based long short-term memory (LSTM) neural network that takes multichannel respiratory signals as input. We obtain attention weights through a token-level attention mechanism and then extract key medical interpretation rules to guide these weights, improving model generalization and reducing the dependence on data volume. Compared with the best prediction performance of existing methods, our method improves accuracy, precision, and F1-score by 3.26%, 7.03%, and 1.78% on average, respectively. We evaluated the model on the Sleep Heart Health Study dataset and found that it outperforms existing methods and can help physicians make decisions in their practice.
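To make the described architecture concrete, the following is a minimal sketch (not the authors' released code) of an LSTM with a token-level attention layer over multichannel respiratory signals; the channel count, window length, and class names are illustrative assumptions, and the exposed attention weights are the quantities that medical interpretation rules could be used to guide.

```python
# Hypothetical sketch of a self-attention LSTM for apnea detection.
# Assumptions (not from the paper): 3 respiratory channels, fixed-length
# windows, binary apnea / non-apnea labels.
import torch
import torch.nn as nn


class AttentiveLSTM(nn.Module):
    def __init__(self, n_channels=3, hidden=64, n_classes=2):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True, bidirectional=True)
        self.score = nn.Linear(2 * hidden, 1)        # one attention score per time step (token)
        self.classifier = nn.Linear(2 * hidden, n_classes)

    def forward(self, x):
        # x: (batch, time, n_channels) multichannel respiratory signals
        h, _ = self.lstm(x)                          # (batch, time, 2 * hidden)
        alpha = torch.softmax(self.score(h), dim=1)  # token-level attention weights
        context = (alpha * h).sum(dim=1)             # attention-weighted summary over time
        return self.classifier(context), alpha.squeeze(-1)


# Usage example: the returned `alpha` exposes per-time-step weights that a
# rule-based term could compare against medically expected event regions.
model = AttentiveLSTM()
signals = torch.randn(8, 300, 3)                     # 8 windows, 300 samples, 3 channels
logits, alpha = model(signals)
```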