Abstract

Sleep position monitoring is key to addressing posture-triggered sleep disorders. Many studies have explored sleep posture detection using a dedicated physical sensing channel placed at an optimal body location, such as the torso, or using non-contact approaches. However, little work has been done on detecting sleep position from a body location that, while suboptimal for that purpose, allows better extraction of more critical biomarkers from other sensing modalities, enabling multi-modal monitoring in certain clinical applications. This work presents two approaches of differing complexity for detecting four main sleep positions (supine, prone, lateral right, and lateral left) from accelerometry data obtained by a single wearable device placed on the neck: an ultra-lightweight threshold-based model and an Extra-Trees classifier. The threshold-based model achieved 95% average accuracy and a 0.89 F1-score on out-of-sample data, showing that a simple rule-based model can reach moderately high classification performance. The Extra-Trees classifier achieved 99% average accuracy and a 0.99 average F1-score using only 25 base estimators with a maximum depth of 20. Both models show promise for detecting sleep posture with high accuracy from a neck-worn accelerometer sensor.
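As a rough illustration of the two approaches described above, the sketch below shows (a) a rule-based posture estimate from the mean gravity direction of a neck-worn 3-axis accelerometer and (b) an Extra-Trees classifier configured with the hyper-parameters reported in the abstract. The axis conventions, roll-angle thresholds, and feature layout are illustrative assumptions, not the authors' published pipeline; only the 25 estimators and maximum depth of 20 come from the abstract.

```python
# Minimal sketch (not the authors' code). Assumed axis convention:
# x = left/right, y = head/foot, z = anterior/posterior.
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier

POSITIONS = ["supine", "prone", "lateral_right", "lateral_left"]

def threshold_classify(acc_xyz):
    """Rule-based posture estimate from a window of raw accelerometer samples.

    The 45-degree roll thresholds are hypothetical; the paper's actual
    thresholds are not given in the abstract.
    """
    x, _, z = np.mean(acc_xyz, axis=0)        # mean gravity direction over the window
    roll = np.degrees(np.arctan2(x, z))       # rotation about the body's long axis
    if -45 <= roll < 45:
        return "supine"
    if 45 <= roll < 135:
        return "lateral_left"
    if -135 <= roll < -45:
        return "lateral_right"
    return "prone"

def make_extratrees():
    """Extra-Trees classifier with the configuration stated in the abstract."""
    return ExtraTreesClassifier(n_estimators=25, max_depth=20, random_state=0)

# Usage (X: per-window accelerometer features, y: labelled postures):
# clf = make_extratrees().fit(X_train, y_train)
# y_pred = clf.predict(X_test)
```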
