Abstract

We present a sleep timing estimation approach that combines data-driven estimators with an expert model, using smartphone context data. The data-driven methodology comprises one classifier trained on features from smartphone sensors and another that uses time of day as input. Expert knowledge is incorporated via the two-process model of human circadian and homeostatic sleep regulation. We investigate the two-process model both as an output filter on classifier results and as a fusion method that combines the sensor and time classifiers. We analyse sleep timing estimation performance on data from a two-week free-living study of 13 participants and on sensor data simulations of arbitrary sleep schedules, amounting to 98,280 nights; five intuitive sleep parameters were derived to control the simulation. Moreover, we investigate model personalisation by retraining classifiers on participant feedback. The joint data and expert model yields an average relative estimation error of -2±62 min for sleep onset and -5±70 min for wake-up (absolute errors 40±48 min and 42±57 min; mean median absolute deviation 22 min and 15 min), significantly outperforming the purely data-driven methods. The combined data and expert model also remains robust under varying sleep schedules. Personalising the data models with user feedback from the last two days yielded the largest performance gain: 57% for sleep onset and 59% for wake-up. Our power-efficient smartphone app makes convenient everyday sleep monitoring practical.
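To illustrate the expert model referenced above, the following is a minimal sketch of the classic two-process model of sleep regulation (Borbély): homeostatic sleep pressure S rises exponentially during wake and decays during sleep, while a circadian process C modulates the thresholds at which sleep onset and wake-up occur. The time constants, threshold levels, and circadian phase below are illustrative values from the textbook formulation, not the parametrisation used in the paper, which the abstract does not specify.

```python
import math

# Illustrative parameters (assumed, not taken from the paper).
TAU_RISE = 18.2   # h, homeostatic build-up time constant during wake
TAU_DECAY = 4.2   # h, homeostatic decay time constant during sleep
PERIOD = 24.0     # h, circadian period
AMP = 0.12        # circadian modulation amplitude of the thresholds
H_UPPER = 0.6     # baseline sleep-onset threshold
H_LOWER = 0.17    # baseline wake-up threshold

def circadian(t):
    """Process C: a simple sinusoid (evening peak assumed at hour 22)."""
    return math.sin(2 * math.pi * (t - 16.0) / PERIOD)

def step(s, asleep, t, dt=0.1):
    """Advance homeostatic pressure S by dt hours; toggle sleep state
    when S crosses the circadian-modulated thresholds."""
    if asleep:
        s *= math.exp(-dt / TAU_DECAY)                 # pressure decays
        if s <= H_LOWER + AMP * circadian(t):
            asleep = False                             # wake-up
    else:
        s = 1.0 - (1.0 - s) * math.exp(-dt / TAU_RISE)  # pressure builds
        if s >= H_UPPER + AMP * circadian(t):
            asleep = True                              # sleep onset
    return s, asleep

# Simulate three days and record the sleep/wake transitions.
s, asleep, t = 0.3, False, 0.0
transitions = []
for _ in range(int(72 / 0.1)):
    prev = asleep
    s, asleep = step(s, asleep, t)
    if asleep != prev:
        transitions.append((round(t, 1), "sleep" if asleep else "wake"))
    t += 0.1
print(transitions)
```

Run over several simulated days, the state transitions settle into a stable sleep/wake cycle; used as an output filter, such a model can reject classifier outputs that are physiologically implausible (e.g. sleep onset while homeostatic pressure is still low).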
