Abstract

Background: Recent studies have demonstrated that passive smartphone and wearable sensor data collected throughout daily life can predict anxiety symptoms cross-sectionally. However, to date, no research has demonstrated the capacity of these digital biomarkers to predict long-term prognosis.

Methods: We utilized deep learning models based on wearable sensor technology to predict long-term (17–18-year) deterioration in generalized anxiety disorder and panic disorder symptoms from actigraphy data on daytime movement and nighttime sleeping patterns. As part of Midlife in the United States (MIDUS), a national longitudinal study of health and well-being, subjects (N = 265) (i) completed a phone-based interview that assessed generalized anxiety disorder and panic disorder symptoms at enrollment, (ii) participated in a one-week actigraphy study 9–14 years later, and (iii) completed a long-term follow-up, phone-based interview to quantify generalized anxiety disorder and panic disorder symptoms 17–18 years after initial enrollment. A deep auto-encoder paired with a multi-layered ensemble deep learning model was used to predict whether participants experienced increased anxiety disorder symptoms across this 17–18-year period.

Results: Out-of-sample cross-validated results suggested that wearable movement data could significantly predict which individuals would experience symptom deterioration (AUC = 0.696, CI [0.598, 0.793], 84.6% sensitivity, 52.7% specificity, balanced accuracy = 68.7%).

Conclusions: Passive wearable actigraphy data could be utilized to predict long-term deterioration of anxiety disorder symptoms. Future studies should examine whether these methods could be implemented to prevent deterioration of anxiety disorder symptoms.
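The abstract describes an auto-encoder feeding an ensemble classifier, evaluated out of sample. The sketch below is only a rough illustration of that kind of pipeline, not the authors' implementation: the synthetic placeholder data, the bottleneck size, the gradient-boosting ensemble, the 5-fold split, and all hyperparameters are assumptions. It also shows how the reported metrics relate, since balanced accuracy is the mean of sensitivity and specificity ((84.6% + 52.7%) / 2 ≈ 68.7%).

```python
# Illustrative sketch only (not the authors' code): compress actigraphy
# into a low-dimensional embedding with a small autoencoder, classify
# long-term symptom deterioration with an ensemble model, and report
# cross-validated AUC / sensitivity / specificity / balanced accuracy.
import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import StratifiedKFold
from sklearn.metrics import roc_auc_score, confusion_matrix

rng = np.random.default_rng(0)
# Placeholder data: 265 participants x 1,440 epoch-level activity counts
# (a stand-in for the week-long actigraphy recordings).
X = rng.random((265, 1440)).astype(np.float32)
y = rng.integers(0, 2, size=265)          # 1 = symptom deterioration

class AutoEncoder(nn.Module):
    """Fully connected autoencoder; the bottleneck is the learned feature."""
    def __init__(self, n_in: int, n_latent: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_in, 256), nn.ReLU(),
                                     nn.Linear(256, n_latent))
        self.decoder = nn.Sequential(nn.Linear(n_latent, 256), nn.ReLU(),
                                     nn.Linear(256, n_in))
    def forward(self, x):
        return self.decoder(self.encoder(x))

def encode(train_x: np.ndarray, test_x: np.ndarray, epochs: int = 50):
    """Fit the autoencoder on training folds only, then embed both splits."""
    model, loss_fn = AutoEncoder(train_x.shape[1]), nn.MSELoss()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    xt = torch.from_numpy(train_x)
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(xt), xt)
        loss.backward()
        opt.step()
    with torch.no_grad():
        return (model.encoder(xt).numpy(),
                model.encoder(torch.from_numpy(test_x)).numpy())

scores, labels = [], []
for train_idx, test_idx in StratifiedKFold(5, shuffle=True, random_state=0).split(X, y):
    z_train, z_test = encode(X[train_idx], X[test_idx])
    clf = GradientBoostingClassifier().fit(z_train, y[train_idx])
    scores.append(clf.predict_proba(z_test)[:, 1])
    labels.append(y[test_idx])

scores, labels = np.concatenate(scores), np.concatenate(labels)
preds = (scores >= 0.5).astype(int)
tn, fp, fn, tp = confusion_matrix(labels, preds).ravel()
sensitivity, specificity = tp / (tp + fn), tn / (tn + fp)
print(f"AUC={roc_auc_score(labels, scores):.3f}  "
      f"sens={sensitivity:.1%}  spec={specificity:.1%}  "
      f"balanced acc={(sensitivity + specificity) / 2:.1%}")
```

Fitting the autoencoder inside each cross-validation fold, as above, keeps the evaluation genuinely out of sample; fitting it on the full dataset first would leak information from the test folds into the learned features.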
