Abstract

The commercial availability of many real-life smart sensors, wearables, and mobile apps provides a valuable source of information about a wide range of human behavioral, physiological, and social markers that can be used to infer the user’s mental state and mood. However, there are currently no commercial digital products that integrate these psychosocial metrics with the real-time measurement of neural activity. In particular, electroencephalography (EEG) is a well-validated and highly sensitive neuroimaging method that yields robust markers of mood and affective processing, and has been widely used in mental health research for decades. The integration of wearable neuro-sensors into existing multimodal sensor arrays could hold great promise for deep digital neurophenotyping in the detection and personalized treatment of mood disorders. In this paper, we propose a multi-domain digital neurophenotyping model based on the socioecological model of health. The proposed model presents a holistic approach to digital mental health, leveraging recent neuroscientific advances, and could deliver highly personalized diagnoses and treatments. The technological and ethical challenges of this model are discussed.

Highlights

  • One of the most prominent characteristics of the current environment is high digital connectivity. This connectivity enables moment-by-moment quantification of individual-level human phenotypes in situ, using data collected both passively and actively from personal digital devices

  • Applying and expanding the theoretical framework of the socioecological model, we argue that a broader perspective on digital phenotyping must be adopted to increase the accuracy of mood trajectory detection and to improve the personalization of digital treatment

  • Recent advances in affective brain computing have increasingly demonstrated the power of deep learning methods to uncover complex patterns of neural activation that sensitively distinguish between different affective states [34]. This new generation of affective brain–computer interface (BCI) technology can be implemented in wearable EEG systems to monitor a user's emotions in real time while they watch a video, listen to music, or experience virtual reality, and could inform home-assisting technologies that provide feedback to the user [35,36]
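To make the affective-BCI idea above concrete, the sketch below shows one of the simplest EEG-derived affect markers: the ratio of alpha-band (8–12 Hz) to beta-band (13–30 Hz) spectral power, where relatively higher alpha power is often associated with a relaxed state. This is a minimal illustration on synthetic signals, not the deep learning pipeline cited in [34]; the sampling rate, band limits, and threshold are assumptions chosen for the example.

```python
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz

def band_power(signal, fs, lo, hi):
    """Mean spectral power of `signal` in the [lo, hi] Hz band, via Welch's PSD."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def affect_marker(eeg, fs=FS):
    """Toy affect marker: alpha (8-12 Hz) over beta (13-30 Hz) power.

    A ratio above 1 suggests an alpha-dominant (relaxed) recording;
    below 1 suggests a beta-dominant (alert) recording.
    """
    alpha = band_power(eeg, fs, 8, 12)
    beta = band_power(eeg, fs, 13, 30)
    return alpha / beta

# Two synthetic 10 s "recordings": a 10 Hz (alpha) and a 20 Hz (beta)
# oscillation, each with additive noise.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / FS)
relaxed = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(t.size)
alert = np.sin(2 * np.pi * 20 * t) + 0.3 * rng.standard_normal(t.size)

print(affect_marker(relaxed) > 1.0)  # alpha-dominant recording -> True
print(affect_marker(alert) < 1.0)    # beta-dominant recording  -> True
```

Real wearable EEG pipelines would add artifact rejection, multi-channel features, and a trained classifier in place of the fixed threshold, but the feature-extraction step follows this same band-power pattern.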


Introduction

One of the most prominent characteristics of the current environment is high digital connectivity. The rapid growth of embedded smart sensors located in wearable technologies and mobile devices allows for the unobtrusive collection of behavioral (e.g., speech patterns), physiological (e.g., heart rate variability), and social activity (e.g., social media use) markers [2,3]. This integrated sensor-based data can be used for the early diagnosis and continuous monitoring of mental health conditions in real time. By contrast, data from earlier, crucial stages of psychological development rely exclusively on self-reports from parents and/or the individual later in life. These significant limitations call for a reconsidered, improved digital phenotyping model.

Moving beyond the Individual to a Multi-Domain Neurophenotyping Model
Neural
Environmental
Life-Span
Findings
Challenges and Considerations