Abstract
Background: In the last decade, considerable attention has been given to developing artificial intelligence (AI) solutions for mental health using machine learning. To build trust in AI applications, it is crucial for AI systems to provide practitioners and patients with the reasons behind the AI decisions; this is referred to as explainable AI. While there has been significant progress in developing stress prediction models, little work has been done on explainable AI for mental health.

Methods: In this work, we address this gap by designing an explanatory AI report for stress prediction from wearable sensors. Because medical practitioners and patients are likely to be familiar with blood test reports, we modeled the look and feel of the explanatory AI report on those of a standard blood test report. The report includes the stress prediction and the physiological signals related to stressful episodes. In addition to the new design for explaining AI in mental health, the work includes the following contributions: methods to automatically generate the different components of the report, an approach for evaluating and validating the accuracy of the explanations, and a collection of ground-truth relationships between physiological measurements and stress prediction.

Results: Test results showed that the explanations were consistent with ground truth. The reference intervals for stress versus non-stress were quite distinctive, with little variation. In addition to the quantitative evaluations, a qualitative survey of three expert psychiatrists confirmed the usefulness of the explanation report for understanding the different aspects of the AI system.

Conclusion: In this work, we have provided a new design for explainable AI used in stress prediction based on physiological measurements. Based on the report, users and medical practitioners can determine which physiological features have the most impact on the prediction of stress, in addition to any health-related abnormalities.
The effectiveness of the explainable AI report was evaluated with both a quantitative and a qualitative assessment. The stress prediction accuracy was shown to be comparable to the state of the art, and the contribution of each physiological signal to the stress prediction was shown to correlate with ground truth. In addition to these quantitative evaluations, a qualitative survey with psychiatrists confirmed the confidence in and effectiveness of the explanation report for understanding the stress predictions made by the AI system. Future work includes the addition of explanatory features related to other emotional states of the patient, such as sadness, relaxation, anxiousness, or happiness.
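To illustrate the blood-test-style presentation described above, a per-signal reference interval can be computed as simple percentile bounds over episodes of each class. The sketch below is a minimal illustration, not the paper's actual method; the heart-rate samples and the 2.5th–97.5th percentile convention are assumptions.

```python
from statistics import quantiles

def reference_interval(values, ):
    """Percentile-based (2.5%-97.5%) reference interval, as used in blood test reports."""
    # quantiles with n=40 yields cut points at 2.5% steps:
    # index 0 -> 2.5th percentile, index -1 -> 97.5th percentile.
    cuts = quantiles(values, n=40, method="inclusive")
    return cuts[0], cuts[-1]

# Hypothetical heart-rate samples (bpm) from stressful vs. non-stressful episodes.
hr_stress = [88, 95, 91, 102, 97, 93, 99, 90, 96, 94]
hr_baseline = [62, 68, 65, 71, 66, 64, 69, 63, 67, 70]

print("stress   :", reference_interval(hr_stress))
print("baseline :", reference_interval(hr_baseline))
```

With distinct intervals like these, a reader can see at a glance whether a patient's measurements fall inside the stress or the non-stress range, mirroring how abnormal values are flagged on a blood test report.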
Highlights
In the last decade, considerable attention has been given to developing artificial intelligence (AI) solutions for mental health using machine learning
To address the lack of explainable AI systems for stress prediction, we propose a new design for an explainable AI system that predicts stress using data from wearable devices
To assess the accuracy of the IMPACT values, we examined the correlations between the IMPACT values and other stress indicators obtained from studies of which physiological measurements are affected by stress
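One way to quantify the agreement described above is a rank correlation between the model's per-signal importance values and literature-derived stress sensitivities. The sketch below is a hypothetical illustration (the signal names and all scores are invented, and a stdlib-only Spearman correlation stands in for the paper's actual procedure).

```python
def rankdata(xs):
    """Average ranks (1-based), with ties sharing their mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1  # extend over a run of tied values
        avg = (i + j) / 2 + 1  # mean of the 1-based ranks i+1..j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-signal importance scores from the model and
# ground-truth stress sensitivities compiled from the literature.
signals = ["EDA", "HR", "skin_temp", "accel"]
model_importance = [0.42, 0.31, 0.18, 0.09]
ground_truth = [0.90, 0.75, 0.40, 0.20]

print(f"Spearman rho = {spearman(model_importance, ground_truth):.2f}")
```

A rho near 1 would indicate that the signals the model leans on most are also the ones the literature reports as most affected by stress.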
Summary
A lot of attention has been given to developing artificial intelligence (AI) solutions for mental health using machine learning. To build trust in AI applications, it is crucial for AI systems to provide practitioners and patients with the reasons behind the AI decisions; this is referred to as explainable AI. We are interested in developing an AI-based stress prediction model that automatically produces a report explaining the results of the AI evaluation in a way that is understandable and useful to human users. Understanding the reasons behind AI models' predictions has become so crucial that the European Union introduced new data privacy rules in 2018, under which companies that use AI are obliged to provide either detailed explanations of individual AI algorithms or general information about how the algorithms make decisions when working with personal data [5]