Abstract

Amyotrophic Lateral Sclerosis (ALS), also known as Motor Neuron Disease (MND), is a rare and fatal neurodegenerative disease. As ALS is currently incurable, treatment aims mainly to alleviate symptoms and improve quality of life (QoL). We designed a prototype Clinical Decision Support System (CDSS) to alert clinicians when a person with ALS is experiencing low QoL, in order to inform and personalise the support they receive. Explainability is important for the success of a CDSS and its acceptance by healthcare professionals. The aim of this work is to announce our prototype (C-ALS), supported by a first short evaluation of its explainability. Given the lack of similar studies and systems, this work is a valid proof-of-concept that will lead to future work. We developed a CDSS that was evaluated by members of the team of healthcare professionals who provide care to people with ALS in the ALS/MND Multidisciplinary Clinic in Dublin, Ireland. We conducted a user study in which participants were asked to review the CDSS and complete a short survey focused on explainability. Healthcare professionals demonstrated some uncertainty in understanding the system’s output, and based on their feedback we altered the explanation provided in the updated version of our CDSS. C-ALS provides local explanations of its predictions in a post-hoc manner, using SHAP (SHapley Additive exPlanations). The CDSS presents the predicted risk of low QoL as a probability, together with a bar plot showing the feature importance for the specific prediction and brief verbal guidelines on how to interpret the results. Additionally, we provide the option of a global explanation of the system’s function in the form of a bar plot showing the average importance of each feature. C-ALS is available online for academic use.
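The local explanations described above rest on the Shapley value: each feature's attribution is its average marginal contribution to the prediction over all feature coalitions, and the attributions sum to the difference between the model's output for the patient and its output for a background (average) input. As a minimal sketch of this idea, the following brute-force computation uses a toy risk model with illustrative weights and feature values; it is not the authors' actual pipeline or the `shap` library, only the additive-attribution principle that SHAP implements efficiently.

```python
import math
from itertools import combinations

def shapley_values(predict, x, background):
    """Exact Shapley values for one prediction.

    Features absent from a coalition are replaced by their
    background (dataset-average) value, as in interventional SHAP.
    """
    n = len(x)

    def value(subset):
        z = [x[i] if i in subset else background[i] for i in range(n)]
        return predict(z)

    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for k in range(n):
            for S in combinations(others, k):
                # Shapley weight for a coalition of size k
                w = math.factorial(k) * math.factorial(n - k - 1) / math.factorial(n)
                phi[i] += w * (value(set(S) | {i}) - value(set(S)))
    return phi

# Toy "risk of low QoL" model: a linear score squashed to a probability.
# Weights and feature values are hypothetical, chosen for illustration only.
weights = [0.8, -0.5, 0.3]

def predict(z):
    return 1 / (1 + math.exp(-sum(w * v for w, v in zip(weights, z))))

x = [2.0, 1.0, 0.5]            # the individual prediction being explained
background = [0.0, 0.0, 0.0]   # illustrative background values
phi = shapley_values(predict, x, background)

# Additivity: the attributions sum to f(x) - f(background),
# which is what makes the per-feature bar plot readable.
assert abs(sum(phi) - (predict(x) - predict(background))) < 1e-9
```

In the actual system the per-feature attributions `phi` would be drawn as the local bar plot, and averaging their absolute values over many patients yields the global feature-importance bar plot mentioned in the abstract.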

Highlights

Affiliations: UCD School of Computer Science, University College Dublin, Dublin 4, Ireland; FutureNeuro SFI Research Centre, RCSI University of Medicine and Health Sciences, Dublin 2, Ireland; Academic Unit of Neurology, Trinity Biomedical Sciences Institute, Trinity College Dublin, Dublin 2, Ireland; Department of Neurology, National Neuroscience Centre, Beaumont Hospital, Dublin 9, Ireland

  • Almost all respondents said they would not use a Clinical Decision Support System (CDSS) that did not provide any explanations, while one person said they might use it if they knew that the system was correct 80% of the time

  • Trust in a CDSS comes from both accuracy and explanations. Some clinicians might accept a lack of explanations in exchange for high accuracy, or lower accuracy accompanied by explanations; however, the general conclusion from this and other studies is that both are crucial for clinician trust and acceptance, especially when clinicians have no prior experience with CDSS

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
