Abstract
In the healthcare domain, a transformative shift is envisioned towards Healthcare 5.0. It expands the operational boundaries of Healthcare 4.0 and leverages patient-centric digital wellness. Healthcare 5.0 focuses on real-time patient monitoring, ambient control and wellness, and privacy compliance through assistive technologies such as artificial intelligence (AI), the Internet of Things (IoT), big data, and supporting networking channels. However, healthcare operational procedures, the verifiability of prediction models, resilience, and the lack of ethical and regulatory frameworks are potential hindrances to the realization of Healthcare 5.0. Recently, explainable AI (EXAI) has emerged as a disruptive trend in AI that addresses the explainability of traditional AI models by exposing the decision-making process behind model predictions. This explainability opens new opportunities for black-box models and gives healthcare stakeholders the confidence to interpret machine learning (ML) and deep learning (DL) models. EXAI aims to improve clinical health practices and brings transparency to predictive analysis, which is crucial in the healthcare domain. Recent surveys on EXAI in healthcare have not focused significantly on data analysis and model interpretation, which limits practical deployment opportunities. To address this gap, the proposed survey explicitly details the requirements of EXAI in Healthcare 5.0 and the operational and data-collection processes. Guided by the review method and the research questions posed, the article systematically unfolds a proposed architecture that presents an EXAI ensemble for computed tomography (CT) image classification and segmentation. A solution taxonomy of EXAI in Healthcare 5.0 is proposed, and operational challenges are presented.
A supporting case study on electrocardiogram (ECG) monitoring is presented, which preserves the privacy of local models via federated learning (FL) and uses EXAI for metric validation. The case study is backed by experimental validation. The analysis demonstrates the efficacy of EXAI in health setups and envisions real-life model deployment across a wide range of clinical applications.
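To make the privacy argument of the FL-based ECG case study concrete, the sketch below illustrates the standard federated-averaging (FedAvg) idea the abstract alludes to: each client (e.g., a hospital) performs a local update on its private data and shares only model weights, which a server averages; raw patient records never leave the site. All function and variable names here are illustrative assumptions, not the paper's actual implementation.

```python
# Hypothetical minimal FedAvg sketch (illustrative, not the paper's code):
# clients train locally on private ECG data; the server averages weights.

def local_update(weights, gradient, lr=0.1):
    """One local gradient step on a client's private data."""
    return [w - lr * g for w, g in zip(weights, gradient)]

def fed_avg(client_weights):
    """Server-side FedAvg: element-wise mean of client model weights."""
    n = len(client_weights)
    return [sum(ws) / n for ws in zip(*client_weights)]

# Toy round: three hospitals sharing a 3-parameter model.
global_w = [0.0, 0.0, 0.0]
client_grads = [[1.0, 2.0, 3.0], [3.0, 2.0, 1.0], [2.0, 2.0, 2.0]]
local_models = [local_update(global_w, g) for g in client_grads]
global_w = fed_avg(local_models)
print([round(w, 6) for w in global_w])  # -> [-0.2, -0.2, -0.2]
```

In a full pipeline, a post-hoc explainability step (e.g., a SHAP- or Grad-CAM-style method) would then be applied to the aggregated model to validate which ECG features drive its predictions, which is the role the abstract assigns to EXAI.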