Abstract

Explainable artificial intelligence (XAI) has become instrumental in supporting informed decision-making. The emergence of diverse supply chain (SC) platforms has altered the nature of SC interactions and introduced a notable degree of uncertainty. This study conducts a thorough analysis of the existing literature on decision support systems (DSSs) and their incorporation of XAI functionality in the SC domain, and the analysis reveals the influence of XAI on SC decision-making. The study applies the SHapley Additive exPlanations (SHAP) technique to analyze online data within a Python machine learning (ML) workflow. Explanatory algorithms of this kind are designed to make ML models more transparent by providing rationales for the predictions they produce. The study further seeks to establish measurable criteria for identifying the components of XAI and DSSs that enhance decision-making in the SC context. Prior research is assessed with respect to predictive ability, use of online datasets, number of variables examined, development of learning capability, and validation in a decision-making context, and the review highlights research areas that require further exploration of intelligent decision-making under uncertainty.
