Background and objective
In everyday clinical practice, medical decisions are currently based on clinical guidelines, which are often static and rigid and do not account for population variability, whereas individualized, patient-oriented decisions and treatments are the paradigm change needed to enter the era of precision medicine. Most of the limitations of a guideline-based system could be overcome through the adoption of Clinical Decision Support Systems (CDSSs) based on Artificial Intelligence (AI) algorithms. However, the black-box nature of AI algorithms has hampered the broad adoption of AI-based CDSSs in clinical practice. In this study, an innovative AI-based method to compress AI-based prediction models into explainable, model-agnostic, and reduced decision support systems (NEAR), with application to healthcare, is presented and validated.

Methods
NEAR is based on the Shapley Additive Explanations framework and can be applied to complex input models to obtain the contribution of each input feature to the output. Technically, the simplified NEAR models approximate the contributions of the input features using a custom library and merge them to determine the final output. Finally, NEAR estimates the confidence error associated with each input feature's contribution to the final score, making the result more interpretable. Here, NEAR is evaluated on a real-world clinical use case, mortality prediction in patients who experienced Acute Coronary Syndrome (ACS), applying three different Machine Learning/Deep Learning models as implementation examples.

Results
When applied to the ACS use case, NEAR exhibits performance comparable to that of the AI-based model from which it is derived, as in the case of the Adaptive Boosting classifier, whose Area Under the Curve is not statistically different from that of NEAR, despite the model's simplification. Moreover, NEAR comes with intrinsic explainability and modularity, and it can be tested on the developed web application platform (https://neardashboard.pythonanywhere.com/).

Conclusions
An explainable and reliable CDSS tailored to single-patient analysis has been developed. The proposed AI-based system has the potential to be used alongside the clinical guidelines currently employed in the medical setting, making them more personalized and dynamic and assisting doctors in making their everyday clinical decisions.
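
The Methods section above rests on the additive Shapley decomposition, f(x) ≈ φ₀ + Σᵢ φᵢ, in which per-feature contributions sum to the model output. The snippet below is a minimal, hypothetical sketch of that idea, not the authors' NEAR code or custom library: it estimates model-agnostic Shapley contributions for an Adaptive Boosting classifier with the open-source shap package, using synthetic data in place of the ACS cohort.

```python
import shap
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

# Synthetic stand-in data (the study uses ACS patient records, not included here).
X, y = make_classification(n_samples=500, n_features=6, random_state=0)
model = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X, y)

# Model-agnostic Shapley estimation: explain the positive-class probability
# against a background sample drawn from the training data.
background = shap.sample(X, 100)
explainer = shap.KernelExplainer(lambda d: model.predict_proba(d)[:, 1], background)

phi = explainer.shap_values(X[:1])               # per-feature contributions phi_i
additive = explainer.expected_value + phi.sum()  # phi_0 + sum_i phi_i
print(f"additive reconstruction: {additive:.3f}")
print(f"model output:            {model.predict_proba(X[:1])[0, 1]:.3f}")
```

In this sketch the reconstructed score matches the classifier's output by construction of the Shapley values; NEAR's simplified models additionally approximate each contribution and attach a confidence error to it, as described in the Methods.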