Abstract

Although several studies have utilized artificial intelligence (AI)-based solutions to enhance decision making for mechanical ventilation and for mortality in COVID-19, the extraction of explainable predictors of heparin's effect on intensive care and mortality has remained unresolved. In the present study, we developed an explainable AI (XAI) workflow to shed light on predictors of admission to the intensive care unit (ICU) and of mortality among hospitalized COVID-19 patients who received heparin. AI-empowered classifiers, such as the hybrid extreme gradient boosting (HXGBoost) model with customized loss functions, were trained on curated time-series clinical data to develop robust AI models. Shapley additive explanations (SHAP) analysis was conducted to determine the positive or negative impact of each predictor on the model's output. The HXGBoost predicted the risk of intensive care and mortality with accuracies of 0.84 and 0.85, respectively. SHAP analysis indicated that a low percentage of lymphocytes at day 7, together with increased FiO2 at days 1 and 5 and low SatO2 at days 3 and 7, increases the probability of mortality, and it highlighted the positive effect of heparin administration in the early days of hospitalization for reducing mortality.
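To illustrate the kind of pipeline the abstract describes, the following is a minimal sketch in Python that trains a standard gradient-boosted tree classifier and applies SHAP to rank predictors. The dataset file, feature names (e.g., lymphocytes_d7, FiO2_d1), and hyperparameters are hypothetical placeholders; the paper's hybrid customized loss functions and data curation steps are not reproduced here.

```python
# Minimal sketch, not the authors' implementation: train a gradient-boosted
# classifier on day-indexed clinical features and explain it with SHAP.
import pandas as pd
import xgboost as xgb
import shap
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical curated cohort file and illustrative feature/label names.
df = pd.read_csv("covid_heparin_cohort.csv")
X = df[["lymphocytes_d7", "FiO2_d1", "FiO2_d5",
        "SatO2_d3", "SatO2_d7", "heparin_d1"]]
y = df["mortality"]  # 0 = survived, 1 = deceased

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42)

# Plain XGBoost classifier; the paper's customized loss is not shown here.
model = xgb.XGBClassifier(n_estimators=300, max_depth=4,
                          learning_rate=0.05, eval_metric="logloss")
model.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, model.predict(X_test)))

# SHAP values quantify each feature's positive/negative contribution
# to the predicted mortality risk for every patient.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)
shap.summary_plot(shap_values, X_test)
```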
