Abstract

Patients with β-thalassemia major (β-TM) face a wide range of complications as a result of excess iron in vital organs, including the heart and liver. Our aim was to find the best predictive machine learning (ML) model for assessing heart and liver iron overload in patients with β-TM. Data from 624 β-TM patients were entered into three ML models: random forest (RF), gradient boosting model (GBM), and logistic regression (LR). The data were classified and analyzed in R software. Four metrics of predictive performance were measured: sensitivity, specificity, accuracy, and area under the receiver operating characteristic curve (AUC). For heart iron overload, LR had the highest predictive performance based on AUC: 0.68 (95% confidence interval [CI]: 0.60, 0.75). The GBM had the highest specificity (69.0%) and accuracy (67.0%), while the highest sensitivity was also obtained with LR (75.0%). For liver iron overload, the highest performance based on AUC was observed with RF: 0.68 (95% CI: 0.59, 0.76). The RF also showed the highest accuracy (66.0%) and specificity (66.0%), while LR had the highest sensitivity (84.0%). Ferritin, duration of transfusion, and age were identified as the most effective predictors of iron overload in both the heart and the liver. LR was determined to be the strongest method for predicting cardiac iron overload and RF for predicting liver iron overload in patients with β-TM. Older thalassemia patients with a high serum ferritin (SF) level and a longer duration of transfusion therapy were more prone to heart and liver iron overload.
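The abstract does not include the authors' code, but the modelling workflow it describes can be illustrated with a minimal R sketch. The data frame `btm`, the binary outcome `heart_overload`, and the predictor names `ferritin`, `transfusion_years`, and `age` are hypothetical placeholders, and the `randomForest`, `gbm`, and `pROC` packages are assumptions standing in for whatever implementations the study actually used.

```r
## Minimal sketch of the described workflow (not the authors' code).
## Assumes a data frame `btm` with a 0/1 outcome `heart_overload` and
## predictors ferritin, transfusion_years, and age (hypothetical names).
library(randomForest)
library(gbm)
library(pROC)

set.seed(1)
idx   <- sample(nrow(btm), floor(0.7 * nrow(btm)))  # simple train/test split
train <- btm[idx, ]
test  <- btm[-idx, ]

# Logistic regression (LR)
fit_lr <- glm(heart_overload ~ ferritin + transfusion_years + age,
              data = train, family = binomial)
p_lr   <- predict(fit_lr, test, type = "response")

# Random forest (RF)
fit_rf <- randomForest(factor(heart_overload) ~ ferritin + transfusion_years + age,
                       data = train)
p_rf   <- predict(fit_rf, test, type = "prob")[, 2]

# Gradient boosting model (GBM)
fit_gbm <- gbm(heart_overload ~ ferritin + transfusion_years + age,
               data = train, distribution = "bernoulli", n.trees = 500)
p_gbm   <- predict(fit_gbm, test, n.trees = 500, type = "response")

# AUC with 95% CI for each model; sensitivity, specificity, and accuracy
# would follow from thresholding the predicted probabilities.
preds <- list(LR = p_lr, RF = p_rf, GBM = p_gbm)
lapply(preds, function(p) ci.auc(roc(test$heart_overload, p)))
```

The same comparison would be repeated with the liver iron overload label as the outcome.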
