Although iron deficiency anemia is common, interpreting iron laboratory test results can be challenging in patients with comorbidities. We aimed to study the accuracy of common iron biomarkers against bone marrow iron staining in a large retrospective dataset of hematological patients. From Helsinki University Hospital electronic health records, we collected bone marrow iron staining results from 6610 patients (median age 66 years), together with concurrent ferritin, transferrin saturation, soluble transferrin receptor, transferrin, hemoglobin, and mean red blood cell volume values. In receiver operating characteristic (ROC) analysis, ferritin had the highest area under the curve (AUC) for predicting reduced bone marrow iron: 88% (95% CI 86–90%) in females and 89% (87–91%) in males. A ferritin cut-off of 30 µg/L yielded high specificity (97% in females, 99% in males) but low sensitivity (54% and 35%, respectively). The other biomarkers studied had inferior AUCs, and multivariate logistic regression models did not predict reduced iron stores significantly better than ferritin alone. At a 50% pre-test probability of reduced iron stores, ferritin cut-offs of 30 µg/L (females) and 51 µg/L (males) gave a 95% positive predictive value for reduced iron stores; a 95% negative predictive value was reached only at 1750 µg/L (females) and 4967 µg/L (males). In our large population study, ferritin was the best single biomarker for iron deficiency in secondary care, and adding other blood tests in a multivariate model did not improve performance. However, in these hematological patients, even a high ferritin did not rule out iron deficiency with 95% certainty.
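The predictive values at a 50% pre-test probability follow directly from Bayes' theorem. As a minimal sketch (not the authors' code), the snippet below reproduces the reported 95% positive predictive value from the female sensitivity and specificity at the 30 µg/L cut-off; the function name is illustrative, and the printed NPV at that cut-off is derived here rather than reported in the study.

```python
def ppv_npv(sensitivity: float, specificity: float, pretest: float = 0.5):
    """Positive/negative predictive value from sensitivity, specificity,
    and pre-test probability, via Bayes' theorem."""
    tp = sensitivity * pretest                # true positives
    fp = (1 - specificity) * (1 - pretest)    # false positives
    fn = (1 - sensitivity) * pretest          # false negatives
    tn = specificity * (1 - pretest)          # true negatives
    return tp / (tp + fp), tn / (tn + fn)

# Reported female values at the 30 µg/L ferritin cut-off:
ppv, npv = ppv_npv(sensitivity=0.54, specificity=0.97, pretest=0.50)
print(f"PPV = {ppv:.0%}, NPV = {npv:.0%}")    # PPV = 95%, NPV = 68%
```

The low NPV at this cut-off illustrates the abstract's conclusion: a ferritin below the cut-off reliably indicates reduced iron stores, but a value above it leaves iron deficiency far from excluded.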