Background: Iron deficiency (ID) is estimated to affect 9% to 16% of adolescent girls in the United States. It occurs during a critical period of increased iron demands related to blood volume expansion, increasing muscle mass, and menstrual blood loss. Black girls are disproportionately affected, likely due to social determinants of health. ID is associated with decreased cognitive function, reduced exercise capacity, and fatigue, and unrecognized ID can progress to complications of severe anemia. Despite readily available therapies, including both oral and intravenous iron, no universal screening recommendations or evidence-based prediction tools exist to identify adolescent girls at risk for ID. Our objective was to determine the prevalence of, timing of, and clinical risk factors for ID in a large, biracial cohort of adolescent girls.

Methods: Existing clinical data and stored serum specimens were obtained from the National Heart, Lung, and Blood Institute (NHLBI) Growth and Health Study (NGHS). The NGHS was a 10-year longitudinal study initiated in 1985 that enrolled 2,379 girls (51% Black, 49% White) from three cities (San Francisco, Cincinnati, and Washington, DC). Subjects participated in annual visits, with serum samples obtained at Years 1, 3, 5, 7, and 10 and a retention rate of 89% at the 10th annual visit. Clinical data included socioeconomic data, body mass index (BMI), age at menarche, menstrual and pregnancy history, use of oral contraceptive medications, and dietary iron intake. We hypothesized that >10% of NGHS subjects would meet criteria for ID. We further hypothesized that longer duration from onset of menarche, low dietary iron intake, history of pregnancy, and obesity (defined as BMI >30) would be associated with ID. Available serum samples at Year 10 (n=692) were analyzed with Q-Plex Human Micronutrient Array ELISA kits (Quansys Biosciences) to measure iron-related parameters, including serum ferritin, soluble transferrin receptor (sTfR1), and the sTfR1:ferritin index, as well as C-reactive protein (CRP). Subjects with ferritin >150 ng/mL and CRP >5 mg/L were excluded. ID was defined as serum ferritin <15 ng/mL, sTfR1 >5 mg/L, and sTfR1:ferritin index >2 (a minimal analysis sketch follows the abstract). Univariate logistic regression models assessed the association of ID with established clinical risk factors.

Results: Clinical and laboratory data from 510 subjects were included in the final analysis (median age 19 years, 51.4% Black). Of those, 83 (16.3%) were iron deficient. ID was not associated with racial group, poverty threshold, BMI, age at menarche, history of pregnancy, oral contraceptive pill use, or average daily iron intake (TABLE). Subjects with later onset of menarche tended to have lower ferritin (OR 1.27 [1.05-1.54], p<0.015, FIGURE).

Conclusion: The prevalence of ID in subjects 18 to 19 years of age enrolled in the NGHS was consistent with the higher end of previously published prevalence estimates in adolescent girls. Established risk factors for the development of ID, such as earlier onset of menarche, were not clearly associated with ID in this cohort. These findings suggest that universal screening, rather than targeted screening, may be important for the early identification of all girls with ID. Additional clinical data and serum samples from Years 5 and 7 of the NGHS will be obtained to determine the prevalence of ID in the same cohort of subjects at earlier ages and to characterize changes in iron status over time.
The results from this study will inform the development of a risk prediction tool as well as evidence-based screening recommendations for ID in adolescent girls.
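The exclusion criteria, the ID definition, and the univariate logistic regression described in Methods can be illustrated with a brief sketch. The snippet below is illustrative only: the data are simulated, every column name is hypothetical, and the sTfR1:ferritin index is computed as sTfR1 divided by log10(ferritin), a common convention that the abstract does not explicitly specify. It is a minimal sketch of the analytic approach under these assumptions, not the authors' actual pipeline or the Q-Plex assay output.

```python
# Illustrative sketch only: simulated data and hypothetical column names, not the NGHS dataset.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Simulated stand-in for the Year 10 serum panel and one clinical covariate.
df = pd.DataFrame({
    "ferritin_ng_ml": rng.lognormal(mean=3.4, sigma=0.8, size=n),  # serum ferritin
    "stfr1_mg_l": rng.lognormal(mean=1.2, sigma=0.4, size=n),      # soluble transferrin receptor
    "crp_mg_l": rng.lognormal(mean=0.5, sigma=0.9, size=n),        # C-reactive protein
    "age_menarche_yr": rng.normal(12.5, 1.3, size=n).round(1),
})

# Exclusion criteria from the abstract: ferritin >150 ng/mL and CRP >5 mg/L.
analyzed = df[~((df["ferritin_ng_ml"] > 150) & (df["crp_mg_l"] > 5))].copy()

# ID definition from the abstract: ferritin <15 ng/mL, sTfR1 >5 mg/L, index >2.
# Assumption: index computed as sTfR1 / log10(ferritin); the abstract does not give the formula.
analyzed["stfr1_ferritin_index"] = analyzed["stfr1_mg_l"] / np.log10(analyzed["ferritin_ng_ml"])
analyzed["iron_deficient"] = (
    (analyzed["ferritin_ng_ml"] < 15)
    & (analyzed["stfr1_mg_l"] > 5)
    & (analyzed["stfr1_ferritin_index"] > 2)
).astype(int)

# Univariate logistic regression of ID on a single clinical risk factor (age at menarche here).
X = sm.add_constant(analyzed[["age_menarche_yr"]])
fit = sm.Logit(analyzed["iron_deficient"], X).fit(disp=0)

# Odds ratio with 95% CI and p-value, mirroring how the table/figure values are reported.
or_est = np.exp(fit.params["age_menarche_yr"])
or_ci = np.exp(fit.conf_int().loc["age_menarche_yr"])
print(f"OR per year of menarche age: {or_est:.2f} "
      f"[{or_ci.iloc[0]:.2f}-{or_ci.iloc[1]:.2f}], p={fit.pvalues['age_menarche_yr']:.3f}")
```

Because the simulated outcome is independent of the covariate, the printed odds ratio will hover near 1; in the real analysis, one such model would be fit per candidate risk factor (race, poverty threshold, BMI, age at menarche, pregnancy history, oral contraceptive use, dietary iron intake).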