Introduction
Iron products are widely available over the counter and have the potential to cause serious toxicity. Iron concentrations can be used to prognosticate and guide treatment during acute ingestions. Traditionally, a concentration of 350 μg/dL with symptoms, or 500 μg/dL without symptoms, is considered toxic and likely to require treatment to prevent decompensation. It is generally recommended that an iron concentration be obtained at least 4 h after exposure to allow adequate absorption time and avoid falsely low concentrations. Despite this, many iron overdoses have concentrations drawn immediately upon patient presentation. The utility of an iron concentration drawn before 4 h in assessing exposure risk is not clear. The purpose of this study is to determine whether patients' symptoms and iron concentrations obtained between 2 and 4 h can predict iron concentrations measured at or beyond 4 h.

Methods
This is a single-center, retrospective study of patient cases with a primary ingestion of oral iron reported to a Regional Poison Center from January 1, 2015 to January 1, 2020. The primary outcome is the incidence of an iron concentration of 350 μg/dL or greater at or beyond 4 h. Secondary outcomes include the incidence of antidotal deferoxamine administration, the incidence of an iron concentration > 500 μg/dL, the incidence of positive findings on abdominal radiography, and the time to the highest reported iron concentration.

Results
A total of 75 patients were included in this study. No patient who developed at most minor symptoms (abdominal discomfort, nausea, vomiting, or diarrhea without evidence of systemic toxicity) and had a 2-to-4-h concentration ≤ 300 μg/dL had a subsequent concentration > 350 μg/dL (negative predictive value [NPV] 100 %). Deferoxamine was used to treat five patients, all of whom reached concentrations > 300 μg/dL 2–4 h post-ingestion.

Conclusion
Patients with only minor GI symptoms and an iron concentration ≤ 300 μg/dL between 2 and 4 h post-ingestion are unlikely to develop further toxicity. In this case series, a concentration of 300 μg/dL or less between 2 and 4 h was the ideal cutoff for predicting subsequent potentially toxic concentrations, with a sensitivity of 100 % and a specificity of 54 %.
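
The reported NPV, sensitivity, and specificity follow the standard diagnostic-test definitions. As a sketch of how they apply here, assuming a "positive" early screen denotes a 2-to-4-h concentration > 300 μg/dL and the condition of interest is a subsequent concentration of 350 μg/dL or greater (an interpretation inferred from the stated cutoffs, not stated explicitly in the abstract):

\[
\text{Sensitivity} = \frac{TP}{TP + FN}, \qquad
\text{Specificity} = \frac{TN}{TN + FP}, \qquad
\text{NPV} = \frac{TN}{TN + FN}
\]

Under these definitions, a sensitivity and NPV of 100 % correspond to zero false negatives (no patient at or below the 300 μg/dL cutoff later exceeded the toxic threshold), while a specificity of 54 % indicates that roughly half of the patients who never reached a toxic concentration were nonetheless above the 300 μg/dL cutoff at 2–4 h.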