In the March and April issues of Pediatrics in Review, we published a two-part article on managing anemia in a pediatric office practice. This article expands on the various tests for iron deficiency, including some relatively new ones. These articles should be read as complementary.—RJH

Iron deficiency is the most common nutritional deficiency in the world, responsible for a staggering amount of ill health, lost productivity, and premature death. Although its prevalence in the United States has declined since the late 1960s, iron deficiency with or without anemia still is seen frequently in infants, toddlers, adolescent females, and women of childbearing age. In fact, iron deficiency anemia remains the most common hematologic disease of infants and children.

Anemia is defined as a hemoglobin (Hgb) concentration or red blood cell (RBC) mass that is low compared with age-specific norms. Anemia may be caused by decreased RBC production, increased RBC destruction, or blood loss. Based on the size of the RBC, hematologists categorize anemia as macrocytic, normocytic, or microcytic.

Iron is found in different compartments within the body. Storage iron (measured by serum ferritin), transport iron (measured by transferrin saturation), serum iron, and other hematologic and biochemical markers are used to describe the degree of iron deficiency. Iron depletion refers to the earliest stage of diminishing iron stores in the setting of insufficient iron supply. Iron deficiency (without anemia) develops as these iron stores are depleted further and begin to impair Hgb synthesis. Finally, iron deficiency anemia results when the iron supply is insufficient to maintain normal levels of Hgb.

According to current World Health Organization estimates, most of the world’s population may be iron-deficient, and at least one third (approximately 2 billion people) have anemia due to iron deficiency. As recently as the late 1960s, iron deficiency with or without anemia was highly prevalent in the United States. In 1971, the American Academy of Pediatrics Committee on Nutrition promoted the early use of iron-fortified formula instead of cow milk. One year later, the federal government introduced the Special Supplemental Food Program for Women, Infants, and Children (WIC) to address iron and other nutritional deficiencies.

These initiatives have had a tremendous impact on the health and well-being of children. In one review of children ages 6 to 60 months who were participating in public health programs (such as WIC), the prevalence of anemia declined from 7.8% in 1975 to 2.9% in 1985.

Unfortunately, iron deficiency with or without anemia remains relatively common in the United States. In the third National Health and Nutrition Examination Survey (1988 to 1994), 13% of 1-year-olds, 5% of 2-year-olds, 9% of adolescent females (12 to 15 y), and 11% of women of childbearing age (16 to 49 y) were iron-deficient. Iron deficiency anemia was present in 3% of the toddlers studied, 2% of the adolescent females, and 3% to 5% of the women of childbearing age. Despite efforts at prevention and early detection, severe cases still occur.

The high prevalence of iron deficiency anemia in developing countries most often is attributed to nutritional deficiencies worsened by chronic blood loss due to parasitic infections and malaria. In the United States and other industrialized nations, the most common cause of iron deficiency with or without anemia is insufficient dietary iron.
Infants, toddlers, adolescents, and pregnant women are particularly susceptible because of their relatively rapid growth and increased demand for iron.

The use of iron-fortified formula helps ensure adequate iron supplies for infants. However, toddlers often have diets that contain minimal amounts of iron-rich foods and large amounts of cow milk. The early introduction of whole cow milk (before 1 year of age) and consumption of greater than 24 oz of whole cow milk per day (after the first year of life) increase the risk of iron deficiency. Cow milk is low in iron and interferes with iron absorption. In addition, cow milk may cause occult gastrointestinal bleeding in some infants.

Adolescent females may become anemic due to menstrual losses. Some children develop anemia due to other causes of blood loss, such as Meckel diverticulum, chronic epistaxis, or inflammatory bowel disease.

Anemia seen during the first 2 to 3 months of life, termed physiologic anemia of infancy, is not due to iron deficiency and, therefore, does not respond to iron therapy. In preterm infants, this physiologic anemia, also called anemia of prematurity, appears at 1 to 2 months of age and is often more severe.

Iron, which is present in trace amounts in every cell in the body, performs several vital functions, including oxygen transport. Most of the body’s iron is used to make heme groups within the oxygen-carrying molecules Hgb and myoglobin. Iron also is essential for the biologic function of cytochromes and other enzymes involved in cellular respiration.

Iron is absorbed from the gastrointestinal tract and transported in the blood bound to transferrin. Excess iron is stored primarily in the liver, bone marrow, and spleen as ferritin.

The developing fetus builds iron stores from maternal supplies. Unless maternal iron deficiency is severe, a normal term infant is born having sufficient iron stores for at least 4 to 6 months of postnatal growth. During the first months of life, the newborn uses iron at a high rate for accelerated growth and expansion of blood volume. By 4 months of age, an infant’s iron stores have decreased by 50% (and birthweight usually has doubled). The preterm infant has less time to accumulate iron in utero and, therefore, is born with lower iron stores. In addition, the preterm infant has a demonstrably faster rate of postnatal growth than the term infant and may deplete iron stores within 2 to 3 months.

Adequate iron must be available to meet these demands. Although the majority of iron in the body is conserved and reused, some is lost through the gastrointestinal tract, skin, and urine. During the first year of life, normal infants need to absorb approximately 0.8 mg/d of dietary iron (0.6 mg for growth, 0.2 mg to replace ongoing losses).

Toward the end of the second year of life, this swift rate of growth begins to slow, so routine diets tend to include sufficient iron-rich foods to meet demands. Iron requirements increase again during adolescence due to rapid growth; adolescent females need additional iron to replace losses from menstruation.

There are two types of dietary iron: heme and nonheme. Heme iron already has been incorporated into the heme molecules of Hgb and myoglobin and is well absorbed by the body. Approximately 10% of the iron in a typical Western diet is heme iron, derived from meat, poultry, and fish. The majority of dietary iron is nonheme, in the form of iron salts.
The bioavailability (amount absorbed by the body) of nonheme iron is highly variable and influenced by several factors, including the current diet and the amount of iron already present in the body. Bran, dietary fiber, calcium, tannins (in tea and coffee), and oxalates, phytates, and polyphenols (in certain plant-based foods) inhibit iron absorption. Absorption is enhanced by reducing substances such as hydrochloric acid and ascorbic acid. The consumption of heme iron, even in small amounts, enhances the absorption of nonheme iron. Absorption of iron also is increased when total body stores are decreased or when the demand for iron increases, such as during adolescent growth spurts.

Mature human milk and cow milk contain the same amount of iron, approximately 0.5 mg/L; fortified formulas contain 10 to 13 mg/L. However, about 50% of the iron in human milk is absorbed, compared with only 10% from cow milk and less than 5% from iron-fortified formula. The reasons for the enhanced bioavailability of iron from human milk are not fully understood, but they may include the lower concentration of calcium and the higher concentration of ascorbic acid in human milk.
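To put these bioavailability figures in concrete terms, the short Python sketch below estimates the elemental iron actually absorbed per liter of each milk, using the contents and absorption fractions quoted above (the 12 mg/L midpoint for fortified formula and all names are ours, for illustration only):

# Estimated iron absorbed per liter, from the figures quoted above.
# Iron content (mg/L) and fraction absorbed; 12 mg/L is an assumed
# midpoint of the 10 to 13 mg/L range given for fortified formula.
MILKS = {
    "human milk":        (0.5, 0.50),
    "cow milk":          (0.5, 0.10),
    "fortified formula": (12.0, 0.04),   # "less than 5%" absorbed
}

def absorbed_iron_mg_per_liter(iron_mg_per_l: float, fraction: float) -> float:
    """Elemental iron absorbed (mg) per liter consumed."""
    return iron_mg_per_l * fraction

for name, (content, fraction) in MILKS.items():
    print(f"{name}: {absorbed_iron_mg_per_liter(content, fraction):.2f} mg absorbed/L")

Run as written, this prints roughly 0.25 mg/L for human milk, 0.05 mg/L for cow milk, and 0.48 mg/L for fortified formula, illustrating why fortification more than compensates for the low fractional absorption.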
The signs and symptoms of iron deficiency with and without anemia depend on the degree of deficiency and the rate at which the anemia develops. Children who have iron deficiency or mild-to-moderate anemia may show few, if any, signs or symptoms. Pallor is the most frequent sign of iron deficiency anemia. As the degree of anemia worsens, fatigue, exercise intolerance, tachycardia, cardiac dilatation, and systolic murmurs may develop. Splenomegaly can be found in 10% to 15% of affected patients. Infants and toddlers may demonstrate irritability and anorexia. However, even severe anemia may be asymptomatic; in one study of severe cases, 45% were diagnosed incidentally.

Iron deficiency anemia in infancy and early childhood is associated with developmental delays and behavior disturbances that may be irreversible. Numerous studies have documented lower test scores of mental and motor development among infants who had iron deficiency and iron deficiency anemia. In some follow-up studies, test results were normal after reversal of the anemia, but in others, developmental delays persisted despite adequate treatment. The extent and persistence of brain involvement seem to depend on the age at which anemia first develops as well as its degree and duration. Although additional study is needed, the evidence linking iron deficiency and cognitive impairment is compelling. Iron supplements even have been shown to improve learning and memory in nonanemic iron-deficient adolescent females.

Iron deficiency anemia also is associated with poor growth and may produce other systemic abnormalities, such as blue sclerae, koilonychia, angular stomatitis, increased susceptibility to infection, and functional alterations in the gastrointestinal tract. Iron deficiency increases lead absorption and has been associated with pica, which may result in plumbism.

The differential diagnosis for anemia in children is broad, but it narrows once the anemia is classified further as microcytic. Iron deficiency and thalassemia minor are the most common causes of microcytic anemia in children. Microcytosis also results from lead poisoning, chronic disease (eg, inflammation, infection, cancer), sideroblastic anemia, and other rare conditions.

An array of tests can be used for evaluating anemia, but there is no single “best” test to diagnose iron deficiency with or without anemia. The “gold standard” for identifying iron deficiency is a direct test: bone marrow aspiration with Prussian blue staining. However, bone marrow aspiration is too invasive for routine use, so indirect assays generally are used. Hematologic tests are based on RBC features (eg, Hgb, mean corpuscular volume [MCV]), and biochemical tests are based on iron metabolism (eg, zinc protoporphyrin [ZPP], serum ferritin concentration). Hematologic tests generally are more readily available and less expensive than biochemical tests. However, biochemical tests detect iron deficiency before the onset of anemia and, therefore, may be worth the additional expense because the deleterious effects of iron deficiency appear to begin before anemia develops. A new hematologic test, reticulocyte hemoglobin content (CHr), may help diagnose iron deficiency before anemia is present.

The various hematologic and biochemical parameters used for screening and diagnosis are discussed below. Tables 1 and 2 summarize the values for these parameters along the spectrum from normal to iron deficiency anemia. In most cases, results from several tests are necessary to make a definitive diagnosis.

Measurement of Hgb, the concentration of oxygen-carrying protein, is a more sensitive and direct test for anemia than measurement of hematocrit (Hct), the percentage of whole blood occupied by RBCs. Anemia generally is defined as an Hgb value below the 5th percentile for a healthy reference population: less than 11.0 g/dL (110 g/L) for children ages 6 months to 2 years. Both are inexpensive, readily available tests for anemia and are used most commonly to screen for iron deficiency. However, Hgb and Hct are late markers of iron deficiency, are not specific for iron deficiency anemia, and become less predictive as the prevalence of iron deficiency anemia decreases.

The MCV, the average volume of RBCs, is reported in automated analyses, but it also can be calculated as the ratio of Hct to RBC count. The MCV is useful for categorizing anemia as microcytic, normocytic, or macrocytic.

The red blood cell distribution width (RDW) measures variation in the size of RBCs and increases with iron deficiency. In one study of adults, a high RDW (>15%) was 71% to 100% sensitive and 50% specific in diagnosing iron deficiency. Another study of 12-month-old infants found that a high RDW (>14%) was 100% sensitive and 82% specific. Because of its relatively low specificity, the RDW alone is not as useful as a screening test, but it is used frequently in conjunction with the MCV to differentiate among the various causes of anemia. For example, the RDW is high in iron deficiency anemia but typically normal in thalassemia minor.
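As a rough illustration of how these indices combine, the Python sketch below computes the MCV from the Hct and RBC count and applies the RDW pattern just described. The MCV and RDW cutoffs are illustrative assumptions rather than values taken from this article; real evaluation uses age-specific reference ranges.

def mcv_fl(hct_percent: float, rbc_millions_per_ul: float) -> float:
    """MCV (fL) = Hct (%) x 10 / RBC count (millions/uL)."""
    return hct_percent * 10 / rbc_millions_per_ul

def microcytic_pattern(hgb_g_dl: float, mcv: float, rdw_percent: float) -> str:
    """Crude pattern match based on the RDW discussion above (not diagnostic)."""
    if hgb_g_dl >= 11.0:      # the article's cutoff for ages 6 months to 2 years
        return "not anemic by this cutoff"
    if mcv >= 75:             # assumed microcytosis cutoff, for illustration
        return "anemic but not microcytic; consider other causes"
    if rdw_percent > 14:      # high RDW favors iron deficiency
        return "microcytic anemia with high RDW; suggests iron deficiency"
    return "microcytic anemia with normal RDW; consider thalassemia minor"

# Example: Hct 30% with RBC 4.2 million/uL gives an MCV of about 71 fL.
mcv = mcv_fl(30.0, 4.2)
print(round(mcv, 1), microcytic_pattern(9.8, mcv, 16.0))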
The reticulocyte count measures circulating immature RBCs and decreases with iron deficiency. However, the reticulocyte count increases with blood loss, so in severe cases of iron deficiency anemia coupled with blood loss, it may be slightly elevated. This parameter often is used to assess the response to iron supplements.

CHr, the Hgb content of reticulocytes, can be measured in some hematology laboratories by using the same automated flow cytometer that provides RBC and reticulocyte indices. CHr has been shown to be an early indicator of iron deficiency in healthy subjects receiving recombinant human erythropoietin. A retrospective laboratory analysis of samples from 210 children showed that low CHr was the best predictor of iron deficiency compared with Hgb, MCV, serum iron, RDW, and transferrin saturation.

Ferritin is a storage compound for iron, and serum ferritin levels normally correlate with total iron stores. As iron stores are depleted, serum ferritin declines; it is the earliest marker of iron deficiency. Serum ferritin has high specificity for iron deficiency, especially when combined with other markers such as Hgb. However, the test is expensive and has limited availability in the clinic setting; therefore, it is not used commonly for screening. In addition, serum ferritin is an acute-phase reactant that can become elevated in the setting of inflammation, chronic infection, or other diseases.

Serum iron concentration can be measured directly and generally decreases as iron stores are depleted. However, serum iron may not reflect iron stores accurately because it is influenced by several additional factors, including iron absorption from meals, infection, inflammation, and diurnal variation.

Total iron-binding capacity (TIBC) measures the availability of iron-binding sites. Extracellular iron is transported in the body bound to transferrin, a specific carrier protein; hence, TIBC indirectly measures transferrin levels, which increase as serum iron concentration (and stored iron) decreases. Unfortunately, this test also is affected by factors other than iron status. For example, TIBC is decreased with malnutrition, inflammation, chronic infection, and cancer.

Transferrin saturation (Tfsat) indicates the proportion of occupied iron-binding sites and reflects iron transport rather than storage. Tfsat is calculated from two measured values: serum iron concentration divided by TIBC, expressed as a percent. A low Tfsat implies a low serum iron level relative to the number of available iron-binding sites, suggesting low iron stores. Tfsat decreases before anemia develops, but not early enough to identify iron depletion. Tfsat is influenced by the same factors that affect TIBC and serum iron concentration and is less sensitive to changes in iron stores than is serum ferritin.
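Because Tfsat is derived entirely from two measured values, the arithmetic is a one-line calculation; the Python sketch below shows it with hypothetical example values (no cutoff for a low result is given here, because reference ranges are age-specific):

def transferrin_saturation(serum_iron_ug_dl: float, tibc_ug_dl: float) -> float:
    """Tfsat (%) = serum iron / TIBC x 100; both inputs in the same units."""
    return serum_iron_ug_dl / tibc_ug_dl * 100

# Example: serum iron 30 ug/dL with a TIBC of 450 ug/dL (TIBC rises as iron
# stores fall) yields a Tfsat of about 6.7%, suggesting low iron stores.
print(f"{transferrin_saturation(30, 450):.1f}%")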
Serum transferrin receptor (TfR) can be measured in some laboratories via immunoassay. This receptor is present on reticulocytes and is shed from the membrane as the reticulocyte matures. With tissue iron deficiency, there is a proportional increase in the number of transferrin receptors. Although not a readily available test, TfR is useful as an early marker of iron deficiency and also may differentiate between iron deficiency anemia and the anemia of chronic disease. In one study, TfR was increased in patients who had iron deficiency anemia but not in patients who had anemia in the setting of acute infection.

ZPP is formed when zinc is incorporated into protoporphyrin in place of iron during the final step of heme biosynthesis. Under normal conditions, the reaction with iron predominates, but when iron is in short supply, the production of ZPP increases and the ZPP/heme ratio becomes elevated. ZPP/heme therefore reflects iron status during Hgb synthesis and detects iron deficiency before the onset of anemia. This test is reported most accurately as the ZPP/heme ratio, but it also is reported simply as ZPP. ZPP is not the same as free erythrocyte protoporphyrin (FEP, or erythrocyte protoporphyrin), which is created in the laboratory when zinc is stripped from ZPP and also is used as a marker of iron deficiency without anemia. Although ZPP and FEP can be measured by using affordable, clinic-based methods, both are elevated with lead poisoning and chronic disease, making them less specific for iron deficiency.

Dietary history may be suggestive of iron deficiency and has been studied as a possible marker for microcytic anemia. In one study of healthy inner-city children between the ages of 15 and 60 months, dietary iron deficiency was defined as: 1) fewer than 5 servings per week each of meat, cereals or bread, vegetables, and fruit; 2) more than 16 oz per day of milk; or 3) daily intake of fatty snacks or sweets or more than 16 oz of soda. As a screening test for microcytic anemia, this diet history had 71% sensitivity, 79% specificity, and a 97% negative predictive value. Similarly low specificity was demonstrated in another prospective study that used a questionnaire to assess diet, WIC participation, and medical and family history.

Because of its low specificity for iron deficiency anemia, dietary history cannot eliminate the need for laboratory testing; in practice, most clinicians rely on the response to a trial of iron therapy as a practical method of diagnosing iron deficiency anemia. However, dietary history may be useful in identifying children at low risk for iron deficiency (given its high negative predictive value) and is essential for the prevention and management of iron deficiency anemia.

Clinicians play an essential role in preventing iron deficiency and its related medical and developmental problems. Primary prevention involves counseling at routine health supervision visits from infancy through adolescence to help ensure that ingestion of dietary iron is adequate (Table 3). Secondary prevention involves regular screening, prompt diagnosis, and appropriate treatment of iron deficiency.

Hgb and Hct are the most commonly used screening tests for iron deficiency. They are readily available and cost-effective markers of anemia. However, the usefulness of anemia as a marker for iron deficiency depends on the prevalence of iron deficiency anemia in the population: higher prevalence improves the positive predictive value of anemia as a screening test for iron deficiency.

The American Academy of Pediatrics recommends universal screening with Hgb or Hct once between 9 and 12 months of age and again 6 months later in communities and populations that have a high prevalence of iron deficiency anemia, including children eligible for WIC, children of migrant workers, and recently arrived refugee children. For communities that have low rates of anemia, selective screening at the same intervals is recommended for children at risk for iron deficiency, including preterm or low-birthweight infants, infants fed non-iron-fortified formula, infants introduced to cow milk before age 12 months, breastfed infants who receive inadequate dietary iron after age 6 months, and children who consume more than 24 oz of cow milk per day.

After 2 years of age, routine screening usually is not necessary. Instead, risk can be assessed regularly, with screening reserved for children who have a previous history of iron deficiency, evidence of low iron intake, or special health needs that increase the risk for iron deficiency (eg, chronic infection, inflammatory disorders, chronic or acute blood loss, restricted diets, use of medications that interfere with iron absorption).

The American Academy of Pediatrics recommends screening all adolescents once between the ages of 11 and 21 years and screening menstruating females annually. The Centers for Disease Control and Prevention recommend annual screening of adolescent females whose risk is increased (eg, excessive menstrual blood loss, low iron intake, a previous diagnosis of iron deficiency) and screening for anemia every 5 to 10 years otherwise.
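The infant and toddler recommendations above reduce to simple risk-based logic. The Python sketch below encodes one reading of them; the parameter names are illustrative, and this is a summary of the text, not a validated clinical rule.

def needs_hgb_screen(high_prevalence_community: bool,
                     preterm_or_low_birthweight: bool,
                     noniron_fortified_formula: bool,
                     cow_milk_before_12_months: bool,
                     breastfed_low_iron_after_6_months: bool,
                     cow_milk_over_24_oz_per_day: bool) -> bool:
    """Screen once at 9 to 12 months and again 6 months later: universally in
    high-prevalence communities, selectively for the listed risk factors."""
    if high_prevalence_community:
        return True
    return any([preterm_or_low_birthweight,
                noniron_fortified_formula,
                cow_milk_before_12_months,
                breastfed_low_iron_after_6_months,
                cow_milk_over_24_oz_per_day])

# Example: a term infant introduced to cow milk at 10 months should be screened.
print(needs_hgb_screen(False, False, False, True, False, False))  # True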
In the appropriate clinical setting, an abnormally low Hgb or Hct combined with a dietary history of low iron intake strongly suggests iron deficiency anemia. Further laboratory testing, such as measurement of serum ferritin, can help to confirm the diagnosis but in most cases is not necessary; response to a therapeutic trial of supplemental iron is considered clinically diagnostic. If a child whose diet contains adequate servings of iron-rich foods is anemic, additional evaluation may be indicated to look for blood loss (eg, occult rectal bleeding).

Presumptive iron deficiency is treated with oral iron salts, most commonly over-the-counter ferrous sulfate, which is inexpensive and relatively well absorbed. Dosages are calculated as elemental iron: children receive 3 to 6 mg/kg per day (qd or tid), and adolescents receive 60 mg per dose (qd or bid). If the iron deficiency is nutritional, the response to iron typically is rapid. Parenteral iron can be given if oral iron is not tolerated, although intramuscular iron injections usually are not appropriate. Erythrocyte transfusion should be used only if the anemia is causing severe cardiovascular compromise; hypervolemia and cardiac dilatation may result from rapid correction of the anemia.

After 1 month of therapy, the Hgb measurement should be repeated. An increase of 1 g/dL (10 g/L) or greater confirms the diagnosis of iron deficiency anemia. No improvement in Hgb should prompt further evaluation of the anemia with additional laboratory tests, including MCV, RDW, and serum ferritin, and a search for possible sources of blood loss. Iron therapy should be continued for an additional 2 to 3 months after the Hgb has returned to a normal level, and the Hgb should be remeasured approximately 6 months after discontinuation of iron therapy.
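As a worked example of this dosing and follow-up arithmetic, the Python sketch below computes an elemental-iron dose, converts it to a quantity of ferrous sulfate, and applies the 1 g/dL response criterion. The 20% elemental-iron content assumed for ferrous sulfate (the heptahydrate salt) is our addition for illustration, not a figure from this article.

# Elemental-iron fraction of ferrous sulfate heptahydrate (an assumption
# for illustration; dried ferrous sulfate contains a higher fraction).
ELEMENTAL_FRACTION_FESO4 = 0.20

def daily_elemental_iron_mg(weight_kg: float, mg_per_kg: float = 3.0) -> float:
    """Children receive 3 to 6 mg/kg per day of elemental iron."""
    return weight_kg * mg_per_kg

def ferrous_sulfate_mg(elemental_mg: float) -> float:
    """Milligrams of ferrous sulfate needed to supply the elemental dose."""
    return elemental_mg / ELEMENTAL_FRACTION_FESO4

def confirms_iron_deficiency(hgb_before_g_dl: float, hgb_after_g_dl: float) -> bool:
    """A rise of 1 g/dL or more after 1 month of therapy confirms the diagnosis."""
    return (hgb_after_g_dl - hgb_before_g_dl) >= 1.0

# Example: a 12 kg toddler started at 3 mg/kg per day needs about 36 mg of
# elemental iron, or roughly 180 mg of ferrous sulfate; an Hgb rising from
# 9.5 to 11.0 g/dL after 1 month confirms iron deficiency anemia.
dose = daily_elemental_iron_mg(12)
print(dose, ferrous_sulfate_mg(dose), confirms_iron_deficiency(9.5, 11.0))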

The evidence is clear that early diagnosis and adequate treatment of iron deficiency are critical to preventing or reversing its negative medical and behavioral effects. As advocates for children, pediatricians must screen actively and accurately for this common nutritional deficiency. Hgb and Hct are the most readily available and cost-effective screening tests, but newer tests that detect iron deficiency before the onset of anemia (eg, CHr) are being studied prospectively in healthy infants and may gain widespread acceptance, particularly as the prevalence of iron deficiency anemia decreases.