Abstract

Despite a plethora of papers, reports and consensus statements during the last 25 years concerning the high prevalence and complications of iron deficiency (ID), the problem is still with us. A recent national study in Britain showed that 12% of 2-year-olds were anaemic, rising to 29% in Asian immigrant groups (Lawson et al, 1998). What can be done about it? This paper reviews the detection and prevention of iron deficiency anaemia (IDA), referring mainly to studies published in the last 5 years. Earlier substantial reviews of iron nutrition are available (Brock et al, 1994; British Nutrition Foundation, 1995; Hallberg & Asp, 1996). Reviews specifically related to children include those by Oski (1993) and Wharton (1999a). Detection is divided into indications for investigation, which investigations to apply, and their interpretation.

There are obvious indications for determining iron status in some clinical presentations. An example is suspected malabsorption: one study found that an oral iron absorption test was more sensitive as a screening test for upper intestinal absorption than the commonly used D-xylose method (Stahlberg et al, 1991; De Vizia et al, 1992). Other deficiencies often coexist with ID, partly because a poor diet may have many deficiencies but also because of micronutrient interaction, e.g. ID with deficiencies of vitamin A or D (Gujral & Gopaldesas, 1995; Underwood & Arthur, 1996; Wharton, 1999b). ID may play a role in, or complicate, such diverse disorders as ischaemic stroke, apparent asthma, cyanotic heart disease and gastric trichobezoar (Hartfield et al, 1997; Hetzel & Losek, 1998; Olcay et al, 1996; Phillips et al, 1998). In my view any child reaching hospital as an outpatient or inpatient should have their haemoglobin level and RBC indices determined. Some specialities argue this is unnecessary since the haemoglobin distribution in their patients is no different from that in the general population (e.g. in otolaryngology; Heaton et al, 1991), but this seems a missed opportunity for opportunistic screening for a common disorder. Clinical signs are helpful only in severe anaemia, although surveys show that pale conjunctivae (sensitivity 74%) and pale nail beds (specificity 96%) are useful signs (Thaver & Baig, 1994). Blue sclerae might be an additional sign (Beghetti et al, 1993).

The highest prevalence of IDA occurs in toddlers and adolescents because the increment in haemoglobin iron per unit body weight is greatest at these ages (see Fig 1).

[Fig 1. Changes in body iron during development: (a) total body iron and haemoglobin iron (mg) in males except where shown; (b) daily increment in body iron (mg/d): solid line, male; dashed line, female; dotted line, female plus menstrual loss; (c) proportional daily increment in body iron (μg/kg/d); symbols as for (b).]

Two points should be noted. First, there is little increase in total body iron in the first 4 months or so of life: as the haemoglobin falls from around 18 g/dl at birth to 14 g/dl during the first 2 weeks of life, the liberated iron is stored and then gradually reused as the total mass of circulating haemoglobin begins to increase with growth. Between 4 and 12 months total body iron increases by about 130 mg, and an external source of iron is necessary; if this need is not met, ID occurs and frank anaemia develops, usually after the first birthday. Note also that boys need more iron at adolescence because of the increase in muscle and myoglobin.
Subsequently these increased requirements due to changes in body composition subside, but increased requirements continue in girls following menarche.

Infants who continue to receive only breast milk after the first 6 months of life are at increased risk. Breast feeding may continue after 6 months without difficulty so long as other foods providing available iron are introduced. Also at risk are infants who, despite current policy, are changed from an infant formula to whole cows' milk before the age of 1 year. It is not clear whether the higher prevalence of IDA in these infants is mainly the effect of an inadequate intake of dietary iron or is due in addition to increased intestinal iron loss (Ziegler et al, 1990; Fuchs et al, 1993a, b). In toddlers attending well-child facilities in Cleveland, U.S.A., a simple dietary history predicted microcytic anaemia (sensitivity 71%, specificity 79%), but a quarter of the anaemic children were not identified (Boutry & Needlman, 1996). A community study in Sydney found that a low consumption of meat (i.e. haem iron) and the introduction of whole cows' milk before the first birthday were significant indicators of ID (Mira et al, 1996).

Many adolescent girls try to control their weight and inadvertently limit iron intake. This was particularly marked 10 years ago in British girls who bought snacks from local shops rather than eating school lunch or food from home, but there has been evidence of improvement since then (Department of Health, 1989; Moynihan et al, 1994; Southon et al, 1994; Doyle et al, 1994). Many adolescents pass through a temporary period of vegetarianism because of concerns about animal welfare. Although adequate iron nutrition is achievable on a vegetarian diet, it must provide iron sources (such as pulses) and enhancers of absorption (e.g. vitamin C, and fish or poultry if acceptable), and temporary amateur vegetarianism may not ensure sufficient absorbed iron.

Preterm babies are born with a lower concentration of haemoglobin, so any physiological haemolysis liberates less iron for stores; erythropoietin, if given, increases iron requirements, and so does catch-up growth. Light-for-gestational-age babies often have a raised haemoglobin at birth reflecting intrauterine hypoxia, so initially, post-haemolysis iron stores are higher, but the rapid catch-up growth increases demands. In a normal term baby the total haemoglobin iron doubles during the first year of life (from 180 mg at birth to 340 mg at 1 year). In a preterm 1 kg baby the increase is 6-fold (50–300 mg); in a 2 kg baby born at term the increase is 3-fold (110–330 mg).

Children of immigrants or refugees have a higher prevalence of iron deficiency, presumably due to such factors as socio-economic deprivation (living in inner-city areas with overcrowding and limited parental income), language difficulties (health education is difficult), unfamiliarity with foods available in the new environment (often a tendency to rely on milk and puddings), and food customs which are difficult to follow (e.g. halal meat for Muslims may not be easily available, so children are given a meat-free diet by a mother inexperienced in providing a balanced vegetarian diet). In a nationally representative survey of British 1.5–2.5-year-olds, 12% had IDA, but the prevalence was higher among children of immigrant families: India (20%), Pakistan (27%) and Bangladesh (29%) (Lawson et al, 1998).
Other recent reports describe the problem in children from South-East Asia, Latin America and Eastern Europe living in the U.S.A., Norway and Switzerland (Graham et al, 1997; Sargent et al, 1996). Athletic performance, particularly endurance sport, may lead to blood loss from the gut and urinary tract (Robertson et al, 1987; Haymes & Lamanca, 1989). Therefore athletic girls who have passed menarche and are trying to slim may be at particular risk of iron deficiency.

The staging of iron status by Oski et al (1983) is a useful concept, and various measurements can be used to define the stages:
(1) Normal: iron stores and erythropoiesis normal.
(2) Iron depletion: erythropoiesis normal but iron stores reduced (serum ferritin <12 μg/l), indicating a reduction of iron in the bone marrow, liver and other parts of the reticuloendothelial system (note that the exact cut-off point for normal/abnormal ferritin depends on the method used; a reference ferritin preparation to calibrate the assay is recommended).
(3) Iron-deficient erythropoiesis: (i) abnormal RBC biochemistry (free erythrocyte protoporphyrin (EPP) >99 μmol/mol haem; serum transferrin receptor raised, e.g. >8.5 mg/l, although the exact cut-off depends on age and the assay used); (ii) abnormal RBC morphology (microcytosis, MCV <80 fl, varying with age; anisocytosis, RDW >15%); (iii) transport iron reduced (transferrin saturation <10%).
(4) Iron deficiency anaemia: the above plus haemoglobin <11 g/dl.

There is no evidence that iron depletion or iron-deficient erythropoiesis alone has any adverse clinical effects, whereas iron deficiency anaemia is associated with alterations of immunological, gut and mental function. In the recent NHANES survey in the U.S.A. (Dallman et al, 1996) ID was defined as the presence of two or more abnormal measurements as shown in Table I. Although one can argue about the exact cut-off points used and the need for two abnormal characteristics, this battery of tests was applied to a large number of children, and the haemoglobin ranges produced (i.e. after excluding children with more than one abnormal test) are suitable reference standards (see Table I). A recent survey in Bristol suggested a cut-off point for haemoglobin concentration in 12- and 18-month-olds as low as 10 g/dl, but no attempt was made to exclude iron-deficient children and the haemoglobin method used was the Hemocue B-Hb photometer (Sherriff et al, 1999).

It would be impractical, however, to use initially the whole battery of investigations shown in Table I, and simpler approaches for population and individual studies have been suggested. Yip et al (1996) suggested that haemoglobin concentrations alone could be used in populations: the distributions of haemoglobin are determined in children and adults. If the distribution is shifted to the left in both children and women of child-bearing age, but not in men, then iron deficiency is likely. If the distribution is shifted to the left in men as well, then other factors are probably also operating, e.g. malaria or hookworm. This approach has been used to diagnose dietary ID in Pakistan, iron losses from hookworm in Zanzibar, and iron losses (from undetermined causes) in Alaskan natives (Petersen et al, 1996). The strategy has been questioned in Thailand for children >5 years of age, in whom anaemia was rarely associated with a low plasma ferritin (Linpisarn et al, 1996).

Electronic counters based on impedance or light scattering are in common use in developed countries, and the likelihood of iron deficiency may then be assessed from the indices.
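As an illustration of how such counter indices and the cut-offs quoted above might be combined, the following is a minimal sketch in Python. It is not a validated clinical tool: the cut-off values vary with age and method, the function name and structure are illustrative only, and a confirmatory measurement (ferritin or serum transferrin receptor) would still be needed.

def probable_iron_deficiency_anaemia(hb_g_dl, mcv_fl, rdw_pct):
    """Flag a microcytic, anisocytic anaemia that warrants iron studies."""
    anaemia = hb_g_dl < 11.0       # haemoglobin cut-off quoted in the text
    microcytosis = mcv_fl < 80.0   # MCV cut-off (varies with age)
    anisocytosis = rdw_pct > 15.0  # RDW cut-off
    # echo the NHANES convention of requiring more than one abnormal measurement
    return anaemia and (microcytosis or anisocytosis)

# Example: Hb 9.8 g/dl, MCV 68 fl, RDW 17% -> True, so ferritin or serum
# transferrin receptor measurement would be the next step
print(probable_iron_deficiency_anaemia(9.8, 68.0, 17.0))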
Increasing use is now made of histogram distributions of red blood cell volume rather than only arithmetic summaries of size and variation in size such as mean corpuscular volume (MCV) and red cell distribution width (RDW). With some methods 'red cell cytograms' are also available, in which red cell volume is plotted against red cell haemoglobin concentration for all red cells counted. Walters & Abelson (1996) have described the interpretation of the full blood count and indices, possible artefacts (due to cold agglutinins, high white cell counts and hyperosmolar plasma), crude checks for internal consistency (haemoglobin in g/dl is about 3 × the RBC count; the PCV, whether calculated or measured, is about 3 × the haemoglobin) and its interpretation in children. Hinchcliffe & Helliwell (1993) have described the use of distribution histograms and cell cytograms in children. Typically in iron deficiency anaemia Hb and MCV are reduced, RDW is increased (i.e. microcytosis and anisocytosis), red cell haemoglobin distribution width (HDW) is increased (i.e. anisochromia), and the 'shape' of the cell cytogram scatter is moved down and to the left, with a large proportion of cells in the hypochromic microcytic zone. During a response to iron treatment double peaks are seen in the histograms for red cell volume and red cell haemoglobin, and the cytogram shows more cells in the normocytic normochromic zone. The application of these more sophisticated methods to population screening has not been evaluated.

Mates et al (1995) argued that a full electronic counter blood screen is a 'watchdog of community health'. In their Israeli series, including adults as well as children, 1% had microcytosis, mostly due to iron deficiency (58%) and thalassaemia minor (35%). Similarly Kim et al (1996) recommend that MCV and RDW be routinely determined, which increases the predictive value for iron deficiency to 98%. Choi & Reid (1998) found RDW a useful predictor of red cell disease in the well-baby clinic.

EPP alone (or zinc protoporphyrin, ZPP) has been used for screening and as an indication for a therapeutic trial of iron in some American paediatric practices (Benjamin et al, 1991; Siegal & Lagrone, 1994). In iron deficiency zinc fills the iron pocket in the protoporphyrin molecule. ZPP determination requires only 20 μl of blood and is easily measured in a haematofluorimeter. It also remains abnormal for a week or so, even if iron therapy has been commenced before the test. However, it is also abnormal in the anaemias of inflammation and in lead poisoning.

Serum ferritin may also be determined on small blood samples, but careful standardization of methods and use of a reference ferritin preparation for calibration are necessary (Worwood, 1997). It is raised during acute infections, chronic disease and liver disease irrespective of the iron stores, but iron deficiency is the only cause of a low concentration.

Serum transferrin receptor concentration has attracted considerable interest. The concentration reflects the number of transferrin receptors on immature red cells and so in most instances also reflects the rate of bone marrow erythropoiesis. Iron deficiency, however, also results in a disproportionate increase in the concentration (Huebers et al, 1990). An increased concentration provides an early and sensitive indicator of functional iron deficiency, sometimes before the plasma ferritin has fallen (Skikne et al, 1990; Worwood, 1995, 1997). A major advantage is that it remains normal in many chronic disorders if iron deficiency is not present.
However, it is raised in the thalassaemias even though iron deficiency is not present. As in adults, in infants and in 11–12-year-old boys higher concentrations of the receptor were associated with a lower serum ferritin, even within the normal physiological range for ferritin (Virtanen et al, 1999). However, its use as an index of iron deficiency in infancy and adolescence has been questioned (Kuiper-Kramer et al, 1998; Kling et al, 1998; Kivivuori et al, 1993; Kuizon et al, 1996). It would be unwise to use the test alone without other measurements of ID as well.

Hereditary causes of microcytosis, inflammation and various chronic diseases, and occasionally lead poisoning, may cause difficulties of interpretation. Apart from the thalassaemias, hereditary causes of microcytosis are quite rare. Most are associated with iron overload of tissues, but a small number of children have been described with ID because of a defect in absorption (Table II). The RBC in IDA and the thalassaemias have similar indices. The degree of anisocytosis, and hence the RDW, is usually higher in IDA, particularly in relation to the degree of microcytosis. Various mathematical ratios of red cell indices have been suggested to help the differentiation. Using cytometry plots, the proportion (%) of hypochromic cells is greater than the proportion of microcytic cells in iron deficiency, whereas the reverse is true in thalassaemia (d'Onofrio et al, 1992). In thalassaemia there may also be an increase in hypochromic macrocytes. When there is any possibility of a thalassaemia, however, it is usually better to proceed directly to haemoglobin electrophoresis and A2 determination, although iron deficiency in association with thalassaemia may temporarily mask the characteristic changes in A2 and HbF. In a United States study, half of people (aged 15–49 years) with beta-thalassaemia trait had a raised ZPP, and so did a quarter of those with haemoglobin E or alpha-thalassaemia trait, suggesting that ZPP may be abnormal in the thalassaemia traits, but the exact iron status of the subjects was not defined (Graham et al, 1996).

The anaemias of infection and chronic diseases are classically normochromic and normocytic, but hypochromia and microcytosis occur in about a third of infected children. Even after a mild infection many measurements move in the same direction as in ID, but ferritin rises and serum transferrin receptor remains normal (see reviews by Abshire, 1996, and Walter et al, 1997). On the other hand, tropical infection such as malaria did not interfere with the use of EPP and ferritin in the diagnosis of ID in Zanzibari schoolchildren (Stoltzfus et al, 1997). In chronic disease such as rheumatoid arthritis interpretation is difficult. Although measurements suggesting ID may be due to cytokine activity, true ID (demonstrated by bone marrow examination) may also occur. Moreover, serum transferrin receptor is raised in a number of patients, suggesting iron-deficient erythropoiesis which may respond to intravenous iron saccharate (Cazzola et al, 1996).

In lead poisoning the anaemia is classically normocytic and normochromic, but ID is often present as well, modifying the RBC morphology. ZPP is raised, not because of insufficient iron to join protoporphyrin to form haem but because ferrochelatase, which catalyses the reaction, is inhibited by lead. There is variable evidence linking iron deficiency with an increased risk of lead poisoning (Sargent et al, 1995; Hammad et al, 1996).
Some years ago anaemia with pica would prompt investigation for both disorders, but as environmental lead has decreased the diagnosis is less common. If in doubt, the blood lead level should be determined and the response to iron noted. Lead diuresis and the reduction in blood lead following chelation treatment are less in iron-deficient children (Ruff et al, 1996; Markowitz et al, 1997).

Screening is a form of secondary prevention, detecting and treating the disorder early before serious problems occur. If screening is timed at the age of peak incidence (around 18 months), many children will already have been anaemic for some months. If it is introduced earlier, e.g. at 13 months at the time of MMR immunization, a number of children who are not anaemic then will become so some months later; in Bristol a quarter of children found to be anaemic at the age of 2 years had not been so at 13 months (James et al, 1993). An ideal age for screening is therefore not apparent. All children in particular groups might be screened if the prevalence of IDA is high in the group, e.g. in inner-city areas, children of immigrant or refugee families, exclusively breast-fed 10-month-olds, and toddlers in whom cows' milk was the main drink before 12 months of age. Various bodies have suggested strategies: government-funded bodies such as the Centers for Disease Control (1998) in Atlanta and the Department of Health (1994) in the U.K.; non-government organizations such as the British and Swedish Nutrition Foundations; and many individual nutritionists and paediatricians (Wharton, 1999a; Ziegler & Fomon, 1996).

Blood loss should be avoided. Effective umbilical clamping with devices which tighten as the cord withers usually prevents cord haemorrhage. The time of clamping may affect subsequent ID: in Guatemala, infants in whom the cord was not clamped until pulsation had stopped had a higher haematocrit at 2 months of age, although this manoeuvre had no effect on serum ferritin at 3 months of age in Indian children (Grajeda et al, 1997; Geethanath et al, 1997). Since three-quarters of the iron 'stored' at birth is in haemoglobin, perinatal blood loss is a potent cause of anaemia in early and later infancy. Generally the other stores in newborns show little relationship to the mothers' iron status, although some studies in both the developed and developing world have shown one. Two papers have shown that poorer maternal iron status in pregnancy is associated with a poorer iron status in the infants at 1 year of age (Strauss, 1996; Colomer et al, 1990). This could reflect a longer-term effect of reduced iron stores or that both mother and child have received an iron-deficient diet.

For normal-sized babies there is little concern during the first 4–6 months because total body iron does not increase during this time. If ID occurs then abnormal blood loss should be considered; this may occur in the perinatal period (e.g. feto-maternal transfusion, cord accident) or later (e.g. reflux oesophagitis, bleeding from ectopic gastric mucosa in a Meckel's diverticulum, or rarely the allergic colitis of infancy). Breast feeding is encouraged or, failing that, a modern infant formula is used. From the age of 6 months a dietary source of iron is necessary. For bottle-fed babies this is easily provided by continued use of an iron-fortified infant formula or introduction of a follow-on formula, all of which are iron fortified. Breast milk alone will not supply the extra iron, although absorption of the small amount of iron present is high.
This is less critical in bottle-fed babies receiving an iron-fortified formula or a follow-on milk, and bottle-fed babies tend to receive weaning foods from an earlier age than breast-fed ones (White et al, 1990). Although there is evidence that too early an introduction of solid foods interferes with iron absorption from breast milk (Pisacane et al, 1995), they should be introduced from 6 months of age. Meat, because of its haem iron, is an excellent choice, providing zinc as well, which may also become a limiting nutrient in prolonged breast feeding. A Danish study showed that an intake of 27 g of meat a day from the age of 8 months (compared with an intake of 10 g daily) led to smaller falls in haemoglobin in later infancy, although there were no effects on serum ferritin or transferrin receptor concentrations (Engelmann et al, 1998). Unfortunately many mothers who continue to breast feed their older infants choose vegetarian weaning foods from which iron is less available. Convenience weaning foods are widely available in the Western world, and some are fortified with iron. Wide use of these foods provided a more satisfactory diet (more iron, less protein, salt and sugar) than a home-made diet alone (Mills & Tyler, 1992; Stordy et al, 1995). Work in Honduras raised the possibility of giving iron supplements to breast-fed children from about 4 months rather than running the risk of introducing microbe-contaminated feeds, which in any case have low iron availability (Dewey et al, 1998).

In Britain 12–15% of the total iron intake of children aged 1–15 years is provided by meat (i.e. 4–5% as haem iron) and 20–30% by fortified cereal products such as breakfast cereals and bread; vegetables, biscuits and chips (french-fried potatoes) each supply 5–10%. This diet meets the 'reference nutrient intake' (RNI) for most ages but not for toddlers aged 1.5–2.5 years (mean intake was 73% of RNI; 100% is desirable), nor for girls aged 10–15 years (63% of RNI) (Department of Health, 1989; Gregory et al, 1995). However, the total intake is only part of the story. The absorption of iron is determined by the overall composition of the meal, the integrity of the gastrointestinal tract, and systemic factors. Absorption of haem iron increases when anaemia is present but is little affected by other components of the meal. Absorption of non-haem iron is enhanced by vitamin C, by other organic acids present in fruit and vegetables such as citric and malic acid, and by animal protein; it is inhibited by phytate, calcium and polyphenols (in tea). For detailed reviews see Lynch (1997) and, specifically for children, Lonnerdal (1990) and Fomon (1993). The extent of iron absorption is also affected by body stores, the rate of erythropoiesis and hypoxia; the mechanisms whereby enterocytes receive information from these factors to alter absorption are not clear.

In developing countries fortified foods are less available at affordable prices, and fibre and phytate intakes are higher (Tatala et al, 1998). On the other hand, some foods are cooked in iron pots, leading to better iron status than if aluminium pots are used (Borigato & Martinez, 1998), and grape molasses is used as a reasonable source of iron in Turkey (Aslan et al, 1997). The role of pasteurized cows' milk in intestinal blood loss has been referred to above. In many parts of the world hookworm infestation is the most common cause of blood loss.
There are recommended control programmes (intermittent anthelminthic medication at least twice yearly, control of faecal contamination of soil, use of simple shoes) for schoolchildren and women. These should be applied to preschool children as well, perhaps combined with iron supplementation (Stoltzfus et al, 1997, 1998; Hopkins et al, 1997). Bilharzia is less important as a cause of anaemia; trichuris and giardia have similarly been regarded as less frequent causes of iron deficiency as a population problem (De Morais et al, 1996).

With a clear strategy, implementation should be a simple matter of informing mothers what is best, but modification of eating customs and traditions is difficult. Small-scale health education interventions have been successful, e.g. in a family practice in Bristol the prevalence of microcytic anaemia at 13 months fell from 25% to 8% during a 2-year period, although enthusiasm appears to have waned, since the prevalence had risen to 13% a further 2 years later (James et al, 1993). Larger programmes have not been successful: in a project reaching about 500 children in a health district of Birmingham, about 30% of children in both the intervention and control groups were anaemic at 18 months of age (Childs et al, 1997).

Consumption of suitable foods may be encouraged if their price to the consumer is reduced (sometimes free of charge) by subsidies from a government or a charity. The outstanding example is the Women, Infants and Children (WIC) Program in the U.S.A. Iron-fortified infant formulas and weaning cereals are supplied free of charge to about a quarter of all infants. Since the programme was introduced the prevalence of iron deficiency anaemia has fallen considerably and is less than in many other countries (e.g. the prevalence of IDA among 1–2-year-olds is 3% in the U.S.A. and 12% in Britain; Looker et al, 1997; Gregory et al, 1995). A recent evaluation showed that those within the programme had less anaemia and a better iron status than those who were not (Owen & Owen, 1997). The British scheme, open to families on income support or job-seeker's allowance, enables the mother to choose an infant formula (all of which are fortified in Britain) or whole cows' milk, which contains little iron. Follow-on formulas, although fortified, are not included, nor are 'solid' weaning foods. Many other countries have subsidy schemes, but many are based on milk alone, which, although excellent for energy, protein, calcium and riboflavine, does little to promote iron nutrition.

The main issue in iron fortification of infant formulas is quantity. Most formulas in the U.S.A. contain about 12 mg/l (1.8 mg/100 kcal) and most in Europe up to 7 mg/l (1.0 mg/100 kcal). In both continents formulas without added iron are allowed by the current regulations, and in Scandinavia levels are around 4 mg/l (0.6 mg/100 kcal). There are proposals to add little or no iron to formulas consumed in the first 4–6 months of life (Wharton, 1989, 1996), for three reasons: (a) total body iron increases little during this time, breast milk contains only small amounts of iron, and iron may have adverse effects on the faecal flora (Balmer & Wharton, 1991; Mevissen-Verhage et al, 1985); (b) with higher fortification the absolute amount absorbed is only a little greater, leaving more unabsorbed iron in the gut lumen; and (c) young infants receiving an infant formula with no or small amounts of added iron in the first 4 months of life do not develop ID (Haschke et al, 1993; Hernell & Lonnerdal, 1996).
Nevertheless, almost all infant formulas used at this age do contain added iron. From about 6 months of age more dietary iron becomes essential and a 'safety net of fortified foods' ensures a satisfactory intake (Wharton, 1986). The safety net usually includes a fortified formula, but the appropriate level of fortification is unclear, e.g. continued use of a European infant formula (1 mg/100 kcal), an American infant formula (1.8 mg/100 kcal) or introduction of a European-style 'follow-on formula' (about 1.8 mg/100 kcal). Any of these options is preferable to the early introduction of ordinary cows' milk. Evidence is accumulating that even in formulas consumed in later infancy a level of fortification lower than used previously can be effective in maintaining adequate absorption and/or iron status, e.g. 8 mg/l (Fomon et al, 1997), 3 mg/l (Haschke et al, 1993) and 2 mg/l (Walter et al, 1998), but the periods of surveillance were for 3–6 months only and did not follow infants into the second year of life when anaemia is most common. Using radiolabelled iron infant formulas in adults, the Chilean group suggested fortification of 7 mg/l to provide 1 mg of absorbed iron (Hertrampf et al, 1998). (The per-litre and per-100 kcal figures quoted here are related through the energy density of the formula; see the conversion sketch at the end of this section.)

A second issue is which qualities of an infant formula enhance iron status. Many studies have shown the positive effect on iron status of using iron-fortified formulas instead of cows' milk in infants >6 months old and in toddlers in the second year of life. It is not certain, however, to what extent this reflects solely the effect of a greater intake of iron, or is due also to 'other qualities' of an infant/follow-on formula, such as the greater absorption of iron because of the higher vitamin C content, less inhibition of absorption because of the lower concentrations of protein, calcium and phosphorus (although one study found that the addition of calcium glycerophosphate did not adversely affect iron status; Dalton et al, 1997), or less immunologically induced milk enteropathy and iron loss. Probably both the iron fortification and the 'other qualities' are operating. Three British studies of the use of follow-on formulas or cows' milk from the age of 6 months support that conclusion: ID was least in those receiving an iron-fortified formula.
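The fortification levels discussed above are quoted both per litre and per 100 kcal; a minimal sketch of the conversion follows, assuming a typical formula energy density of about 670 kcal/l (an assumed figure, not one given in the text).

def mg_per_100kcal(iron_mg_per_l, kcal_per_l=670.0):
    """Convert an iron fortification level from mg/l to mg/100 kcal."""
    return iron_mg_per_l / (kcal_per_l / 100.0)

print(round(mg_per_100kcal(12.0), 1))  # ~1.8 mg/100 kcal (typical U.S. level)
print(round(mg_per_100kcal(7.0), 1))   # ~1.0 mg/100 kcal (typical European level)
print(round(mg_per_100kcal(4.0), 1))   # ~0.6 mg/100 kcal (Scandinavian level)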
