Abstract

This review gives a broad overview of historical and current methods for the assessment of iron bioavailability. These methods can be divided into iron solubility studies, iron absorption studies, endpoint measures, and arithmetic models, and the pros and cons of each are discussed. First, in vitro and in vivo iron solubility studies are described. Their main disadvantages are that neither absorption nor incorporation of iron can be measured, and that only the solubility of nonheme iron, not heme iron, can be studied. Second, we focus on iron absorption studies (using native iron, radioiron, or stable iron isotopes), in which balance techniques, whole-body counting, or postabsorption plasma iron measurements can be applied; in vitro determination of iron absorption using intestinal loops or cell lines is also discussed in this part. For absorption studies using animals, duodenal loops, gut sacs, or Caco-2 cells, the difficulty of extrapolating the results to humans is the major drawback. Chemical balance in man is a good, but laborious and expensive, way to study iron absorption. Whole-body counting has the disadvantages of exposing subjects to radiation and of being based on a single meal. Measurement of the plasma iron response appears to be of little value in determining nutritional iron bioavailability. The next part deals with endpoint measures, which, according to the definition of iron bioavailability, give the best estimate of it. In animals, the hemoglobin-repletion bioassay is most often used, whereas most studies in humans monitor the fate of radioisotopes or stable isotopes of iron in blood. Repletion bioassays in rats or other animals are of limited use because the accuracy of extrapolation to man is unknown; the use of the rat as a model for iron bioavailability appears to be empirically based, and there are many reasons to consider it obsolete in this respect. The double-isotope technique is probably the best predictor of iron bioavailability in humans; its disadvantages are the single-meal basis and, when radioisotopes are used, the exposure to radiation. Finally, some arithmetic models are described. These models are based on data from iron bioavailability studies and can predict the bioavailability of iron from a meal.
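To illustrate the arithmetic-model category, the sketch below shows, in Python, how a Monsen-style estimate of absorbable iron from a single meal might be computed. The function name, the enhancing-factor cut-offs, and the absorption fractions are illustrative assumptions chosen for this example; they are not the specific model or parameter values evaluated in the review.

```python
# Minimal sketch of a Monsen-style arithmetic model for absorbable iron.
# All cut-offs and absorption fractions below are illustrative assumptions,
# not values taken from the review itself.

def estimate_absorbable_iron_mg(
    heme_iron_mg: float,
    nonheme_iron_mg: float,
    ascorbic_acid_mg: float,
    meat_fish_poultry_g: float,
) -> float:
    """Estimate absorbable iron (mg) from one meal."""
    # Enhancing-factor score: ascorbic acid (mg) plus cooked meat/fish/poultry (g).
    enhancing_units = ascorbic_acid_mg + meat_fish_poultry_g

    # Classify the meal and assign an assumed nonheme-iron absorption fraction.
    if enhancing_units < 25:
        nonheme_fraction = 0.03   # assumed "low availability" meal
    elif enhancing_units < 75:
        nonheme_fraction = 0.05   # assumed "medium availability" meal
    else:
        nonheme_fraction = 0.08   # assumed "high availability" meal

    # Heme iron is assumed to be absorbed at a roughly constant fraction.
    heme_fraction = 0.23

    return heme_iron_mg * heme_fraction + nonheme_iron_mg * nonheme_fraction


if __name__ == "__main__":
    # Hypothetical meal: 1 mg heme iron, 3 mg nonheme iron,
    # 40 mg ascorbic acid, and 60 g of meat.
    print(round(estimate_absorbable_iron_mg(1.0, 3.0, 40.0, 60.0), 2))
```

The point of such models is that, once absorption fractions have been calibrated against bioavailability data, the absorbable iron of any meal can be predicted from its composition alone, without further absorption measurements.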
