Palaeopathological findings on Hallstatt-period people of the Upper Palatinate: inferences about their environment (Paläopathologische Befunde am hallstattzeitlichen Menschen der Oberpfalz. Rückschlüsse auf seine Umwelt)
Pathological findings from Early Iron Age inhumation burials at three cemeteries of the Hallstatt period (Beilngries, Dietfurt and Schirndorf) in the Upper Palatinate (Bavaria) were compiled, with particular attention to possible conclusions about environmental conditions. The infrequency of cribra orbitalia indicates that the studied groups did not suffer from iron deficiency. The rarity of fractures of the limb bones and of skull lesions suggests, on present evidence, a peaceful era during the Hallstatt period in the Upper Palatinate. Caries was found in 5.4% of the adults from Dietfurt and in 4.9% of those from Schirndorf; at the Dürrnberg the percentage was lower (2.2%), which can be explained on the one hand by the earlier mortality peak at the Dürrnberg and on the other by differing nutritional habits. The teeth of the adult population from Dietfurt show enamel hypoplasia in only 14.9% of all cases; this low percentage suggests that the diet of the studied populations contained sufficient vitamins A, C and D.
- Front Matter
12
- 10.1053/j.ajkd.2009.12.027
- Mar 30, 2010
- American Journal of Kidney Diseases
Bone Marrow Iron in CKD: Correlation With Functional Iron Deficiency
- Research Article
123
- 10.1097/00005176-200204000-00003
- Apr 1, 2002
- Journal of Pediatric Gastroenterology and Nutrition
- Book Chapter
4
- 10.1093/oso/9780198798118.003.0010
- Apr 27, 2017
The Late Bronze Age Urnfield Period in Central Europe (BA D, Ha A/B, c.1300 to 800 BC) is characterized by the dominance of cremation as a burial rite. Urn burials give an impression of simplicity, but they are the endpoint of a chain of actions and practices that constitute the funerary ritual, many of which may not be simple at all, and may involve a large number of people and resources. The washing, dressing, and furnishing of the body as it is laid out prior to cremation leave no traces. The funerary pyre, as spectacular as it may have looked, smelled, and felt during the cremation, is preserved only under exceptional circumstances. The rituals and feasts associated with selecting the cremated remains from the funerary pyre and placing them in a suitable organic container or a ceramic urn prior to their deposition do not leave much evidence. The large-scale spread of cremation during the Late Bronze Age has traditionally been explained by the movements of peoples (e.g. Kraft 1926; Childe 1950) or by a change in religious beliefs (e.g. Alexander 1979). More recently, a change in how the human body is ontologically understood and how it has to be transformed after death is seen as the more likely underlying cause (Harris et al. 2013; Robb and Harris 2013; Sørensen and Rebay-Salisbury in prep.), although a simple, single reason is rarely the driver of such pan-European developments. This chapter is concerned with another transition, the change from cremation back to inhumation several hundred years later, during the Early Iron Age, and investigates its background and causes. In Central Europe, cremation is given up as the sole funerary rite, and a range of different options emerges, including inhumations in burial mounds, bi-ritual cemeteries, and new forms of cremation graves. This change happens at a different pace in the various areas of the Hallstatt Culture and adjacent regions, which will be surveyed here.
Despite doubts about the validity of the term ‘Hallstatt Culture’ as a cultural entity (e.g. Müller-Scheeßel 2000), it remains a convenient shorthand for the Early Iron Age in Central Europe (c.800–450 BC) in eastern France, southern Germany, Switzerland, Austria, the Czech Republic, Slovakia, Hungary, Slovenia, Croatia, and parts of northern Italy.
- Research Article
99
- 10.2147/ijwh.s51403
- Sep 4, 2013
- International Journal of Women's Health
Background: Vitamin D insufficiency has been associated with a number of adverse pregnancy outcomes, and has been recognized as a public health concern.
Aim: The objective of this study was to determine the impact of vitamin D deficiency on maternal complications like gestational diabetes mellitus (GDM), anemia, iron deficiency, and preeclampsia among pregnant women.
Subjects and methods: This was a cohort study undertaken at antenatal clinics at the Women’s Hospital of Hamad Medical Corporation in Doha. A total of 2,487 Arab pregnant women above 24 weeks’ gestation with any maternal complication were approached, and 1,873 women (75.3%) consented to participate in the study. Data on sociodemographic and clinical characteristics were collected by interview, and biochemistry parameters were retrieved from medical records. Multivariate logistic regression analysis was performed to determine the associated risk factors.
Results: Of the studied pregnant women, nearly half (48.4%) had vitamin D deficiency. Younger women below 30 years old (43.2%, P = 0.032), housewives (65.3%, P = 0.008), and those on low monthly household incomes (QR5,000–9,999; 49.2%, P = 0.03) were significantly more likely to have lower vitamin D compared with those who had sufficient vitamin D levels. Exposure to sunlight (63.4%, P = 0.05), daily physical activity (64.4%, P = 0.05), and vitamin D supplement intake (89.7%, P < 0.001) were significantly lower in deficient pregnant women. In the study sample, 13.9% had GDM, 11.5% had anemia, 8.6% had iron deficiency, and 6.9% had preeclampsia. Severe vitamin D deficiency was significantly more common in pregnant women with GDM (16.5% vs 11%), anemia (17.1% vs 11%), iron deficiency (18.5% vs 11.2%), and preeclampsia (19.8% vs 11.4%) than in the uncomplicated group. Socioeconomic status was low in pregnant women with complications like GDM, anemia, iron deficiency, and preeclampsia. Pregnancy complications like GDM (52.7%), anemia (53.2%), iron deficiency (55.6%), and preeclampsia (51.9%) were more frequent in Qataris. GDM (66.2%), anemia (66.2%), iron deficiency (68.5%), and preeclampsia (58.1%) were also observed more commonly among housewives than among working women. Obesity was significantly more common in pregnant women with GDM (41.5%) and preeclampsia (41.1%).
Conclusion: The study findings revealed that maternal vitamin D deficiency in pregnancy is significantly associated with elevated risk for GDM, anemia, and preeclampsia. The risk of vitamin D deficiency was higher in Qataris, housewives, and those with low monthly household income.
- Research Article
14
- 10.1007/s12520-022-01542-1
- Mar 25, 2022
- Archaeological and Anthropological Sciences
This study is the first attempt to refine Early Iron Age absolute chronology, specifically the timing of the Hallstatt C-D transition in southern Germany, using Bayesian chronological modelling of radiocarbon (14C) dates. The Hallstatt period (c.800–450 BC) marks the transition from prehistory to proto-history in Central Europe. The relative chronological framework for Hallstatt burials developed by the mid-twentieth century is still used today, but absolute dating is limited by the scarcity of dendrochronological dates and the perception that 14C dating in the Hallstatt period (HaC-HaD) is futile, due to the calibration plateau between c.750 and 400 cal BC. We present new AMS 14C dates on 16 HaC-HaD burials from a stratified sequence at Dietfurt an der Altmühl ‘Tennisplatz’ (Bavaria, Germany). This sequence is situated entirely on the ‘Hallstatt plateau’, but by combining 14C dating with osteological, stratigraphic, and typological information, we demonstrate that the plateau is no longer the ‘catastrophe’ for archaeological chronology once envisaged. Taking into account dendrochronological dating elsewhere, we show that at Dietfurt, the HaC-HaD transition almost certainly occurred before 650 cal BC, and most likely between 685 and 655 cal BC (68.3% probability), several decades earlier than usually assumed. We confirm the accuracy and robustness of this estimate by sensitivity testing. We suggest that it is now possible, and essential, to exploit the increased precision offered by AMS measurement and the IntCal20 14C calibration curve to re-evaluate absolute chronologies in Early Iron Age Europe and equivalent periods in other regions.
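The approach described here, combining broad single-date calibrations on a near-flat curve with stratigraphic order constraints, can be illustrated with a toy sketch. Everything below is illustrative: the calibration-curve values are invented (not IntCal20), and all function names are ours; real analyses would use IntCal20 with software such as OxCal.

```python
import math

# Toy segment of a radiocarbon calibration curve: calendar year BC ->
# (14C age BP, curve error). Values are invented, NOT real IntCal20 data;
# the nearly flat slope mimics the 'Hallstatt plateau'.
CAL = {yr: (2450.0 + 0.05 * (yr - 600), 15.0) for yr in range(800, 449, -1)}

def likelihood(c14, err, yr):
    """Gaussian match between a 14C measurement and the curve at one year."""
    mu, cerr = CAL[yr]
    s2 = err * err + cerr * cerr
    return math.exp(-(c14 - mu) ** 2 / (2 * s2)) / math.sqrt(s2)

def calibrate(c14, err):
    """Single-date calibration: normalized probability per calendar year.
    On a plateau this distribution is very broad."""
    p = {yr: likelihood(c14, err, yr) for yr in CAL}
    z = sum(p.values())
    return {yr: v / z for yr, v in p.items()}

def ordered_posterior(c14_old, c14_young, err=20.0):
    """Minimal Bayesian constraint: grave 1 lies stratigraphically below
    grave 2, so its calendar date must be earlier (larger year BC).
    Uniform prior on ordered pairs; returns the marginal for grave 1."""
    l_old = {yr: likelihood(c14_old, err, yr) for yr in CAL}
    l_young = {yr: likelihood(c14_young, err, yr) for yr in CAL}
    marg = {}
    for y1 in CAL:
        # years BC: a larger number is earlier in time
        marg[y1] = l_old[y1] * sum(p for y2, p in l_young.items() if y2 < y1)
    z = sum(marg.values())
    return {yr: v / z for yr, v in marg.items()}
```

The stratigraphic constraint sharpens the broad single-date distributions and shifts the lower grave earlier, which is the essence of the modelling reported for the Dietfurt sequence.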
- Research Article
- 10.24916/iansa.2023.1.3
- Feb 17, 2023
- Interdisciplinaria Archaeologica Natural Sciences in Archaeology
During the Early Iron Age (EIA) in Europe, the phenomenon of the Hallstatt culture enveloped a large portion of the European continent. Between the Atlantic Ocean and the River Danube, cultural groups can be roughly divided into two major regions: the Western and the Eastern Hallstatt circle. EIA finds made from organic material decorated with pigments are usually well preserved only under specific conditions. A good example is the coloured textile found in the salt mines of the eponymous site Hallstatt (AT). Other examples are Scythian finds north and east of the Black Sea, far outside the Hallstatt culture area. This paper presents the results of the analysis of decorated artifacts made from bone or antler from Jalžabet (NW Croatia). The artifacts were found in two princely burial mounds with cremated remains: burial mound 1 (Gomila) and burial mound 2. The funerary monuments belong to the Eastern Hallstatt culture and date back to the middle of the 6th century BC, i.e., the end of the Ha D1 period. A group of scientists from Croatia and abroad performed several series of analyses on the selected bone or antler artifacts. The motifs on the artifacts were made by incisions and were filled with black pigment, and there are faint traces of red pigment on the surface. With the help of colourant analysis performed in Brussels and Zagreb (SEM-EDX, MRS, FT-IR), zooarchaeological taxonomic identification, and archaeological determination of a selected group of findings from Jalžabet, we have tried to answer several major questions. The most important is whether the traces of pigments on the artifacts are deliberate decoration. If so, can we determine the composition of the paint? What raw materials were used for the production of the artifacts? These questions are important because such EIA finds are rare and even more rarely analysed.
New data would considerably expand our knowledge about the funeral rites of the most prominent members of the Hallstatt nobility in the Drava River valley and Central Europe. Taxonomically, the raw material from which the finds were made was identified as antler, probably from red deer (Cervus elaphus). Using methods for colourant analysis, we have successfully proven deliberate application of black paint based on carbon black as a pigment, probably in combination with terpenoid resin. Until now, this composition was known only from much later, Roman-period finds. It was also confirmed that the black paint on the artifacts from both burial mounds in Jalžabet has the same composition. The red pigment on the finds has been identified as hematite, and it is highly probable that the red surfaces were deliberate, painted decoration. It was also established that the raw material needed for producing the red paint (bog iron ore) could probably have been extracted in the Jalžabet micro-region; this requires further research. The archaeological analysis of the finds supports the idea of the use of various types of decorated plates as inlays, probably on furniture or other luxury everyday items. Smaller finds could have been used as utilitarian objects, parts of attire, and jewellery.
- Research Article
106
- 10.1046/j.1365-2141.1999.01511.x
- Aug 1, 1999
- British Journal of Haematology
Despite a plethora of papers, reports and consensus statements during the last 25 years concerning the high prevalence and complications of iron deficiency (ID), the problem is still with us. A recent national study in Britain showed that 12% of 2-year-olds were anaemic, rising to 29% in Asian immigrant groups (Lawson et al, 1998). What can be done about it? This paper reviews detection and prevention of iron deficiency anaemia (IDA) referring mainly to studies published in the last 5 years. Earlier substantial reviews of iron nutrition are available (Brock et al, 1994; British Nutrition Foundation, 1995; Hallberg & Asp, 1996). Reviews specifically related to children include those by Oski (1993) and Wharton (1999a). Detection is divided into indications for investigation, which investigations to apply, and their interpretation. There are obvious indications for determining iron status in some clinical presentations. An example is suspected malabsorption. One study found that an oral iron absorption test was more sensitive as a screening test for upper intestinal absorption than the commonly used D-xylose method (Stahlberg et al, 1991; De Vizia et al, 1992). Other deficiencies often coexist with ID partly because a poor diet may have many deficiencies but also because of micronutrient interaction, e.g. ID with deficiencies of vitamin A or D (Gujral & Gopaldesas, 1995; Underwood & Arthur, 1996; Wharton, 1999b). ID may play a role in or complicate such diverse disorders as ischaemic stroke, apparent asthma, cyanotic heart disease and gastric trichobezoar (Hartfield et al, 1997; Hetzel & Losek, 1998; Olcay et al, 1996; Phillips et al, 1998). In my view any child reaching hospital as an outpatient or inpatient should have their haemoglobin level and RBC indices determined. Some specialities argue this is unnecessary since the haemoglobin distribution in their patients is no different to that in the general population (e.g. in otolaryngology; Heaton et al, 1991). 
This seems a missed opportunity for opportunistic screening for a common disorder. Clinical signs are helpful only in severe anaemia, but surveys show that pale conjunctivae (sensitivity 74%) and nail beds (specificity 96%) are useful signs (Thaver & Baig, 1994). Blue sclerae might be an additional sign (Beghetti et al, 1993). The highest prevalence of IDA occurs in toddlers and adolescents because the increment in haemoglobin iron per unit body weight is greatest at these ages (see Fig 1). [Fig 1. Changes in body iron during development: (a) total body iron and haemoglobin iron (mg), in males except where shown; (b) daily increment in body iron (mg/d) for males, females, and females plus menstrual loss; (c) proportional daily increment in body iron (μg/kg/d).] Two points should be noted. There is little increase in total body iron in the first 4 months or so of life. As the haemoglobin falls from around 18 g/dl at birth to 14 g/dl during the first 2 weeks of life, the liberated iron is stored and then gradually reused as the total mass of circulating haemoglobin begins to increase with growth. Between 4 and 12 months total body iron increases by about 130 mg, and an external source of iron is necessary. If this need is not met, ID occurs and frank anaemia usually develops after the first birthday. Note also that boys need more iron at adolescence because of the increase in muscle and myoglobin. These increased requirements due to changes in body composition subsequently subside, but increased requirements continue in girls following menarche. Infants who continue to receive only breast milk after the first 6 months of life are at increased risk. Breast feeding may continue after 6 months without difficulty so long as other foods providing available iron are introduced. Also at risk are infants who, despite current policy, are changed from an infant formula to whole cows' milk before the age of 1 year.
It is not clear whether the higher prevalence of IDA in these infants is mainly the effect of an inadequate intake of dietary iron or is due in addition to increased intestinal iron loss (Ziegler et al, 1990; Fuchs et al, 1993a, b). In toddlers attending well-child facilities in Cleveland, U.S.A., a simple dietary history predicted microcytic anaemia (sensitivity 71%, specificity 79%), but a quarter of the anaemic children were not identified (Boultry & Needlman, 1996). A community study in Sydney found that a low consumption of meat (i.e. haem iron) and the introduction of whole cows' milk before the first birthday were significant indicators of ID (Mira et al, 1996). Many adolescent girls try to control their weight and inadvertently limit iron intake. This was particularly marked 10 years ago in British girls who bought snacks from local shops rather than eating school lunch or food from home, but there has been evidence of improvement since then (Department of Health, 1989; Moynihan et al, 1994; Southon et al, 1994; Doyle et al, 1994). Many adolescents pass through a temporary period of vegetarianism because of concerns about animal welfare. Although adequate iron nutrition is achievable on a vegetarian diet, it must provide iron sources (such as pulses) and enhancers of absorption (e.g. vitamin C, and fish or poultry if acceptable), and such temporary, amateur vegetarianism may not ensure sufficient absorbed iron. Preterm babies are born with a lower concentration of haemoglobin, so any physiological haemolysis liberates less iron for stores; erythropoietin, if given, increases iron requirements, and so does catch-up growth. Light-for-gestational-age babies often have a raised haemoglobin at birth reflecting intrauterine hypoxia, so initially the post-haemolysis iron stores are higher, but the rapid catch-up growth increases demands. In a normal term baby the iron in circulating haemoglobin roughly doubles during the first year of life (from 180 mg at birth to 340 mg at 1 year).
In a preterm 1 kg baby the increase is 6-fold (50–300 mg); in a 2 kg baby born at term it is 3-fold (110–330 mg). Children of immigrants or refugees have a higher prevalence of iron deficiency, presumably due to such factors as socio-economic deprivation (living in inner-city areas with overcrowding and limited parental income), language difficulties (health education is difficult), unfamiliarity with foods available in the new environment (often a tendency to rely on milk and puddings), and food customs which are difficult to follow (e.g. halal meat for Muslims may not be easily available, so children are given a meat-free diet by a mother inexperienced in providing a balanced vegetarian diet). In a nationally representative survey of British 1.5–2.5-year-olds, 12% had IDA, but among children of immigrant families it was higher: India (20%), Pakistan (27%) and Bangladesh (29%) (Lawson et al, 1998). Other recent reports describe the problem in children from South-East Asia, Latin America and Eastern Europe living in the U.S.A., Norway and Switzerland (Graham et al, 1997; Sargent et al, 1996). Athletic activity, particularly endurance sport, may lead to blood loss from the gut and urinary tract (Robertsson et al, 1987; Haymes & Lamanca, 1989); therefore athletic girls who have passed menarche and are trying to slim may be at particular risk of iron deficiency. The staging of iron status by Oski et al (1983) is a useful concept, and various measurements can be used to define the stages. Normal: iron stores and erythropoiesis normal. Iron depletion: erythropoiesis normal but iron stores reduced (serum ferritin <12 μg/l), indicating a reduction of iron in the bone marrow, liver and other parts of the reticuloendothelial system (note that the exact cut-off point for normal/abnormal ferritin depends on the method used; a reference ferritin preparation to calibrate the assay is recommended).
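The fold-increase figures quoted above can be checked, and turned into average daily increments, with simple arithmetic. A minimal sketch; the linear-accretion assumption and the function name are ours, and the mg figures are those quoted in the text:

```python
def hb_iron_growth(birth_mg, year1_mg):
    """Fold increase and average daily increment (mg/day) in haemoglobin
    iron over the first year, assuming linear accretion (our simplification)."""
    return year1_mg / birth_mg, (year1_mg - birth_mg) / 365

# Figures quoted in the text (mg at birth -> mg at 1 year)
for name, (b, y) in {"normal term": (180, 340),
                     "preterm 1 kg": (50, 300),
                     "term 2 kg": (110, 330)}.items():
    fold, daily = hb_iron_growth(b, y)
    print(f"{name}: {fold:.1f}-fold increase, ~{daily:.2f} mg/day")
```

The daily increments make clear why preterm babies, despite their small size, have the highest relative iron requirement.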
Iron-deficient erythropoiesis: (i) abnormal RBC biochemistry (free erythrocyte protoporphyrin (EPP) >99 μmol/mol haem; serum transferrin receptor raised, e.g. >8.5 mg/l, though the exact cut-off depends on age and the assay used); (ii) abnormal RBC morphology (microcytosis, MCV <80 fl, varying with age; anisocytosis, RDW >15%); (iii) transport iron reduced (transferrin saturation <10%). Iron deficiency anaemia: the above plus haemoglobin <11 g/dl. There is no evidence that iron depletion or iron-deficient erythropoiesis alone has any adverse clinical effects, whereas iron deficiency anaemia is associated with alterations of immunological, gut and mental function. In the recent NHANES survey in the U.S.A. (Dallman et al, 1996), ID was defined as the presence of two or more abnormal measurements as shown in Table I. Although one can argue about the exact cut-off points used and the need for two abnormal characteristics, this battery of tests was applied to a large number of children, and the haemoglobin ranges produced (i.e. after excluding children with more than one abnormal test) are suitable reference standards (see Table I). A recent survey in Bristol suggested a cut-off point for haemoglobin concentration in 12- and 18-month-olds as low as 10 g/dl, but no attempt was made to exclude iron-deficient children, and the haemoglobin method used was the Hemocue B-Hb photometer (Sherriff et al, 1999). It would be impractical, however, to use the whole battery of investigations shown in Table I initially, and simpler approaches for population and individual studies have been suggested. Yip et al (1996) suggested that haemoglobin concentrations alone could be used in populations: the distributions of haemoglobin are determined in children and adults. If the distribution is shifted to the left in both children and women of childbearing age, but not in men, then iron deficiency is likely. If the distribution is shifted to the left in men as well, then other factors are probably operating too, e.g. malaria or hookworm.
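The Oski-style staging just described can be expressed as a small classifier. This is a sketch only: the function name is ours, and the fixed thresholds are the illustrative cut-offs quoted in the text (ferritin <12 μg/l, MCV <80 fl, RDW >15%, transferrin saturation <10%, Hb <11 g/dl), which in practice vary with age and assay:

```python
def iron_stage(hb, ferritin, mcv, rdw, tsat):
    """Sketch of iron-status staging. Units: hb g/dl, ferritin ug/l,
    mcv fl, rdw %, tsat as a fraction. Cut-offs are illustrative only."""
    deficient_erythropoiesis = mcv < 80 or rdw > 15 or tsat < 0.10
    if hb < 11 and deficient_erythropoiesis:
        return "iron deficiency anaemia"
    if deficient_erythropoiesis:
        return "iron-deficient erythropoiesis"
    if ferritin < 12:
        return "depleted iron stores"
    return "normal"
```

As the text notes, only the final stage (anaemia) has demonstrated clinical consequences; the earlier stages mark children at risk.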
This approach has been used to diagnose dietary ID in Pakistan, iron losses from hookworm in Zanzibar, and iron losses (from undetermined causes) in Alaskan natives (Petersen et al, 1996). The strategy has been questioned in Thailand for children >5 years of age, in whom anaemia was rarely associated with a low plasma ferritin (Linpisarn et al, 1996). Electronic counters based on impedance or light scattering are in common use in developed countries, and the likelihood of iron deficiency may then be assessed from the indices. Increasing use is now made of histogram distributions of red blood cell volume rather than only arithmetic summaries of size and variation in size such as mean corpuscular volume (MCV) and red cell distribution width (RDW). With some methods 'red cell cytograms' are also available, in which red cell volume is plotted against red cell haemoglobin concentration for all red cells counted. Walters & Abelson (1996) have described the interpretation of the full blood count and indices, possible artefacts (due to cold agglutinins, high white cell counts, and hyperosmolar plasma), crude checks for internal consistency (haemoglobin in g/dl about 3 × the RBC count; PCV about 3 × the haemoglobin), and its interpretation in children. Hinchcliffe & Helliwell (1993) have described the use of distribution histograms and cell cytograms in children. Typically in iron deficiency anaemia Hb and MCV are reduced, RDW is increased (i.e. microcytosis and anisocytosis), red cell haemoglobin distribution width (HDW) is increased (i.e. anisochromia), and the 'shape' of the cell cytogram scatter is moved down and to the left, with a large proportion of cells in the hypochromic microcytic zone. During a response to iron treatment, double peaks are seen in the histograms for red cell volume and red cell haemoglobin, and the cytogram shows more cells in the normocytic normochromic zone.
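The 'rule of three' internal-consistency checks attributed to Walters & Abelson can be sketched as follows. The 10% tolerance and the function name are our assumptions; deviations suggest either an analytical artefact or genuinely abnormal red cell indices, so this is a flag for review, not a diagnosis:

```python
def rule_of_three_flags(rbc, hb, pcv, tol=0.10):
    """Crude internal-consistency check on a full blood count:
    Hb (g/dl) ~ 3 x RBC count (10^12/l) and PCV (%) ~ 3 x Hb (g/dl).
    Returns a list of mismatches outside the tolerance (an assumption)."""
    flags = []
    if abs(hb - 3 * rbc) > tol * hb:
        flags.append("Hb vs RBC mismatch")
    if abs(pcv - 3 * hb) > tol * pcv:
        flags.append("PCV vs Hb mismatch")
    return flags
```

For a typical normocytic sample (RBC 4.5, Hb 13.5, PCV 40.5) no flags are raised; a microcytic hypochromic sample breaks the first relation, which is itself a hint of iron deficiency.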
The application of these more sophisticated methods to population screening has not been evaluated. Mates et al (1995) argued that a full electronic-counter blood screen is a 'watchdog of community health'. In their Israeli series, including adults as well as children, 1% had microcytosis, mostly due to iron deficiency (58%) and thalassaemia minor (35%). Similarly Kim et al (1996) recommend that MCV and RDW be routinely determined, which increases the predictive value for iron deficiency to 98%. Choi & Reid (1998) found RDW a useful predictor of red cell disease in the well-baby clinic. EPP alone (or zinc protoporphyrin, ZPP) has been used for screening and as an indication for a therapeutic trial of iron in some American paediatric practices (Benjamin et al, 1991; Siegal & Lagrone, 1994). In iron deficiency zinc fills the iron pocket in the protoporphyrin molecule. ZPP determination requires only 20 μl of blood and is easily measured in a haematofluorimeter. It also remains abnormal for a week or so, even if iron therapy has commenced before the test. However, it is also abnormal in the anaemias of inflammation and in lead poisoning. Serum ferritin may also be determined on small blood samples, but careful standardization of methods and use of a reference ferritin preparation for calibration are necessary (Worwood, 1997). It is raised during acute infections, chronic disease and liver disease irrespective of the iron stores, but iron deficiency is the only cause of a low concentration. Serum transferrin receptor concentration has raised considerable interest. The concentration reflects the number of transferrin receptors on immature red cells and so in most instances also reflects the rate of bone marrow erythropoiesis. Iron deficiency, however, also results in a disproportionate increase in the concentration (Huebers et al, 1990).
An increased concentration provides an early and sensitive indicator of functional iron deficiency, sometimes before the plasma ferritin has fallen (Skikne et al, 1990; Worwood, 1995, 1997). A major advantage is that it remains normal in many chronic disorders if iron deficiency is not present. However, it is raised in the thalassaemias even though iron deficiency is not present. As in adults, in infants and 11–12-year-old boys higher concentrations of the receptor were associated with a lower serum ferritin even within the normal physiological range for ferritin (Virtanen et al, 1999). However, its use as an index of iron deficiency in infancy and adolescence has been questioned (Kuiper-Kramer et al, 1998; Kling et al, 1998; Kivivuori et al, 1993; Kuizon et al, 1996). It would be unwise to use the test alone without other measurements of ID as well. Hereditary causes of microcytosis, inflammation and various chronic diseases, and occasionally lead poisoning, may cause difficulties of interpretation. Apart from the thalassaemias, hereditary causes of microcytosis are quite rare. Most are associated with iron overload of tissues, but a small number of children have been described with ID because of a defect in absorption (Table II). The RBC in IDA and the thalassaemias have similar indices. The degree of anisocytosis and hence the RDW is usually higher in IDA, particularly in relation to the degree of microcytosis. Various mathematical ratios of red cell indices have been suggested to help the differentiation. Using cytometry plots, the proportion (%) of hypochromic cells is greater than the proportion of microcytic cells in iron deficiency, whereas the reverse is true in thalassaemia (d'Onofrio et al, 1992). In thalassaemia there also may be an increase in hypochromic macrocytes. 
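The cytogram rule for separating IDA from thalassaemia cited above (d'Onofrio et al, 1992) reduces to a one-line comparison. A hedged sketch; the function name is ours, and this is a screening hint only, not a diagnosis:

```python
def suggest_cause(pct_hypochromic, pct_microcytic):
    """Cytogram rule quoted in the text: a higher proportion of hypochromic
    than microcytic cells suggests iron deficiency; the reverse suggests
    thalassaemia. Screening hint only, not a diagnosis."""
    if pct_hypochromic > pct_microcytic:
        return "iron deficiency pattern"
    if pct_microcytic > pct_hypochromic:
        return "thalassaemia pattern"
    return "indeterminate"
```

Where any real possibility of thalassaemia exists, the text recommends proceeding directly to haemoglobin electrophoresis and A2 determination rather than relying on indices.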
When there is any possibility of a thalassaemia, however, it is usually better to proceed directly to haemoglobin electrophoresis and A2 determination, although iron deficiency in association with thalassaemia may temporarily mask the characteristic changes in A2 and HbF. In a United States study, half of people (aged 15–49 years) with beta-thalassaemia trait had a raised ZPP, and so did a quarter of those with haemoglobin E or alpha-thalassaemia trait, suggesting that ZPP may be abnormal in thalassaemia traits, but the exact iron status of the subjects was not defined (Graham et al, 1996). The anaemias of infection and chronic disease are classically normochromic and normocytic, but hypochromia and microcytosis occur in about a third of infected children. Even after a mild infection many measurements move in the same direction as occurs in ID, but ferritin rises and serum transferrin receptor remains normal (see reviews by Abshire, 1996, and Walter et al, 1997). On the other hand, tropical infection such as malaria did not interfere with the use of EPP and ferritin in the diagnosis of ID in Zanzibari schoolchildren (Stoltzfus et al, 1997). In chronic disease such as rheumatoid arthritis interpretation is difficult: measurements suggesting ID may reflect cytokine activity, but true ID, confirmed by bone marrow examination, may also be present, and serum transferrin receptor is raised in a number of such patients, suggesting iron-deficient erythropoiesis which may respond to iron (et al, 1996). In lead poisoning the anaemia is classically normocytic, but ID is often present as well; the ZPP is raised, not only because less iron is available for conversion of protoporphyrin to haem but also because haem synthesis itself is inhibited. There is evidence associating iron deficiency with an increased risk of lead poisoning (et al, 1995; et al, 1996). Some years ago such an anaemia would have prompted investigation for both, but as exposure to lead has declined the diagnosis is now less likely. If in doubt, blood lead should be determined and the response to iron noted; the haematological response and the reduction in blood lead following treatment are smaller in iron-deficient children (et al, 1996; et al, 1997).
is a of and the early before If at the age of 18 many children would have been anaemic for some If e.g. months at the of a number of children not anaemic then so some months In Bristol a quarter of children found to be anaemic at the age of 2 years had not been so at months et al, 1993). An age for screening is not children in particular groups might be if the prevalence of IDA is high in the e.g. in inner city children of immigrant or and toddlers in whom cows' milk was the before 12 months of Various have suggested e.g. the for (1998) in of in such as the British and Nutrition many individual and & 1996). loss should be with which as the usually The of may In infants in whom the was not had had a higher at 2 months of age; this had no effect on serum ferritin at 3 months of age in children et al, 1997; et al, 1997). of iron at birth is in haemoglobin, blood loss is a cause of anaemia in early and the other stores in show little to the iron some studies in both the developed and have shown Two have shown that iron status in is associated with a iron status in the infants at 1 year of age 1996; et al, 1990). This could a effect of reduced iron stores or that both mother and child have an iron-deficient babies there is little because total body iron does not increase during this If ID occurs then abnormal blood loss should be This may occur in the period or (e.g. from gastric in a rarely of Breast feeding is or that a infant formula is the age 6 months a dietary source of iron is necessary. babies this is easily by use of an infant formula which is iron or introduction of a all of which are iron Breast milk alone not the extra iron but absorption of the small of iron is This is less in babies formula or a and babies to receive foods from an age than et al, 1990). 
Although there is evidence that early an introduction of foods with iron absorption from breast milk et al, should be from 6 months of because of its haem is an providing zinc as well, which may also a in breast A study showed that an intake of of meat a from the age of months to an intake of 10 to lower falls of haemoglobin in there were no on serum ferritin or transferrin receptor concentrations et al, 1998). many who continue to breast their infants vegetarian foods from which iron is less foods are available in the and some are with iron. use of these foods a more diet less and than a diet alone & et al, in questioned the possibility of iron to children from about 4 months rather than the risk of which have low iron et al, 1998). In Britain of the total iron intake of children years is by as haem by such as and and This diet the for most ages but not for toddlers age years intake was of is for girls of (Department of Health, 1989; et al, However, the total intake is only of the The absorption of iron is determined by the composition of the the of the and of haem iron increases anaemia is but is little by other of the of iron is by vitamin other in and such as and and animal is by and reviews specifically for children, and The of iron absorption is also by body stores, the rate of and The receive from these factors to absorption are not In foods are less available at and and are higher et al, 1998). On the other hand, some foods are in iron to better iron status than if are used & 1998). is used as a source of iron in et al, 1997). The role of cows' milk in intestinal blood loss has been to In many parts of the hookworm is the most common cause of blood There are control at control of of use of simple for schoolchildren and should be applied to children as well, with iron (Stoltzfus et al, 1998; et al, 1997). is less as a cause of anaemia, and have been as less causes of iron deficiency as a population problem et al, 1996). 
With a clear strategy, prevention should be possible. A simple statement of the aims is easy, but modification of eating customs is difficult. Health education campaigns have been mounted: in one in Bristol the prevalence of microcytic anaemia fell during the campaign, but the effect had disappeared 2 years later (et al., 1993). Other campaigns have not been successful: in one reaching many children, about a quarter of children in both the intervention and control groups were anaemic at 18 months of age (et al., 1997). Provision of suitable foods may be encouraged if their cost to the family is reduced, whether by subsidy from a government or from a charity. The best-known example is the Special Supplemental Nutrition Program for Women, Infants and Children (WIC) in the U.S.A., under which infant formula and other foods are supplied to about a quarter of all infants. Whether or not the programme was the cause, the prevalence of iron-deficiency anaemia has fallen and is less than in many other countries (e.g. the prevalence of IDA among toddlers is lower in the U.S.A. than in Britain; et al., 1997; et al.). A recent study showed that those within the programme had less anaemia and a better iron status than those who were not (& 1997). The British scheme provides tokens to families on low income, which entitle the mother to an infant formula (most of which are fortified with iron) or to whole cows' milk, which contains little iron. Such schemes are not universal; many other countries have similar programmes, but many are based on cows' milk for toddlers, which does little to improve iron status. The iron content of infant formulas varies: most in the U.S.A. contain about 12 mg/l and most in Europe rather less; in both continents formulas without added iron are permitted by the current regulations, and levels in these are around 4 mg/l or below. There are reasons to give little or no additional iron in the first months of life (1996): (a) total body iron increases little during this period, breast milk provides only small amounts, and excess iron may have adverse effects on the infant (& Wharton, 1991; et al.); (b) with higher concentrations in the feed the amount absorbed is only a little greater, leaving more unabsorbed iron in the gut; (c) infants receiving an infant formula with no or only small amounts of iron in the first 4 months of life do not develop ID (et al., 1993; & 1996). Nevertheless, almost all infant formulas used at this age contain some iron. From about 6 months of age more dietary iron is needed, and a formula of higher iron content has a place. The choice is usually a fortified formula, but the appropriate level of fortification is debated: possibilities include a European follow-on formula, an American iron-fortified infant formula, or introduction of suitable weaning foods; any of these is preferable to the early introduction of cows' milk. There is evidence that, even in trials conducted in infancy, a level of fortification lower than those currently used can result in adequate absorbed iron, e.g. levels of 2-3 mg/l in several trials (et al.; et al.; et al.), although these trials lasted for some months only and did not follow infants into the second year of life, when anaemia is most common. Using measurements of absorption from iron-fortified infant formulas made in adults, a fortification level sufficient to provide 1 mg of absorbed iron has been suggested (et al., 1998). A related question is the age until which an iron-containing infant formula should be used. Many studies have shown the beneficial effect of using a formula instead of cows' milk on iron status in infants over 6 months and in toddlers in the second year of life. It is not clear, however, whether this reflects only the effect of a greater intake of iron, or is due also to other attributes of a formula, such as the greater absorption of iron because of the higher vitamin C content, or less inhibition of absorption because of lower concentrations of inhibitors such as calcium (although one study found that such an addition did not affect iron absorption; et al.), or simply that less cows' milk is drunk, so that both the iron intake and its absorption improve. British studies of the use of a fortified formula or cows' milk from the age of 6 months showed that ID was less common in those receiving an iron-fortified formula.
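The fortification arithmetic above, i.e. what concentration a formula needs in order to yield about 1 mg of absorbed iron a day, can be sketched in a few lines. All numbers below (daily intake volume, concentration, fractional absorption) are illustrative assumptions, not values from the trials cited:

```python
# Absorbed iron (mg/day) = volume drunk (l/day) x concentration (mg/l)
#                          x fractional absorption.
def absorbed_iron_mg(intake_l_per_day, iron_mg_per_l, fractional_absorption):
    """Daily absorbed iron from a fortified formula, in mg."""
    return intake_l_per_day * iron_mg_per_l * fractional_absorption

# Illustration: 0.8 l/day of a formula at 7 mg/l with ~18% absorption
# already yields about 1 mg/day of absorbed iron.
print(round(absorbed_iron_mg(0.8, 7.0, 0.18), 2))  # prints 1.01
```

On these assumptions a concentration well below 12 mg/l meets a 1 mg/day absorption target, which is the point the low-fortification trials above make; the true fractional absorption varies with the composition of the diet and the infant's iron status.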
- Book Chapter
- 10.33547/swibie2022.2.14
- Dec 31, 2022
Seven amber beads (Fig. 14.1; Table 14.1) were discovered in five graves (0.9% of all graves) at the Świbie cemetery. They differ in the overall shape of the body. Following the typology proposed by M. Chytráček et al. (2017) for amber beads from HaC–D1 found in the Czech Republic, Moravia and Slovakia, one formal type (A2) can be distinguished in the analysed material, and it is represented by three variants (A2a, A2c, A2d); in addition, one specimen does not fit into the classification of M. Chytráček et al. (2017). All variants mentioned are long-lived, found before and after the Hallstatt period. In the Early Iron Age, they are known from present-day Poland, Bohemia, Moravia, and Slovakia, as well as from lands further south-west, such as Italy, Croatia and Slovenia. In central Europe, amber artefacts of HaC–D1 date are found most abundantly in the areas of Greater Poland (Fig. 14.2), central Silesia, Bohemia, and Moravia. Upper Silesia and western Lesser Poland are very poor in finds of this type (Figs 14.3 and 14.4). The paucity of amber finds in the Częstochowa-Gliwice area cannot be explained by the cremation of the deceased together with their furnishings, as biritual rites prevailed there during the Hallstatt period and inhumation burials predominated over cremations. Other factors (possibly fashion or customs) that influenced the rare furnishing of the deceased with amber ornaments must therefore be taken into account.
- Front Matter
4
- 10.1111/j.1365-2796.1989.tb01396.x
- Nov 1, 1989
- Journal of internal medicine
Iron in clinical medicine--an update.
- Research Article
2
- 10.1017/s0033822200030721
- Jan 1, 1995
- Radiocarbon
During the 1989–1994 renovation of the Zagreb Town Museum, it became obvious that the area was inhabited in prehistoric times. We 14C dated 40 samples to determine various settlement periods. The ages of the samples span a much longer time than expected, from the Early Iron Age (Hallstatt period) to the 19th century ad. 14C dates on charcoal samples placed the remains of dwelling pits in the Hallstatt period, 8th to 4th century bc. A late La Tène settlement dated between the 4th century bc and the 2nd century ad. Medieval fortifications were identified in the western part of the complex, consisting of a well-preserved wooden structure used for construction of the royal castrum. 14C measurements on wooden planks and posts date the construction of the fortification between the 13th and 15th centuries ad and branches, beams, and tools found below the basement of the Convent of St. Clare span the 16th to the 19th century ad.
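The dates quoted above are conventional radiocarbon ages; a minimal sketch of the underlying age calculation follows (the 14C fraction in the example is an invented illustrative value, not one of the Museum samples):

```python
import math

LIBBY_MEAN_LIFE = 8033.0  # years, from the conventional Libby half-life of 5568 yr

def radiocarbon_age_bp(f14c):
    """Conventional 14C age in years BP from the normalized 14C fraction F14C."""
    return -LIBBY_MEAN_LIFE * math.log(f14c)

# A sample retaining 73% of the modern 14C level gives an uncalibrated age
# of roughly 2530 yr BP, i.e. broadly in the Early Iron Age range.
print(round(radiocarbon_age_bp(0.73)))
```

Calibration against a tree-ring curve is then needed to turn such a conventional age into the calendar ranges (e.g. 8th to 4th century bc) given in the abstract.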
- Research Article
1
- 10.14746/fpp.2022.27.06
- Dec 29, 2022
- Folia Praehistorica Posnaniensia
The article contrasts two chronologically distinct groups of artifacts: painted ceramics from the Hallstatt period and the so-called white ceramics, produced until the end of modernity. They are related by the technique of covering a bright surface with colorful patterns and by the stylistic similarity of certain geometric motifs. However, the ideas behind creating these pictorial representations were completely different. In the article, painted vessels from the Hallstatt period and modernity will be the starting point for detailed studies on magical and rational thinking about the world. It was in the Renaissance that, according to the concept of the sociologist and philosopher Max Weber (1864‒1920), a "disenchantment of the world" took place, i.e. a departure from the magical understanding of reality. Ceramics of the Early Iron Age and of modernity will illustrate this process.
- Discussion
- 10.1016/s0140-6736(23)00448-8
- May 1, 2023
- The Lancet
Unanswered questions from the IRONMAN trial – Authors' reply
- Abstract
- 10.1182/blood-2022-169942
- Nov 15, 2022
- Blood
Exploration of Menarche, Dietary Iron, and Additional Risk Factors on the Development of Iron Deficiency in Adolescent Girls
- Research Article
235
- 10.1111/obr.12323
- Sep 23, 2015
- Obesity Reviews
Hypoferraemia (i.e. iron deficiency) was initially reported among obese individuals several decades ago; however, whether obesity and iron deficiency are correlated remains unclear. Here, we evaluated the putative association between obesity and iron deficiency by assessing the concentration of haematological iron markers and the risks associated with iron deficiency in both obese (including overweight) subjects and non-overweight participants. We performed a systematic search in the databases PubMed and Embase for relevant research articles published through December 2014. A total of 26 cross-sectional and case-control studies were analysed, comprising 13,393 overweight/obese individuals and 26,621 non-overweight participants. Weighted or standardized mean differences of blood iron markers and odds ratio (OR) of iron deficiency were compared between the overweight/obese participants and the non-overweight participants using a random-effects model. Compared with the non-overweight participants, the overweight/obese participants had lower serum iron concentrations (weighted mean difference [WMD]: -8.37 μg dL(-1); 95% confidence interval [CI]: -11.38 to -5.36 μg dL(-1)) and lower transferrin saturation percentages (WMD: -2.34%, 95% CI: -3.29% to -1.40%). Consistent with this finding, the overweight/obese participants had a significantly increased risk of iron deficiency (OR: 1.31; 95% CI: 1.01-1.68). Moreover, subgroup analyses revealed that the method used to diagnose iron deficiency can have a critical effect on the results of the association test; specifically, we found a significant correlation between iron deficiency and obesity in studies without a ferritin-based diagnosis, but not in studies that used a ferritin-based diagnosis. Based upon these findings, we concluded that obesity is significantly associated with iron deficiency, and we recommend early monitoring and treatment of iron deficiency in overweight and obese individuals.
Future longitudinal studies will help to test whether a causal relationship exists between obesity and iron deficiency.
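The pooled weighted mean differences above come from a random-effects model; a minimal DerSimonian-Laird sketch shows how such pooling works. The per-study effects and variances below are invented for illustration, not data from this meta-analysis:

```python
def dersimonian_laird(effects, variances):
    """Pool per-study effects (e.g. WMDs) under a random-effects model.

    Returns (pooled_effect, tau2), where tau2 is the estimated
    between-study variance.
    """
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fe = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fe) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)          # DL tau^2 estimate
    w_re = [1.0 / (v + tau2) for v in variances]           # random-effects weights
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    return pooled, tau2

# Three hypothetical serum-iron WMDs (ug/dL) and their variances:
pooled, tau2 = dersimonian_laird([-12.0, -4.0, -9.5], [2.0, 3.0, 2.5])
print(round(pooled, 2), round(tau2, 1))
```

Relative to fixed-effect pooling, a positive tau2 flattens the weights across heterogeneous studies; the subgroup analyses by ferritin-based versus non-ferritin diagnosis probe exactly that kind of heterogeneity.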
- Front Matter
18
- 10.1053/j.gastro.2015.08.003
- Aug 17, 2015
- Gastroenterology
Pathogens, Metabolic Adaptation, and Human Diseases—An Iron-Thrifty Genetic Model