Abstract

Human ancestral diets changed substantially approximately four to five million years ago with major climatic changes creating open grassland environments. We developed a larger brain balanced by a smaller, simpler gastrointestinal tract requiring higher-quality foods based around meat protein and fat. Anthropological evidence from cranio-dental features and fossil stable isotope analysis indicates a growing reliance on meat consumption during human evolution. Study of hunter-gatherer societies in recent times shows an extreme reliance on hunted and fished animal foods for survival. Optimal foraging theory shows that wild plant foods in general give an inadequate energy return for survival, whereas the top-ranking food items for energy return are large hunted animals. Numerous evolutionary adaptations in humans indicate high reliance on meat consumption, including poor taurine production, lack of ability to chain elongate plant fatty acids and the co-evolution of parasites related to dietary meat. Anthropologists have long recognised that the diets of palaeolithic and recent hunter-gatherers (HGs) represent a reference standard for modern human nutrition and a model for defence against certain Western-lifestyle diseases. Boyd Eaton of Emory University (Atlanta) put this succinctly: ‘We are the heirs of inherited characteristics accrued over millions of years, the vast majority of our biochemistry and physiology are tuned to life conditions that existed prior to the advent of agriculture. Genetically our bodies are virtually the same as they were at the end of the palaeolithic period. The appearance of agriculture some 10,000 years ago and the Industrial Revolution some 200 years ago introduced new dietary pressures for which no adaptation has been possible in such a short time span. Thus an inevitable discordance exists between our dietary intake and that which our genes are suited to’. 
This discordance hypothesis postulated by Eaton could explain many of the chronic ‘diseases of civilisation’.1 This review presents an anthropological perspective on what HG populations may have actually eaten. Contrary to views that humans evolved largely as herbivorous animals in a ‘garden of Eden’ type of environment, historical evidence indicates a very different reality, at least in the last four to five million years of evolutionary adaptation. It was in this time frame that the ancestral hominid line emerged from the receding forests to become bipedal, open grassland dwellers. This was likely accompanied by dietary changes and subsequent physiological and metabolic adaptations. The evolutionary pressure on some primates to undergo this change of habitat and diet, towards open-grassland foraging/scavenging, related directly to massive changes in global climatic conditions: primarily drier conditions, followed by worldwide expansion of the biomass of warm-climate (C4) grasses at the expense of wetland forests,2 accompanied by a worldwide faunal change,3 including the spread of large grazing animals. Thus, the foods available to human ancestors in an open grassland environment were very different from those of the jungle/forest habitats that had been home for many millions of years. The lines of investigation used by anthropologists to deduce the evolutionary diet of our hominid ancestors are numerous: (i) changes in cranio-dental features; (ii) fossil isotopic chemical tracer methods; (iii) comparative gut morphology of modern humans and other mammals; (iv) the energetic requirements of developing a large ratio of brain to body size; (v) optimal foraging theory; (vi) dietary patterns of surviving HG societies; and (vii) specific diet-related adaptations. 
Findings from each of these fields reveal a changing dietary pattern away from low-quality, highly fibrous, energy-poor plant staples towards a growing dependence on more energy-rich animal foods, culminating in palaeolithic Homo sapiens being top-level carnivores.4 Early hominid fossil remains already show clear cranio-dental changes which indicate a move away from a specialised structure suited to coarse foliage mastication towards a more generalised structure, indicative of dependence on fruits and hard nuts but also incorporating changes that indicate meat consumption. Such changes included a decrease in molar tooth size, more gracile jaws and skull, well-buttressed front teeth and the appearance of shearing crests on the teeth, all indicative of less emphasis on grinding and more on the biting and tearing of animal flesh.5 The C13/C12 isotope ratio in fossil remains is indicative of diet, and is a particularly good marker of the intake of broad-leaf plant material versus grasses. Basically, trees, bushes and shrubs use the C3 photosynthetic pathway, which discriminates against the heavier carbon isotope C13 during fixation of CO2, compared with the grasses, which use the C4 or Hatch-Slack photosynthetic pathway.6, 7 Examination of early hominid remains indicates that they ate large quantities of C13-enriched foods.8 As hominids in general show no capacity for digesting grasses, nor tooth microwear patterns indicative of grass mastication,9, 10 these hominids must have been consuming grazing animals that existed on the C4 grasses. Similarly, the bone Sr/Ca ratio in mammals shows an inverse correlation with trophic level, with pure carnivores showing the lowest ratio. 
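As an aside on the isotope method just described: carbon isotope abundances are conventionally reported in delta notation relative to a standard reference ratio. The formula and the typical plant values below are standard figures from the stable isotope literature, not values taken from this review:

```latex
\delta^{13}\mathrm{C} =
\left(
  \frac{\left({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\right)_{\mathrm{sample}}}
       {\left({}^{13}\mathrm{C}/{}^{12}\mathrm{C}\right)_{\mathrm{standard}}}
  - 1
\right) \times 1000
```

The result is quoted in per mil (‰); C3 plants typically cluster around −26‰ and C4 grasses around −12‰, and a consumer's tissues record the signature of the plants (or prey) it eats, which is how grazing-animal consumption shows up in hominid remains. The bone Sr/Ca trophic-level signal noted above is an independent, complementary tracer.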
This pattern is paralleled in fossil remains of palaeolithic fauna, with early hominids showing a Sr/Ca ratio midway between contemporary carnivore and herbivore species.11, 12 These results alone would indicate that even very early hominids consumed a considerable proportion of meat in their diet.13 Another line of investigation which is useful in ascertaining the dietary preferences and suitability of a species to certain food types is to study the structural features of the gastrointestinal tract. Both pure herbivores (folivores and frugivores) and pure carnivores (such as felids) have physiological and metabolic adaptations suited to their diet.14, 15 Humans fit neither category, but are truly omnivores, falling between the largely frugivorous make-up of such anthropoid relatives as the chimpanzee and the adaptations of the true carnivores.16 A sacculated stomach or well-developed caecum and colon are associated with plant-based diets. The lower the plant quality (or the higher the fibre content), the more pronounced are these features. The ruminant animals (foregut folivores) show the greatest volume in the stomach region. Non-ruminant herbivores (midgut folivores), such as the horse, have greatest development in the caecum and colon. Measures of relationship between gastrointestinal length or surface area, and body length or surface area give a good relative comparison of carnivore versus herbivore characteristics (Table 1). Carnivores tend to have a well-developed acid stomach and long small intestine. The human gut with its simple stomach, relatively elongated small intestine and reduced caecum and colon, does not fit any one group but lies between the frugivore and faunivore groups, suggestive of reliance on a high-quality diet in which meat is a predominant component. 
The size of the human gut relative to body size is also small in comparison with other anthropoids, with a much more pronounced small intestine, similar to carnivores.14, 16 Approximate relative proportions of gut volume for humans and some other primates are shown in Table 2. Primates in general, and humans in particular, have larger brain sizes than would be expected for their body size (predicted by the Martin equation18), a phenomenon described as ‘encephalisation’.19, 20 Since the time of Australopithecus afarensis, some four to five million years ago, brain size has increased threefold (Figure 1). What the driving force was for this dramatic increase can only be speculated upon, although many sound hypotheses based on socio-ecological factors have been put forward.

Figure 1: Brain size enlargement in the human hominid line during the past four million years. Adapted from Henneberg.21

Irrespective of the driving force for encephalisation, two critical requirements had to be met: (i) the brain's chemical requirement for long-chain polyunsaturated fatty acids (PUFA), particularly arachidonic acid (20:4n-6) and docosahexaenoic acid (22:6n-3), both of which can only be obtained from animal tissue;22, 23 and (ii) the increased metabolic requirements of a larger brain.24, 25 To sustain such a metabolically expensive large brain, there are two possible evolutionary adaptations: either elevate the basal metabolic rate (BMR), or compensate for higher brain energy use with lower mass-specific metabolic rates in other tissues. The BMR of eutherian mammals is accurately predicted by the Kleiber equation, based on body mass,26 and humans fit this predictive value well, indicating no increase in basal metabolism. However, when examining individual organs, the brain mass surplus (and energy requirement) is closely balanced by the reduction in size (and energy requirement) of the gastrointestinal tract27 (Table 3). 
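The Kleiber relation referred to above is commonly written as BMR ≈ 70·M^0.75 kcal/day for body mass M in kilograms. A minimal sketch of the prediction it gives for a human-sized mammal (the 65 kg body mass is an illustrative choice, not a figure from this review):

```python
def kleiber_bmr_kcal_per_day(mass_kg: float) -> float:
    """Kleiber's law: mammalian basal metabolic rate scales with body mass^0.75.

    The coefficient 70 is the conventional kcal/day constant for mammals.
    """
    return 70.0 * mass_kg ** 0.75

# For a 65 kg human the prediction is roughly 1600 kcal/day, close to
# measured human BMR -- consistent with the point that overall basal
# metabolism was not elevated to pay for the larger brain.
predicted = kleiber_bmr_kcal_per_day(65.0)
```

That humans sit on, rather than above, this allometric line is what forces the trade-off the review goes on to describe: the extra brain energy must be offset within the existing metabolic budget.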
The gut is the only organ which can vary in size sufficiently to offset the metabolic cost of the larger brain. Diets high in bulky food of low digestibility require relatively enlarged gut size with voluminous fermenting chambers (rumen and caecum). Diets consisting of high-quality foods are associated with relatively small gut size, with simple stomachs, reduced colon size, but proportionately long small intestine28 as seen in carnivores. With the relatively poor macronutrient density of wild plant foods, particularly in the open grassland areas, the obvious solution for our ancestors was to include increasingly large amounts of animal-derived food in the diet.5 The increasing consumption of meat, rich in protein and fats (particularly unsaturated forms), would provide a basis for the threefold increase in human brain size in the last 4.5 million years, from the perspective of both energy supply25 and brain fatty acid substrate availability.23 Essentially the subsistence patterns of HGs, early hominids and our palaeolithic ancestors can be explained in terms of cost/benefit analysis. The major survival determinant is daily energy procurement (less energy expenditure). Various models have been developed to explain this phenomenon and loosely described as the ‘Theory of Optimal Foraging’.29 The wild fruits, vegetables, foliage and tuberous roots available to HGs and early hominids were generally fibrous and of low energy density.30 The high energy/time spent in collection and preparation of such plant foods, particularly seed grains, is not well rewarded in terms of energy gain; hence these are not feasible as a primary energy source (Table 4). 
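The cost/benefit logic above can be made concrete with the core quantity used in optimal foraging models: post-encounter profitability, i.e. energy returned per unit handling (pursuit plus processing) time. The sketch below is illustrative only; the resource names and all figures are hypothetical placeholders, not the values from Table 4:

```python
# Post-encounter profitability: kcal returned per hour of handling time.
# All figures below are hypothetical placeholders for illustration.
resources = {
    "large game":  (60_000, 4.0),  # (kcal per item acquired, handling hours)
    "tubers":      (1_500, 1.0),
    "seed grains": (900, 3.0),     # high collection/processing cost
}

def profitability(kcal: float, handling_hours: float) -> float:
    """Energy return rate for one food item, in kcal per hour of handling."""
    return kcal / handling_hours

ranked = sorted(resources, key=lambda r: profitability(*resources[r]),
                reverse=True)
# Large hunted animals dominate the ranking (15,000 kcal/h here), while
# seed grains return the least per hour despite their abundance -- the
# pattern the text describes.
```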
This explains why HGs generally have high meat intake despite abundant plant food availability.32 However, it should not be forgotten that these plants were and are a major source of fibre and micronutrients.33 Calculations have been made from Murdock's Ethnographic Atlas34 of 229 HG societies, showing that the majority of HG societies obtained ≥56–65% of their subsistence (energy) from animal foods (Figure 2). The predicted macronutrient energy intake ranges were carbohydrate 22–40%, protein 19–35% and fat 28–47%.35

Figure 2: Frequency distribution of subsistence dependence upon total (fished + hunted) animal foods in worldwide hunter-gatherer societies (n = 229). Frequency indicates the number of societies at that percentage energy dependence on animal foods. Median = 56–65%, mode = 56–65%. Adapted from Cordain et al.35

Similar to obligate carnivores, humans have an inefficient ability to chain-elongate plant-derived 18-carbon fatty acids into the 20- and 22-carbon PUFA essential for cell membrane function and brain tissue,36 hence requiring direct consumption from animal tissue. Likewise, humans have inherited a greatly reduced ability to synthesise taurine from precursor amino acids.37 The proposed rationale, as for obligate carnivores, is that there was reduced selective pressure to synthesise taurine in vivo because exogenous dietary sources of preformed taurine (found only in animal tissue) had been consumed over a lengthy time period. Physiologically, haem and other iron-rich porphyrin compounds, derived only from meat, are absorbed by humans in preference to ionic forms of iron, whereas herbivorous animals cannot absorb these haem complexes and rely on absorption of ionic iron.38 Finally, mammalian hosts and their various parasites undergo close co-evolution. Cestodes of the family Taeniidae are parasites of carnivores spread by eating meat. Taenia saginata and T. 
solium use humans exclusively as their host, indicating a substantial period of co-evolution and meat consumption by humans and their ancestors.17 The dietary changes involved in the transition from hunting and gathering to agriculture have been extensively reviewed.39-41 This transition began in the Near East approximately 10,000 years ago with the growing of wild cereal crops as a response to population increase and/or scarcity of large mammalian wild game. The transition, however, was associated with physiological stresses, including reduced stature, osteomalacia, dental caries, various nutritional deficiencies and infectious disease.42 The archaeological evidence indicates a shift from consumption of hoofed mammals (gazelle, antelope and deer), root plants, wild pulses, various nuts and fruit, to a narrower diet of cultivated wheat, barley, oats, rice or corn, depending on location.43 This transition also corresponded with a fundamental reversal of the high-protein, low-carbohydrate diet of the previous HG societies, along with a shift in the type of fatty acid intake. With the shift away from the HG dietary pattern and its high reliance on meat to a more grain-rich diet, the dietary fat intake profile of humans changed significantly. The P : S dietary ratio dropped drastically from 1.4:1 to 0.4:1,1 and the n-6 : n-3 ratio increased from approximately 3:1 to greater than 12:1 in the current Western diet, in which n-6 PUFA from seed oils are now abundant.44 Similarly, many micronutrient intake levels likely dropped following the shift to agriculture and further subsequent developments in food processing and mass production during the industrial revolution some 200 years ago and, more recently, the ‘fast-food revolution’,1 as indicated in Table 5. With the industrial revolution came more efficient milling methods, which separated the fibre-rich bran and nutrient-dense germ of the various grains from the starch-rich endosperm. 
The range of new products based on these refined grains, and the flour made from them, expanded rapidly. Similarly, the mass production of nutrient-poor refined cane sugar became common.46 In more recent times, particularly the last 50–60 years beginning in the USA, we have been exposed to what has loosely been termed the ‘fast-food revolution’. This encompasses such aspects of the modern Western diet as: (i) the proliferation of take-away food outlets with their ready-to-eat, well-advertised, fat- and energy-rich commodities, usually poor in micronutrients and often filled with high-glycaemic-index (GI) processed carbohydrates; and (ii) the broad range of processed, packaged, ready-to-heat-and-serve foods that occupy the expanding aisles of the local supermarket. These, too, are generally nutrient-poor and energy-rich, and come with little need for concomitant energy expenditure.46 The increased contribution of carbohydrate from grains to the human diet following the agricultural revolution has effectively diluted the protein content of the human diet. Whether current protein intakes are below the ideal is a question now being asked, especially in regard to effects on satiety and rates of obesity. Another issue to bear in mind is that ‘optimal foraging’, which shaped primitive diets, does not apply in the modern era. Today survival does not demand energy-dense, high-fat foods. In fact, it is clear that these foods need to play a reduced role in societies where dietary energy is available in abundance and concomitant energy expenditure is limited. The type of fat in modern diets is distinctly different from that eaten by our forebears. In particular, saturated fat and n-6 PUFA (from seed oils and grains) have increased at the expense of n-3 fats from fish and red meat. The cardio-protective effects of long-chain n-3 fats and their preferential incorporation into the tissues of the body support the notion that modern diets are deficient in these nutrients. 
Carbohydrates in the modern diet are also very different from those that HGs ate. Today the majority of dietary carbohydrates are derived from processed cereals and can be of high GI. Cereals were seldom eaten by HGs because of the small size of wild seeds and the difficulty of collecting them. It has been argued that the replacement of protein from meat and fish with high-GI carbohydrates from starch and sugars may have implications for insulin resistance and the development of type 2 diabetes, a condition unknown in HG societies. There may be more certainty about the impact of reduced intakes of fish and meat on iron and zinc status. Humans evolved on dietary intakes of these minerals several times the current intake.1 It should come as no surprise that these two nutrients are limiting in the diet of those sections of the Australian population who do not consume red meat.45 Likewise, the low vitamin B12 status of vegetarians testifies to the past reliance on meat in the diet.47 Plant foods most likely provided the bulk of micronutrients and fibre in the ‘palaeolithic’ diet; thus, humans were not carnivores but true omnivores, as revealed by numerous lines of investigation including gut morphology studies. A wide variety of nutrient-dense vegetables, fruits and (low-GI) wholegrains therefore remains an important part of the diet. However, our pre-agricultural ancestors' basic supply of energy, protein, long-chain fatty acids, vitamin B12, iron and zinc came from meat. Thus, adaptations to such a dietary pattern accumulated in our bodies over approximately three to four million years of relatively high meat intake and minimal grain intake. It is argued that the modern Western divergence from this dietary pattern forms the basis of the lifestyle diseases that we now face. 
Thus, there is no historical or valid scientific argument to preclude lean meat from the human diet, and a substantial number of reasons to suggest it should be a central part of a well-balanced diet.48
