Abstract

Some 50 years ago, and befittingly at an AAPA dinner, Stanley Garn delivered the Presidential Address, which he titled Nutrition in Physical Anthropology. In his address, Garn predicted, “a concern with nutrition in its broadest sense, including the physical form of the dietary and the energy balance will occupy us more rather than less in the years to come, as we channel our interests to ascertain the directions of evolutionary and physiological change…” (Garn, 1966). He specifically mentioned the nutritional underpinning of our interests in the evolution of the genus Homo, the diets of nonhuman primates, child growth and sexual maturation, secular changes in adult body size, and growing evidence that obesity was replacing malnutrition in some populations. In this essay, we will take up Garn's prediction and review the themes related to human nutrition that occupy us in 2018. We define nutrition broadly: the consumption of food and the effect of food energy and nutrients on body size, health, and function. Nutrition is an area of interest shared with nutritional scientists and public health professionals, but the anthropological lens provides a view that is more comparative and evolutionary. As biological anthropologists, we are interested in the nutrition of a wide range of contemporary populations, as well as that of historic and prehistoric populations and the hominins in our evolutionary past. We consider nutrition as a key environmental factor in shaping human variability and evolution, and use the principles of evolution to explain variation between populations in the present as well as the past. We also recognize the inseparable nature of human cultural and biological adaptation to the environment, and hence use the explanatory frameworks referred to as biocultural or biobehavioral to address the biological-cultural nexus in studying nutrition. Three thematic areas have been prominent in the research of biological anthropologists in the past decade: diet in the evolution of the genus Homo, diet and nutrition in the transition from hunting and gathering to agriculture, and diet and nutrition in the current era of globalization. These thematic areas are defined by the time period of interest and questions asked, but each is viewed as a transition in diet with measurable outcomes in terms of morphology and/or health. In the Paleolithic, major questions revolve around changes in the dietary niche of Homo, specifically the availability of dietary energy that enabled the significant changes observed in morphology (body size and brain size). In the Neolithic, it is clear that the transition from foraging wild plant and animal species to the production of food via agriculture and animal husbandry changed the dietary niche in very significant ways. The timing and pace of this transition varied across geographic regions with the earliest dates at about 10,000 BCE. Questions of interest are related to how changes in nutritional ecology impacted body morphology and health in different geographical regions. In the current era, transitions in diet are associated with processes of industrialization and globalization that result in increases in food availability. Almost all human populations find themselves at some stage of this newest nutrition transition. Questions of interest revolve around the timing and pace of the transition in different populations, and the linkages between the changes in diet and the changes in body size, composition, and risk of chronic disease.
Here we will consider each of the three thematic areas in turn. First, we will highlight major points in their historical development. Second, drawing on the research of biological anthropologists in the past decade, we will provide illustrative examples of each of these themes. We note, however, that given the interdisciplinary nature of current research, many people are associated with more than one area of research. We will finish with some thoughts on emerging trends.
Biological anthropologists have had a longstanding interest in the diets of our ancestors. The enduring questions that capture our imaginations are: What did our ancestors eat? And, how did that diet shape the evolution of the genus Homo? These are important but difficult questions to answer. Important because an animal's diet is a key component of its adaptation to the environment. Difficult because ancestral populations lived long ago and walked the earth over a wide geographical range. We may never be able to answer these questions with the precision desired. As the subtitle of Ungar's (2007) book Evolution of the Human Diet so eloquently suggests, there are things we know, things we do not know, and things that are unknowable. Research over the past decade has addressed many questions and added significantly to what we do know (Ungar, 2007). Especially impressive are the gains in understanding of the diets of early hominins from the use of stable isotopes and dental microwear analysis (Ungar & Sponheimer, 2011). Here we focus on the diets of the genus Homo and two closely interrelated themes that have appeared in the literature in the past decade: (1) the nutritional qualities of foods in ancestral diets, and (2) models of Paleolithic diets in terms of categories of foods and nutrients and their relevance to modern health concerns.
Interest in the nutritional qualities of foods in ancestral diets, specifically digestibility and nutrient composition, is driven by the classic question of how to explain the rapid expansion of brain and body size in early Homo, which are characteristics associated with an increase in diet quality. One of the first approaches was based on the concept of relative diet quality among primates, especially as it related to encephalization. Milton (1987) argued that the emergence of Homo was associated with a higher quality diet than that found in the other hominins. She based her claim on comparative gut anatomy and digestive kinetics in hominoids, arguing that gut proportions in humans were unusual and a derived condition indicative of a higher quality diet (e.g., a diet lower in fiber and lignin). She speculated that the incorporation of animal protein and animal fat into the diets of early humans was a key element in enabling them to maintain a high-quality diet while increasing body size, sociality, and levels of physical activity. Interestingly, Milton's idea of the importance of protein has prevailed, while the suggestion that animal fat was also important has largely disappeared. Leonard and Robertson (1992, 1994) approached the question of diet quality specifically in terms of energy needed to support the larger bodies and brains of early Homo. They argued that early Homo would have had a higher quality diet than that of nonhuman primates, and that the increase in quality would have come from meat consumption.
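To give a rough sense of the energetic stakes, consider an illustrative back-of-the-envelope calculation (the figures here are textbook approximations, not values taken from the studies cited above): Kleiber's scaling relation puts basal metabolic rate at roughly BMR ≈ 70 × M^0.75 kcal/day for body mass M in kilograms, so a 65 kg human would be expected to expend about 70 × 65^0.75 ≈ 1,600 kcal/day at rest. The adult human brain, at roughly 1.3–1.4 kg, accounts for on the order of 20–25% of that resting expenditure, several times the share typical of other anthropoid primates, which is why arguments about encephalization in early Homo so often turn on where the additional dietary energy came from.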
Aiello and Wheeler's (1995) paper “The expensive tissue hypothesis” was a milestone because it provided such a clear link between metabolic energy needs for encephalization and morphology. They reasoned that since brain tissue has a high metabolic rate, a highly encephalized animal should have a higher than expected basal metabolic rate (BMR) for its body size; and since humans do not, something else had to have changed. They posited that a reduction in the size (and hence metabolic cost) of the gastrointestinal (GI) tract compensated exactly for the increased metabolic costs associated with a larger brain, and further that size reduction in the GI tract was made possible by the adoption of a higher quality diet, i.e. a diet lower in bulk (fiber) and higher in digestibility.
The question of cooking is also about the role of diet quality in supporting encephalization in early Homo. More specifically, it is about improving the digestibility of foods to make dietary energy more available. Although it has long been recognized that cooking is unique to our genus and hence of evolutionary significance, there was little more than a general interest in the topic until the past couple of decades. Stahl's (1984) paper on plant food selection by early Homo is a notable exception. She argued that a non-fire-using hominin would need to focus on plant foods high in soluble carbohydrates and low in starch because certain raw starches are relatively indigestible by modern humans. Aiello and Wheeler's (1995) proposal that cooking was a way to externalize part of the digestion process and make it less metabolically expensive is also an exception.
However, the actual effect of cooking on dietary energy availability received relatively little consideration until the work of Wrangham and colleagues (Carmody & Wrangham, 2009; Wrangham & Conklin-Brittain, 2003) focused attention on evidence for the beneficial effects of cooking on the net energy value of foods in human diets. Wrangham (2007, 2009, 2017) has argued for the significance of cooking in improving the digestibility, and hence energy value, of the diets of early Homo, specifically Homo erectus. Some evidence in support of this idea comes from experiments with a mouse model demonstrating that consuming cooked (oven roasted) versus raw tubers (sweet potatoes), oil seeds (peanuts), and meat (beef) resulted in improved body weight maintenance (Carmody et al., 2011; Groopman et al., 2014). Mice are not primates, and the sample sizes were extremely small, but their findings agree with research on contemporary humans demonstrating that cooking generally increases the digestibility of starchy foods (Wang & Copeland, 2013) and some nuts (Grundy et al., 2015); the research on cooked versus raw meats is inconclusive. Interestingly, the extent to which cooking affects starch gelatinization and hence improves digestibility is dependent on the amount of water used in cooking (Wang & Copeland, 2013), and it is likely that early Homo used dry roasting techniques. In another contribution to the question of cooking, Schnorr et al. (2015) demonstrated that modern-day Hadza hunter-gatherers use a brief roasting technique that does not consistently improve in vitro enzymatic digestibility. This unexpected result complicates the story and indicates that research is needed on a wide range of potential cooking techniques, and a wide range of potential foods.
While living humans are not a perfect model for early Homo in terms of either dentition or gut anatomy and function, in vivo experiments would be useful. Further, the argument that modern humans cannot survive on a raw food diet (Wrangham, 2009, and elsewhere) needs better empirical support in terms of well-constructed dietary trials. It seems obvious that cooking food became important at some point in our evolutionary history and signaled a significant shift in the use of food resources. The unresolved question is when this occurred and how the process developed. Although Wrangham (2017) has consistently associated cooking with the morphological changes in early Homo that we interpret as linked to a higher quality diet, the archaeological evidence for the regular use of fire postdates those changes. Perhaps regular fire use was not linked to morphological changes in early Homo, but to later changes that left few morphological traces, or perhaps cooking did not increase energy availability as much as proposed.
Colleagues in nutrition also argue that a high-quality diet was necessary to support the evolution of the large brain in Homo erectus, but they go further by proposing that aquatic foods would have been a necessary dietary component. This argument has focused largely on the biological need for docosahexaenoic acid (DHA), a long-chain polyunsaturated fatty acid predominant in neurological tissue (Brenna & Carlson, 2014). Although modern humans and nonhuman primates can synthesize DHA endogenously from shorter-chain fatty acid precursors, synthesis is slow. Given that preformed DHA is more abundant in the marine food chain than the terrestrial, Cunnane and Crawford (2014) argue that the availability of marine resources was critical to encephalization. This is an interesting argument that challenges the disciplinary tendency in anthropology to focus on terrestrial resources in reconstructing the diets of early Homo, and Tattersall (2014) has argued that it is time to broaden the traditional focus and consider the role aquatic foods may have played. Data are limited, but it is clear that the DHA content of aquatic foods varies by species and by type of aquatic ecosystem, as might be expected (Joordens et al., 2014). It is also clear that the DHA content of anthropoid breastmilk varies with diet, and is higher in both human and nonhuman primate mothers who consume aquatic foods (Milligan & Bazinet, 2008). However, optimal dietary intakes of DHA for adults and infants are not known. Further, although it is clear that achieving species-typical brain size does require minimum levels of DHA, it does not follow that greater availability of DHA will lead to greater brain size.
The second theme focuses on models of the diet of pre-agricultural Homo sapiens and the relevance of these models to modern health concerns. The first model of a Paleolithic diet, and the first claim that it was relevant to modern health concerns, was Eaton and Konner's (1985) paper, “Paleolithic nutrition: A consideration of its nature and current implications.” The article, published in a respected medical journal, The New England Journal of Medicine, generated considerable interest, and critique, in both medical (Leaf & Weber, 1987) and anthropological communities (Garn & Leonard, 1989).
It was the first in a series of papers arguing that we are genetically adapted to the characteristics of a Paleolithic diet—a diet based on minimally processed wild resources as compared to industrially processed foods—and that the changes in diet and lifestyle that began with the development of agriculture and animal husbandry have negatively affected our health. This basic argument has had considerable appeal beyond biological anthropology, as numerous publications in medical and nutritional journals, as well as the popular press, attest.
Konner and Eaton (2010) recently updated their first paper on the “Paleolithic diet” (Eaton & Konner, 1985). Their basic argument in this and other contributions (Eaton et al., 2010) has remained essentially the same: (1) we are genetically adapted to the characteristics of Paleolithic diets; (2) changes in diet and lifestyle that began with the development of agriculture and animal husbandry some 10,000 years ago were too recent for us to have adjusted to genetically; and (3) our failure to adapt to new dietary conditions is responsible for current problems of chronic disease. The Paleolithic diet originally described (Eaton & Konner, 1985) was based largely on the diets of extant or historically known hunter-gatherer populations (!Kung San, Hadza, Australian Aborigines, Inuit, and Tasaday). Unfortunately, our understanding of these diets is constrained by the semi-quantitative and qualitative nature of most of the dietary intake data, which leads to uncertainty in quantitative estimates. It is also constrained by the limited eco-geographic and seasonal variation represented by the populations used. Acknowledging these limitations, the Paleolithic model demonstrates the broad pattern of nutrient intake on a diet based on naturally occurring food resources, and this broad pattern appears to be quite different from a diet based on modern industrialized foods.
Ulijaszek, Mann, and Elton (2012) take a broader, deeper view of the evolution of the human diet in their book Evolving Human Nutrition: Implications for Public Health. They argue that the “natural” human diet was shaped across millions of years of evolutionary time before the Paleolithic period, and that there is not, nor has there ever been, a “set” human diet. Rather, they argue that human nutritional history has been characterized by dietary diversity and flexibility, both of which were essential for coping with environmental seasonality and expansion into new geographic settings. As the book underscores, the human diet continues to evolve, and the authors’ goal is to bring an understanding of this evolving human diet to public health nutrition practices.
Nutrition transitions are of interest because they alter the context in which humans live, presenting them with new suites of stressors while relaxing others. On the broad scale, nutrition transitions are associated with changes in political, economic, and social structures. These, in turn, are associated with more proximal-level factors, such as diet (the focus of this discussion), activity patterns, demography, and disease exposure. Biological anthropologists have focused on understanding how these changing contexts influence human biological variation, especially in terms of health and morphology.
The shifts in dietary patterns that accompanied the adoption of plant and animal domestication during the Neolithic have been a topic of considerable interest among biological anthropologists, and the AJPA has served, and continues to serve, as a key venue for these discussions. An enduring question that guides this research is: How did the transition to agriculture affect human health and morphology? A large body of literature published over the past 40–50 years indicates that dietary changes in the Neolithic negatively affected human health and were associated with reduced skeletal robusticity (Cohen & Armelagos, 1984; Larsen, 1995, 2006).
Many recent studies continue to support earlier findings. For example, Mummert et al. (2011) conclude in their review that data published after Cohen and Armelagos (1984) generally confirm past research indicating plant and animal domestication had a negative impact on adult stature. Temple and Larsen (2007) found poorer oral health among agricultural versus hunter-gatherer groups in Japan. Data collected by May and Ruff (2016) on femoral morphology among Natufian hunter-gatherers and Neolithic farmers in the Levant support previous studies indicating a decline in skeletal robusticity with the transition to agriculture. And, Noback and Harvati (2015), in their analysis of 3D landmark datasets of crania from populations with known subsistence patterns, found some expected correlations between diet and cranial shape, but point out that the most significant differences were between populations dependent on plant- vs. animal-based diets, rather than between foragers and agriculturists.
In the past decade, research on the transition to agriculture in the Neolithic has expanded into new geographic zones, and researchers have increasingly adopted a biocultural approach in study design and data interpretation, as advocated by Goodman and Leatherman (1998). Additionally, utilization of the latest technologies and genetic data has allowed biological anthropologists to test new research questions, including the possibility that the foods associated with animal and plant domestication served as selective forces shaping modern genetic variation, both human and microbial. These advances have complicated our understanding of the impact of dietary change in the Neolithic on human health and morphology and, in doing so, are pushing the field forward. We argue that the most important contributions of this recent scholarship are: (1) biocultural interpretations of observed inter- and intra-population variation in health with plant and animal domestication; (2) use of stable isotopes to refine our understanding of variation in the pace and pattern of dietary change across geographic regions; and (3) evidence of human genetic adaptations to domesticated foods.
A good example of a biocultural interpretation is the work of Willis and Oxenham (2013) on the decline in oral health associated with the adoption of agriculture in Southeast Asia. Their analysis of intra-population variation, as well as knowledge of the cariogenic properties of the staple crop, rice, led them to suggest that demographic change, particularly the increase in fertility that accompanied the transition to agriculture, rather than dietary changes per se, best explained the dental patterns observed. As pregnancy is associated with depression of the immune system, a higher fertility rate would place females at increased risk of poor oral health, regardless of changes in diet (Lukacs, 2008).
Eshed, Gopher, Pinhasi, and Hershkovitz (2010) also argue that factors other than diet accounted for the observed changes in health status with the transition to agriculture in the Levant. These studies employ a biocultural framework in that they draw attention to intra-population variation and consider how the biological and sociocultural changes that accompanied shifts in diet during the Neolithic placed some individuals at greater risk of poor health than others. These biocultural interpretations challenge earlier arguments that plant and animal domestication had a uniformly negative effect on human health and compel us to consider alternative interpretations of the data.
The use of stable isotopes to study past diets across a range of ecological contexts has significantly refined our understanding of dietary change in the Neolithic. While too coarse to identify specific foods, stable isotopes are providing new insights into the pace and pattern of dietary change. For example, while dietary changes associated with the Neolithic were abrupt in some regions (Richards et al., 2003; Schulting & Richards, 2002), recent research from the Mediterranean (Lelli et al., 2012) and South America (Santana-Sagredo, Lee-Thorp, Schulting, & Uribe, 2015) provides evidence of more gradual change, in which local diets included foraged, hunted, and fished resources, along with newer agricultural products, over an extended period of time. Santana-Sagredo et al. (2015), in their analysis of carbon and nitrogen isotopes from the bone and dentin collagen of individuals from the Tarapacá cemetery (1000 BC–AD 900) located in the Atacama Desert, showed that people continued to rely on gathered plant foods, as well as marine resources obtained through trade with coastal groups, even with the adoption of maize. These studies, among others, including those in the recent volume Human Bioarchaeology of the Transition to Agriculture, edited by Pinhasi and Stock (2011), highlight the importance of the local context in shaping dietary transitions.
Advancements in genetics provide compelling new evidence that foods associated with the adoption of agriculture and animal husbandry in the Neolithic acted as a recent selective force shaping human genetic variation. Two key studies come to mind. The first is the work of Perry et al. (2007) on the salivary amylase gene, which codes for the enzyme required for starch hydrolysis. They found significantly more copies of the gene in groups historically reliant on starchy staple foods (e.g., corn, millet, rice, sorghum, tubers, and wheat), such as the Japanese, European Americans, and Hadza hunter-gatherers, as compared to pastoral groups, such as the Datog and Yakut, who are more dependent on animal products. The second is the work of Tishkoff et al. (2007), who considered milk, specifically lactose (milk sugar), as an environmental stressor. Considering both haplotype frequencies and phenotypic expression, they found significant differences in the persistence of lactase production among 43 different ethnic groups from Kenya, Sudan, and Tanzania based on their historical dependence on dairying.
Researchers are using the newest genetic technology and data to explore the effects of dietary change in the Neolithic on variation in the human oral microbiome. Adler et al. (2013) used samples of dental calculus collected from 34 European skeletons spanning a ∼7,500-year period from the Mesolithic (hunter-gatherer) to medieval periods (rural and urban) to identify changes in the oral microbiome with the transition to agriculture. They found a distinct shift in the bacterial composition of the oral microbiota with the adoption of an agriculture-based diet. In addition, the frequency of S. mutans, a highly cariogenic species, was significantly higher in the modern samples. The authors argue this is indicative of increased consumption of processed foods that began during the Industrial Revolution (∼200 years ago). While the collection of data on the composition of the gut microbiome of Neolithic populations is not possible, a recent study comparing BaAka pygmies, who continue to rely on hunting and gathering, with Bantu agriculturists revealed significant differences between the two groups reflective of their different dietary ecologies (Gomez et al., 2016).
The diet and nutrition of contemporary human populations emerged as topics of significant interest in biological anthropology in the 1970s and 1980s. The interest grew from two complementary threads. One was the development of a strong focus on adaptation to the environment, a focus stimulated in part by participation in the Human Adaptability (HA) component of the decade-long International Biological Program (1964–1974). In this focus, nutrition was conceptualized as a key component of human adaptation to the environment, and diet was viewed as a potential resource as well as a potential stressor (Haas & Harrison, 1977). It is worth noting that in the 1970s and 1980s there was also interest in the role of nutritional adaptation across all of anthropology. This interest was reflected in an influential edited volume entitled Biosocial Interrelations in Population Adaptation (Watts, Johnston & Lasker, 1975), which brought together anthropologists from all subfields.
Another thread in the developing interest in nutrition in biological anthropology was the focus on undernutrition and nutritional deficits. This thread is represented in the edited volume, Social and Biological Predictors of Nutritional Status, Physical Growth and Neurological Development (Greene & Johnston, 1980). As Johnston makes clear in his introductory chapter, the cause of nutritional deficiencies has less to do with per capita food availability at the national level than with food distribution to individuals at the local level. Hence, understanding the complexities of sociocultural factors at the local level is key. The volume brings together issues related to the assessment of nutritional status, as well as analytical and conceptual advances in assessing the social and biological predictors of nutritional status. The assessment of nutritional status in biological anthropology is typically in terms of anthropometric measurements of body size (height and weight) and composition (fat versus lean tissue), and that was reflected in the volume. Many of the analytical advances presented were in the use of multivariate statistical techniques to examine the many potential correlates of nutritional status.
Two papers were particularly influential. One was Cassidy's (1980) thoughtful biocultural analysis of the many factors that can contribute to toddler malnutrition. Her provocative idea of benign neglect, i.e. caregiver practices embedded in cultural beliefs and customs that can indirectly potentiate toddler malnutrition, was a conceptual advance in explaining toddler malnutrition in biocultural rather than simply biomedical terms. The other influential paper was Greene's (1980) contribution on the complex interrelationships among biological and cultural factors that affected neurological function in a highland Ecuadorian population where protein-calorie malnutrition and iodine deficiency were prevalent. A strength of Greene's analysis was his ethnographic understanding of the population, which he acknowledged as critical to his ability to interpret the complexity of the data, especially the developmental data. Indeed, the papers by both Cassidy and Greene argue for the necessity of ethnographic data in understanding health.
These threads continued to frame the broad range of nutrition-related interests of biological anthropologists into the 1990s. Many of those interests coalesced around responses to nutritional deficits, and especially the energy deficits so often observed in small-scale, subsistence-level populations. A classic example of biobehavioral responses to energy deficits is captured by research on the Turkana (Leslie et al., 1999). The Turkana are pastoralists who, at the time of the study, were living in a dry savannah ecosystem in Kenya. This was an ecosystem subject to strong seasonal variation in rainfall, which translated into seasonal variation in forage for animals, and hence food for humans. Researchers were able to demonstrate both the biological (body weight and child growth) and behavioral (low levels of physical activity) responses to seasonal low energy availability. In doing so, they were able to explain the role nutrition played in human adaptation to the dry savannah ecosystem (Leslie et al., 1999).
Another set of interests developed around issues of diet, nutrition, and health observed in populations as they modernized and adopted the diets and physical activity patterns associated with “Westernization.” Early work included the long-running research program among Samoans. For example, by studying Samoan populations living in Western Samoa, American Samoa, and Hawaii, Bindon (1982, 1984) documented the decline in consumption of traditional foods and their replacement with Western food items, as well as a decline in the nutritional adequacy of the diet with increased levels of modernization. Using a similar design, Galanis et al. (1999) documented a rise in consumption of foods associated with cardiovascular disease with modernization and increases in income. Other studies have documented differences in the rates of obesity and risk of chronic disease among Samoans living in these three different contexts (Crews, 1988; McGarvey et al., 1993).
