Abstract

Distributions of cut and tooth marks on the bones of large animals found in archaeological sites are increasingly used as sources of inference about the relative importance of hunting and scavenging in early human diets, and (by extension) about the role of meat-eating in human evolution. Here we review the empirical basis for these inferences in light of ethnoarchaeological data from the Tanzanian Hadza, a modern East African foraging population. Comparison of the Hadza data with those produced by other actualistic work indicates that while there may be a relationship between cut and tooth mark distributions and order of consumer access (human- versus carnivore-first), it is less clear-cut than many have suggested. Application of these results to the analysis of Plio-Pleistocene archaeological collections is further complicated by inconsistencies in the ways cut and tooth marks have been defined and counted, and by significant differences between patterns observed in modern control samples and those reported at ancient sites. These observations indicate that cut and tooth mark analyses are unlikely to speak effectively to questions about early human carnivory in the absence of: (1) better-warranted, more comprehensive expectations about the potential range of variation in past human carcass acquisition strategies, (2) a larger, more rigorously designed set of control experiments that model the archaeological consequences of these strategies, and (3) a larger, more consistently analysed archaeological database. Even if these requirements are met, the idea of meat-eating as an important catalyst in the evolution of early humans will remain highly problematic, mainly because of unresolved questions about the frequency and short-term reliability of carcass access.
