Stable isotope analysis of carbon (C), nitrogen (N), hydrogen (H), and oxygen (O) in archaeological bone has become an increasingly common research method for interpreting human behavior in the past. However, diagenesis of skeletal material can invalidate stable isotope ratios, thereby compromising interpretations. We examine patterns of bone diagenesis using infrared spectroscopy of bone bioapatite samples in relation to indicators of organic preservation quality. We assess crystallinity, an indicator of bioapatite preservation, using the infrared splitting factor (IR-SF) and carbonate content (carbonate to phosphate ratio, C/P) calculated from infrared spectra. We then test the assumption that if the organic fraction of bone is preserved, the mineral fraction is also unaffected by postmortem chemical alteration. We analyzed 454 bone bioapatite and extracted bone organic sample pairs from modern, historic, and prehistoric humans. Consistent with previous studies, we observed a strong, statistically significant negative linear relationship between IR-SF and C/P (r = −0.855, p < 0.001). Modern bone bioapatite samples unaltered by diagenesis have low IR-SF and high C/P values. There was no significant association between collagen yield or atomic C:N ratio and either IR-SF or C/P values. The range of variation in IR-SF and C/P values for samples with organic yields from 0 to over 20 percent spanned the range of modern bone bioapatite unaltered by diagenesis as well as bone bioapatite significantly affected by diagenesis. The lack of predictive patterning between bone inorganic and organic diagenesis suggests that the depositional context and site formation history play critical and independent roles in the preservation of the organic and mineral fractions of bone. Thus, the preservation of the organic fraction of bone is not predictive of the preservation of the inorganic fraction (i.e., bioapatite).
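To make the two spectral indices concrete, the sketch below shows how IR-SF and C/P are commonly derived from a baseline-corrected FTIR absorbance spectrum, and how the reported association between them would be tested with a Pearson correlation. The abstract does not give the exact band positions used in the study; the wavenumbers, function names, and workflow here are assumptions based on conventions in the FTIR bone-diagenesis literature, not the authors' own protocol.

```python
import numpy as np
from scipy.stats import pearsonr

# Assumed band positions (cm^-1), commonly cited in the FTIR literature;
# the study's actual wavenumbers may differ.
PHOSPHATE_V4_PEAKS = (565, 605)   # split phosphate nu4 absorption bands
PHOSPHATE_V4_VALLEY = 590         # trough between the two nu4 bands
CARBONATE_PEAK = 1415             # carbonate nu3 band
PHOSPHATE_V3_PEAK = 1035          # phosphate nu3 band

def absorbance_at(wavenumbers, absorbances, target):
    """Absorbance at the spectral point closest to a target wavenumber."""
    idx = np.argmin(np.abs(np.asarray(wavenumbers) - target))
    return absorbances[idx]

def infrared_splitting_factor(wavenumbers, absorbances):
    """IR-SF (crystallinity index): sum of the two phosphate nu4 peak
    heights divided by the height of the valley between them."""
    a1 = absorbance_at(wavenumbers, absorbances, PHOSPHATE_V4_PEAKS[0])
    a2 = absorbance_at(wavenumbers, absorbances, PHOSPHATE_V4_PEAKS[1])
    valley = absorbance_at(wavenumbers, absorbances, PHOSPHATE_V4_VALLEY)
    return (a1 + a2) / valley

def carbonate_to_phosphate(wavenumbers, absorbances):
    """C/P: carbonate nu3 peak height relative to the phosphate nu3 peak."""
    return (absorbance_at(wavenumbers, absorbances, CARBONATE_PEAK)
            / absorbance_at(wavenumbers, absorbances, PHOSPHATE_V3_PEAK))

def irsf_cp_correlation(irsf_values, cp_values):
    """Pearson correlation between per-sample IR-SF and C/P values
    (the study reports r = -0.855, p < 0.001 across 454 sample pairs)."""
    return pearsonr(irsf_values, cp_values)
```

Under these assumptions, higher IR-SF (increased crystallinity) and lower C/P (carbonate loss) would both indicate diagenetic alteration of the mineral fraction, which is consistent with the negative relationship the abstract describes.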