Abstract

Spectroscopic indicators of bone crystallinity such as the infrared splitting factor (IRSF) are commonly used to determine the general state of preservation of ancient bone. In principle, such indices might be expected to act as a proxy for alteration of bone mineral and thus could be used to screen bones (or portions of bones) for likely preservation of in vivo biogenic trace element and stable isotope signals. We tested the relationship between IRSF and bone mineral composition in two suites of well-characterised recent and Pleistocene bones. Initially, crystallinity change and trace element uptake are correlated, apparently both controlled by decomposition of the organic phase and exposure of bone crystal surfaces. This relationship breaks down in older bones, where authigenic phosphate growth and mineral–pore water interactions are no longer rate-limited by the breakdown of collagen and the exposure of crystal surfaces. Under these conditions the extent of chemical alteration of bone is controlled by site-specific conditions, and thus, while FTIR spectra of bone provide a broad indication of organic content and apatite recrystallisation, they are not reliable proxies for the degree of diagenetic alteration of biogenic geochemical signals.
