Abstract

The calibration of radiocarbon measurements is based on a number of mathematical assumptions that are rarely considered by users of the various available calibration programs. As 14C ages take on mathematical properties best known from quantum physics, a quantum theoretical approach provides a useful basis for evaluating the reliability of calibration and of Bayesian modelling of radiocarbon datasets. We undertake such an evaluation here through a consideration of the mathematics of calibration and of the normalization process, and through an archaeological case study. We demonstrate that the normalization function deemed necessary for 14C histogram shape-correction is identical to the default prior widely used in Bayesian calibration. We highlight flaws in default Bayesian calibration algorithms that may affect archaeological studies which rely heavily on high calibration precision, especially when based on relatively small (N<100) sample sizes. The observed differences between algorithms have consequences for radiocarbon models that claim sub-generational (~25–30 calendar years) precision.
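
As an illustration of the calibration and normalization steps referred to above, the following minimal sketch (Python/NumPy) maps a single 14C determination onto a toy calibration curve and optionally applies the shape-correcting normalization. The function name `calibrate`, the parameter names, and the toy curve are assumptions introduced for this example only; they do not reproduce the algorithm of OxCal, CALIB, or any other specific calibration program.

```python
import numpy as np

def calibrate(c14_age, c14_error, cal_bp, curve_c14, curve_error, normalize=True):
    """Return a (optionally normalized) calibrated density on the cal_bp grid.

    Hypothetical helper for illustration; not the method of any named program.
    """
    # Combine measurement and calibration-curve uncertainty at each calendar year.
    sigma2 = c14_error ** 2 + curve_error ** 2
    # Gaussian likelihood of the 14C measurement given each candidate calendar year.
    density = np.exp(-(c14_age - curve_c14) ** 2 / (2.0 * sigma2)) \
              / np.sqrt(2.0 * np.pi * sigma2)
    if normalize:
        # Shape-correcting normalization: rescale so the calibrated density
        # integrates to 1 over the calendar range covered by the curve.
        density = density / np.trapz(density, cal_bp)
    return density

# Toy example: an assumed linear "calibration curve" over 4000-5000 cal BP.
cal_bp = np.arange(4000, 5001)                          # calendar years BP
curve_c14 = 0.9 * cal_bp + 300.0                        # assumed 14C ages on the curve
curve_error = np.full(cal_bp.shape, 15.0)               # assumed curve uncertainty (14C yr)

pdf = calibrate(c14_age=4350, c14_error=30, cal_bp=cal_bp,
                curve_c14=curve_c14, curve_error=curve_error)
print(pdf.sum() * (cal_bp[1] - cal_bp[0]))              # ~1.0 after normalization
```

Running the same call with `normalize=False` leaves the raw likelihood, whose integral depends on how much of the calibration curve the measurement overlaps; the normalized and unnormalized forms differ only by this rescaling, which is the point at which the normalization function and the default Bayesian prior coincide in the argument summarized above.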
