Modern ultra-trace analytical methods, coupled with magnetic sector ICP-MS (HR-ICP-MS), were applied to the determination of a large suite of major and trace elements in Iron Age bones. The high sensitivity and unparalleled signal-to-noise characteristics of HR-ICP-MS enabled the accurate measurement of Ag, Al, As, Ba, Ca, Cd, Ce, Co, Cr, Cu, Fe, La, Li, Mg, Mn, Ni, P, Pb, Pt, Rb, Sr, U, V, and Zn in small bone sections (< 75 mg). Critically, HR-ICP-MS effectively resolved molecular interferences that would likely have compromised data generated with quadrupole-based ICP-MS instruments. Contamination and diagenetic alteration of ancient bone are grave concerns which, if not properly addressed, may result in serious misinterpretation of data from bone archives. Analytical procedures and several chemical and statistical methods, including principal components analysis (PCA), were studied to assess their utility in identifying and correcting bone contamination and diagenetic alteration. Uncertainties in bone (femur) sampling were characterized for each element, and longitudinal variation was found to be the dominant source of sampling variability. However, the longitudinal variation in most trace element levels was relatively modest, ranging between 9 and 17% RSD. Bone surface contamination was evaluated using sequential acid leaching: calcium-normalized metal levels in brief, timed, dilute nitric acid leaches were compared with similarly normalized interior core metal levels to assess the degree of surface enrichment. A select group of metals (Mn, Co, Ni, Ag, Cd, and Pt) were enriched by up to a factor of 10 at the bone surface, indicating that these elements may carry a higher contamination component. The sequential acid leaching experiments nevertheless showed that a single acid leaching step was effective in removing most surface-enriched contaminants. While the leaching protocol removed contaminants associated with the bone surface, potentially significant residual levels of soil-sourced contaminant tracers remained within the leached bone. To address this issue, a mathematical procedure based on metal/aluminum ratios was developed to correct for the soil-derived contaminant metal pools. Soil correction fractions for the primary anthropogenically mobilized metals evaluated were greatest for Pb (13.6%), followed by As (4.4%), Ag (3.9%), and Cd (0.94%). Although median soil corrections were typically low, many individual samples required a much larger correction; thus, both bone cleaning and soil correction may be necessary to obtain accurate endogenous bone elemental data. The PCA results were remarkably consistent with the outcomes of the chemical and elemental-ratio protocols evaluated in the study, and suggest that loadings on certain factors will be helpful in screening for soil-biased samples and in identifying diagenetically altered bone. Application of these contamination evaluation and correction tools was made possible by the high-quality, multi-element datasets produced by HR-ICP-MS. Large variations in bone core concentrations among the 80 Iron Age specimens examined were observed for all the primary trace elements and for many of the supporting elements, even after correction for major contaminant components.
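To make the sampling-variability figure concrete, the sketch below (not from the paper; variable names and replicate values are hypothetical) shows how a per-element percent RSD could be computed from replicate concentrations measured along the length of one femur.

```python
# A minimal sketch of longitudinal sampling variability expressed as % RSD,
# assuming `sections` holds replicate concentrations (ug/g) measured at
# several positions along a single femur. Values are hypothetical.
import numpy as np

def percent_rsd(concentrations: np.ndarray) -> float:
    """Relative standard deviation, in percent, of replicate measurements."""
    return 100.0 * np.std(concentrations, ddof=1) / np.mean(concentrations)

sections = np.array([118.0, 131.0, 125.0, 142.0, 120.0])  # e.g., Sr, ug/g
print(f"longitudinal variability: {percent_rsd(sections):.1f}% RSD")
```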
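The surface-enrichment comparison reduces to a ratio of calcium-normalized metal levels in the timed dilute-nitric-acid leach versus the interior core. A minimal sketch, under assumed variable names and hypothetical masses, is:

```python
# (Me/Ca)_leach divided by (Me/Ca)_core; values well above 1 suggest
# surface contamination. Input quantities below are hypothetical.
def enrichment_factor(metal_leach, ca_leach, metal_core, ca_core):
    """Ca-normalized surface enrichment factor for one metal."""
    return (metal_leach / ca_leach) / (metal_core / ca_core)

# Hypothetical Mn example: metal in ug, Ca in mg, for leachate and core.
ef_mn = enrichment_factor(metal_leach=0.85, ca_leach=120.0,
                          metal_core=0.43, ca_core=610.0)
print(f"Mn surface enrichment factor: {ef_mn:.1f}")  # ~10x enrichment
```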
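The metal/aluminum soil correction can be sketched as follows. This is not the authors' exact formulation: it assumes that Al in the leached bone is entirely soil-derived and that a representative soil Me/Al ratio is available, so the soil-sourced pool of a metal Me is Al_bone × (Me/Al)_soil. The numbers are hypothetical, chosen to reproduce a correction on the scale of the quoted Pb median.

```python
# Al-referenced soil correction: subtract the soil-derived pool of a metal
# estimated from bone Al and an assumed local soil Me/Al mass ratio.
def soil_corrected(metal_bone, al_bone, metal_al_soil):
    """Return (corrected metal concentration, soil fraction of measured metal)."""
    soil_pool = al_bone * metal_al_soil   # soil-derived metal, same units as input
    fraction = soil_pool / metal_bone     # share of the measurement due to soil
    return metal_bone - soil_pool, fraction

# Hypothetical Pb example (ug/g) with an assumed soil Pb/Al ratio:
corrected_pb, f_soil = soil_corrected(metal_bone=4.2, al_bone=95.0,
                                      metal_al_soil=0.006)
print(f"corrected Pb = {corrected_pb:.2f} ug/g ({100*f_soil:.1f}% soil-sourced)")
```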
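Finally, the PCA-based screening idea can be illustrated with a short sketch, assuming scikit-learn is available. The data matrix, the component associated with soil input, and the score threshold are all assumptions for illustration, not the study's actual factor structure.

```python
# A minimal sketch of screening for soil-biased samples with PCA: samples
# with extreme scores on a component where soil tracers (e.g., Al, Fe,
# rare earths) load together are flagged for re-cleaning or soil correction.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical matrix: 80 bone samples x 24 element concentrations.
X = rng.lognormal(mean=1.0, sigma=0.5, size=(80, 24))
X_std = StandardScaler().fit_transform(np.log10(X))

pca = PCA(n_components=4)
scores = pca.fit_transform(X_std)
loadings = pca.components_  # rows: components, columns: elements

soil_pc = 0                 # index of the assumed soil-associated component
flagged = np.where(np.abs(scores[:, soil_pc]) > 2.5)[0]
print("samples to review:", flagged)
```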