Abstract

Handheld energy dispersive X-ray fluorescence (hhEDXRF) spectrometers provide a rapid, non-destructive means of quantifying the elemental composition of rocks, and advances in hhEDXRF technology have led to their widespread use in the geological sciences. Most popular hhEDXRFs, however, lack lithology-specific calibrations and consequently acquire datasets with high precision but, in many cases, unknown or low accuracy. This study derives a methodology for creating an internal regression-based mudrock calibration for a hhEDXRF analyzer, with the aim of increasing the analytical quality of elemental data collected from core and outcrop samples. We utilize 64 compositionally diverse Middle Devonian mudrock samples from four cored locations within the central Appalachian basin, analyzed on both a hhEDXRF and a quantitatively superior bench-top wavelength-dispersive XRF (WDXRF) spectrometer. Using linear regressions, hhEDXRF values were calibrated to the WDXRF values for individual elements. Measurements from the two XRF modes record the following coefficients of determination: Zn, Mo, and Ca (r² > 0.99); Al, K, Ti, Fe, Ni, V, Cu, and Pb (r² > 0.92); Zr and Si (r² > 0.78); and Cr (r² = 0.56). To assess precision, relative standard deviations (RSD) were calculated for each element from repeated measurements on internal reference samples. All elements recorded RSD values of <8.8%, except Cr, which recorded values of 14–25%. Magnesium (Mg), Cl, and Ag were below the limit of detection of the hhEDXRF, so RSD values could not be computed. To assess hhEDXRF accuracy, we propose a novel “iterative blind test” approach, which quantifies accuracy as the percentage difference between the calibrated hhEDXRF value from the blind test and the WDXRF value for each element. Si, Ti, Al, Fe, Ni, Cu, Zn, and Zr record percentage difference values of <10%. For Mn, Ca, Cr, V, Mo, and Pb, a minority of samples (<13% on average) record concentrations below their reliable detection limit (RDL); excluding those measurements, these elements record percentage difference values of 6–15%. Potassium yielded measurements outside of the RDL and was the only element to return percentage difference values of >20%, beyond the acceptable limit. Compared with previous approaches, the internal mudrock calibration produces datasets of high analytical quality that support more robust analyses.
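The workflow summarized above reduces to three small computations per element: a least-squares line mapping hhEDXRF readings onto WDXRF values (with r² as the fit statistic), an RSD for precision, and a percentage difference for accuracy. The Python sketch below illustrates one pass of this workflow. It is a minimal illustration under stated assumptions, not the authors' code: all numeric values are hypothetical, and the use of numpy is an assumption, since the abstract does not name any software.

import numpy as np

# Hypothetical paired measurements (ppm) for a single element, e.g. Zn.
# These values are illustrative only, not data from the study.
hhedxrf = np.array([120.0, 245.0, 310.0, 480.0, 655.0, 890.0])
wdxrf   = np.array([135.0, 260.0, 335.0, 505.0, 690.0, 920.0])

# Fit a least-squares line calibrating hhEDXRF readings to WDXRF values.
slope, intercept = np.polyfit(hhedxrf, wdxrf, deg=1)

# Coefficient of determination (r²) for the fit.
predicted = slope * hhedxrf + intercept
ss_res = np.sum((wdxrf - predicted) ** 2)
ss_tot = np.sum((wdxrf - wdxrf.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot

# Precision: relative standard deviation (RSD, %) of repeated hhEDXRF
# measurements on an internal reference sample (hypothetical replicates).
replicates = np.array([248.0, 252.0, 251.0, 246.0, 253.0])
rsd = 100.0 * replicates.std(ddof=1) / replicates.mean()

# Accuracy: percentage difference between a calibrated hhEDXRF value
# (held out of the regression in the blind test) and its WDXRF value.
blind_hhedxrf, blind_wdxrf = 400.0, 428.0
calibrated = slope * blind_hhedxrf + intercept
pct_diff = 100.0 * abs(calibrated - blind_wdxrf) / blind_wdxrf

print(f"calibration: WDXRF ≈ {slope:.3f} * hhEDXRF + {intercept:.2f} (r² = {r_squared:.3f})")
print(f"precision:   RSD = {rsd:.2f}%")
print(f"accuracy:    percentage difference = {pct_diff:.2f}%")

In the iterative blind test described above, this hold-out comparison would presumably be repeated with each sample withheld in turn; the sketch shows only a single iteration.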
