Abstract

Dual-energy computed tomography (CT) scanning is a rapidly emerging imaging technique for the nondestructive evaluation of a wide range of materials, and CT has been used to characterize rocks and to visualize multiphase flow through rocks for over 25 years. The most common dual-energy CT technique relies on homogeneous calibration standards to produce accurately decoupled data; calibration standards containing impurities therefore increase the probability of error in the reconstructed data and result in poor rock characterization. Laser-induced breakdown spectroscopy (LIBS) was used to determine impurity concentrations in a set of commercially purchased calibration standards used in dual-energy scanning for material identification with coal samples. Two calibration models were developed: univariate calibration with the internal ratio method, and multiple linear regression. Seven elements (Al, Fe, Mg, Na, Ni, Sr, and Ti) were examined in five different samples containing varying amounts of each element to compare the univariate and multivariate data analyses. The contaminant concentrations were also measured with a commercially available inductively coupled plasma optical emission spectroscopy (ICP-OES) instrument, and those data served as the reference in developing calibration curves for a modified version of the single linear regression model and for the multiple linear regression model.
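As a rough illustration of the two calibration strategies named above, the Python sketch below fits a univariate model to analyte/internal-standard intensity ratios and a multiple linear regression to several emission-line intensities jointly, against reference concentrations such as those from ICP-OES. All line choices, intensities, and concentrations here are hypothetical placeholders, not data or code from this study.

```python
# Minimal sketch of the two LIBS calibration models, on synthetic data.
# Every numeric value below is an illustrative assumption.
import numpy as np
from sklearn.linear_model import LinearRegression

# Reference contaminant concentrations (ppm), e.g. from ICP-OES (hypothetical).
c_ref = np.array([50.0, 120.0, 300.0, 650.0, 1000.0])

# Univariate calibration with the internal ratio method: the analyte line
# intensity is normalized by an internal-standard line before fitting a
# single-variable linear model, reducing shot-to-shot plasma variation.
i_analyte  = np.array([1.1e4, 2.6e4, 6.2e4, 1.35e5, 2.05e5])   # analyte line
i_internal = np.array([9.8e4, 1.01e5, 9.9e4, 1.02e5, 1.00e5])  # internal standard
ratio = (i_analyte / i_internal).reshape(-1, 1)
uni_model = LinearRegression().fit(ratio, c_ref)

# Multiple linear regression: several emission lines enter the model together,
# which can compensate for matrix effects that a single intensity ratio misses.
X = np.column_stack([
    i_analyte,
    np.array([5.0e3, 1.2e4, 3.1e4, 6.4e4, 9.9e4]),  # second analyte line (hypothetical)
    i_internal,
])
mlr_model = LinearRegression().fit(X, c_ref)

# Predict the contaminant concentration of an unknown calibration standard.
print(uni_model.predict(np.array([[1.5]])))                  # from intensity ratio
print(mlr_model.predict(np.array([[8.0e4, 4.0e4, 1.0e5]])))  # from multiple lines
```

In both cases the fitted slope and intercept define the calibration curve; model quality would typically be compared via metrics such as R² and prediction error on held-out standards.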
