Abstract

Background: Laser-induced breakdown spectroscopy (LIBS) is widely applied across many fields, but accuracy issues limit its further development. Signal uncertainty is the main factor affecting the accuracy of LIBS measurements, yet the uncertainty caused by different plasmas exhibiting different radiation attenuation rates during the integration time is often neglected. A method is needed to correct LIBS signals by quantifying the radiation attenuation rate.

Results: To reduce the uncertainty arising from differing plasma attenuation rates, the attenuation rate of the energy-level radiation emitted by a plasma is described by an attenuation coefficient, obtained by linearly fitting the logarithm of the time series of line intensities. Calibration curves for four major elements in seven standard samples were corrected using these attenuation coefficients. The corrected line intensities showed better linearity with elemental concentrations.

Significance: This study is important for improving the accuracy of LIBS measurements and for modeling the radiative attenuation of laser-induced plasma. The method is expected to be applicable to spectrometers that can acquire time-series spectra of the same plasma, improving the accuracy of in-situ, fast LIBS analysis.
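To illustrate the fitting step the abstract describes, the sketch below estimates an attenuation coefficient by linearly fitting ln(I) against time, under the simple assumption of exponential decay, I(t) ≈ I0·exp(−k·t). The function name, the synthetic data, and the back-extrapolation correction at the end are illustrative assumptions, not taken from the paper itself.

    import numpy as np

    def attenuation_coefficient(times, intensities):
        """Estimate the attenuation coefficient k by linearly fitting
        ln(I) against time, assuming I(t) ~ I0 * exp(-k * t).
        Returns (k, ln_I0)."""
        slope, intercept = np.polyfit(times, np.log(intensities), 1)
        return -slope, intercept

    # Illustrative synthetic time series of one line intensity (arbitrary units).
    t = np.linspace(0.5, 5.0, 10)         # acquisition times within the integration window, in µs
    I = 1000.0 * np.exp(-0.8 * t)         # synthetic decay with true k = 0.8 per µs

    k, ln_I0 = attenuation_coefficient(t, I)
    print(f"attenuation coefficient k = {k:.3f} per µs")

    # One plausible use of k (an assumption here): extrapolate each measured
    # intensity back to t = 0 so that plasmas decaying at different rates
    # become comparable before building the calibration curve.
    I_corrected = I * np.exp(k * t)

Because the fit is linear in ln(I), a single least-squares regression per emission line suffices, which keeps the correction cheap enough for the fast in-situ analysis the abstract targets.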
