Abstract

Self-absorption of spectral lines is known to degrade the performance of analytical measurements via calibration-free laser-induced breakdown spectroscopy. However, the resulting error growth has not been clearly assessed. Here we propose a method to quantify the measurement error due to self-absorption, based on the calculation of the spectral radiance of a plasma in local thermodynamic equilibrium. Validated through spectroscopic measurements on a binary alloy thin film with a compositional gradient, the method shows that the loss of measurement performance due to self-absorption depends on the spectral shape of the analytical transition and on the intensity measurement method. Line-integrated intensity measurements of Stark-broadened lines thus enable accurate analysis, even at large optical thickness, provided the line width and plasma size are precisely known. The error growth due to self-absorption is significantly larger for line shapes dominated by Doppler broadening and for line-center intensity measurements. These findings represent a significant advance in compositional measurements via calibration-free laser-induced breakdown spectroscopy, as they enable straightforward selection of the most appropriate analytical lines.
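As a rough illustration of the radiative-transfer argument behind these claims, the sketch below (not from the paper; the function name self_absorption_factor and the homogeneous-column assumption are ours) computes the ratio of the measured to the optically thin intensity for a plasma in local thermodynamic equilibrium, where the emerging radiance is B(lambda, T)[1 - exp(-tau(lambda))] with tau(lambda) = tau0 * phi(lambda) and phi a peak-normalised line profile. It compares a Lorentzian (Stark-dominated) with a Gaussian (Doppler-dominated) profile, for both line-integrated and line-center measurements.

```python
import numpy as np

def self_absorption_factor(tau0, profile="lorentz"):
    """Ratio of measured to optically thin line intensity for a
    homogeneous plasma column in local thermodynamic equilibrium.

    tau0    : optical thickness at line center
    profile : "lorentz" (Stark-dominated) or "gauss" (Doppler-dominated)
    """
    # Dimensionless detuning x = (lambda - lambda0) / (half width at half maximum).
    x = np.linspace(-2000.0, 2000.0, 400001)
    if profile == "lorentz":
        phi = 1.0 / (1.0 + x**2)               # peak-normalised Lorentzian
    elif profile == "gauss":
        phi = np.exp(-np.log(2.0) * x**2)      # peak-normalised Gaussian
    else:
        raise ValueError(profile)
    tau = tau0 * phi
    emitted = 1.0 - np.exp(-tau)               # self-absorbed emission, per unit Planck radiance
    # Uniform grid: the ratio of sums equals the ratio of integrals.
    sa_integrated = emitted.sum() / tau.sum()  # line-integrated measurement
    sa_center = (1.0 - np.exp(-tau0)) / tau0   # line-center measurement
    return sa_integrated, sa_center

for tau0 in (0.1, 1.0, 10.0):
    for prof in ("lorentz", "gauss"):
        sa_int, sa_c = self_absorption_factor(tau0, prof)
        print(f"tau0 = {tau0:4.1f}  {prof:7s}  integrated: {sa_int:.3f}  line-center: {sa_c:.3f}")
```

Under these simplifying assumptions, the line-integrated factor for the Lorentzian profile decays much more slowly with optical thickness than for the Gaussian profile, and the line-center factor decays fastest of all, consistent with the abstract's conclusions. Knowing the line width and the plasma size amounts to knowing tau0, so the line-integrated intensity of a Stark-broadened line can in principle be corrected for self-absorption.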
