Laser-induced breakdown spectroscopy (LIBS) is an ideal real-time, online method for detecting minor elements in alloy steel, but the undesired self-absorption effect in the plasma must be corrected. Because the plasma electron temperature (T) and the product of the radiating particle number density and the absorption path length (Nl) determine the degree of self-absorption and thus the corrected spectral line intensity, a new temperature-iterative self-absorption correction method based on a plasma thermal-equilibrium radiation model is proposed to iteratively calculate and correct these two parameters. Compared with commonly applied self-absorption correction methods, this method offers simple programming, high computational efficiency, and independence from the availability and accuracy of Stark broadening coefficients. Quantitative analysis of Cu shows that the linearity of the calibration curves and the accuracy of the measured elemental content are significantly improved after self-absorption correction. In addition, the method directly yields accurate values of the radiating particle number density and absorption path length, which benefits both plasma diagnostics and quantitative analysis.
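To illustrate the general structure of a temperature-iterative self-absorption correction (not the authors' exact algorithm), the sketch below assumes a homogeneous-slab correction factor (1 - exp(-τ))/τ and a Boltzmann-plot temperature estimate; the line constants, absorption cross-sections, and the fixed column density Nl are illustrative placeholders, and the published method additionally updates Nl from the thermal-equilibrium radiation model.

```python
import numpy as np

# Illustrative line data (NOT real Cu constants): wavelength (nm),
# lower-level energy E_i (eV), upper-level energy E_k (eV), upper-level
# degeneracy g_k, transition probability A_ki (1/s), an assumed absorption
# cross-section sigma (cm^2), and the measured line intensity (a.u.).
LINES = np.array([
    # lam_nm, E_i,  E_k,  g_k, A_ki,   sigma,   I_measured
    [510.5,  0.00, 3.82, 4.0, 2.0e6,  8.0e-17, 1200.0],
    [515.3,  1.39, 5.50, 4.0, 6.0e7,  3.0e-17,  950.0],
    [521.8,  1.64, 6.20, 6.0, 7.5e7,  2.0e-17, 1450.0],
])

K_B = 8.617e-5  # Boltzmann constant, eV/K


def boltzmann_temperature(i_line, lam, e_k, g_k, a_ki):
    """Electron temperature (K) from a Boltzmann plot:
    ln(I*lam / (g_k*A_ki)) = -E_k/(k_B*T) + const."""
    y = np.log(i_line * lam / (g_k * a_ki))
    slope, _ = np.polyfit(e_k, y, 1)
    return -1.0 / (K_B * slope)


def slab_correction(tau):
    """Self-absorption factor (1 - exp(-tau))/tau for a homogeneous slab;
    dividing the measured intensity by it recovers the optically thin value."""
    return np.where(tau > 1e-8, -np.expm1(-tau) / np.maximum(tau, 1e-8), 1.0)


def iterative_correction(lines, nl=1.0e16, n_iter=30, tol=1.0):
    """Toy fixed-point loop: re-estimate T from the corrected intensities,
    rebuild a temperature-dependent optical depth, and re-correct the lines.
    Nl (column density, cm^-2) is held fixed in this sketch."""
    lam, e_i, e_k, g_k, a_ki, sigma, i_meas = lines.T
    i_corr = i_meas.copy()
    t_prev = np.inf
    for _ in range(n_iter):
        t_k = boltzmann_temperature(i_corr, lam, e_k, g_k, a_ki)
        # Optical-depth proxy: column density times cross-section, weighted by
        # the Boltzmann population of the absorbing (lower) level.
        tau = nl * sigma * np.exp(-e_i / (K_B * t_k))
        i_corr = i_meas / slab_correction(tau)
        if abs(t_k - t_prev) < tol:  # stop once T changes by less than tol kelvin
            break
        t_prev = t_k
    return t_k, tau, i_corr


if __name__ == "__main__":
    t_k, tau, i_corr = iterative_correction(LINES)
    print(f"converged T ~ {t_k:.0f} K, optical depths {np.round(tau, 3)}")
    print("corrected intensities:", np.round(i_corr, 1))
```

With these placeholder inputs the loop converges within a few iterations, and the strongly absorbed line (largest τ) receives the largest intensity correction, which is the behavior the calibration-curve linearity improvement relies on.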