Abstract

Laser-induced breakdown spectroscopy (LIBS) is well suited to real-time, on-line detection of minor elements in alloy steel, but the undesired self-absorption effect in the plasma must be addressed. Because the plasma electron temperature (T) and the product of the radiating particle number density and the absorption path length (Nl) determine the degree of self-absorption and therefore the corrected spectral line intensity, a new temperature-iterative self-absorption correction method based on a plasma thermal-equilibrium radiation model is proposed to iteratively calculate and refine these two parameters. Compared with commonly used self-absorption correction methods, this method offers simple implementation, high computational efficiency, and independence from the availability or accuracy of Stark broadening coefficients. The quantitative analysis of Cu shows that both the linearity of the calibration curves and the measurement accuracy of the elemental content are significantly improved by the self-absorption correction. In addition, the method directly yields accurate values of the radiating particle number density and absorption path length, which benefits plasma diagnostics and quantitative analysis.
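
The abstract only outlines the temperature-iteration idea; the Python sketch below illustrates the general kind of loop it describes, not the paper's actual algorithm. It assumes a homogeneous plasma slab in local thermal equilibrium, where the self-absorption factor of a line is SA = (1 - exp(-tau))/tau, the temperature is obtained from a Boltzmann plot, and all constants, the line profile, and the partition function are folded into a single effective N*l scale parameter (here treated as a given input rather than solved for). All function names and the simplified optical-depth expression are illustrative assumptions.

import numpy as np

K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K


def boltzmann_temperature(intensity, wavelength_nm, a_ki, g_k, e_k_ev):
    """Estimate T (K) from a Boltzmann plot: ln(I*lambda/(A*g)) vs upper-level energy E_k."""
    y = np.log(intensity * wavelength_nm / (a_ki * g_k))
    slope, _ = np.polyfit(e_k_ev, y, 1)   # slope = -1/(k_B * T)
    return -1.0 / (K_B_EV * slope)


def self_absorption_factor(tau):
    """SA = (1 - exp(-tau))/tau for a homogeneous slab; SA -> 1 as tau -> 0 (optically thin)."""
    tau = np.asarray(tau, dtype=float)
    return np.where(tau > 1e-6,
                    (1.0 - np.exp(-tau)) / np.maximum(tau, 1e-12),
                    1.0)


def iterative_sa_correction(i_meas, wavelength_nm, a_ki, g_k, e_k_ev,
                            f_ik, g_i, e_i_ev, nl_scale,
                            max_iter=50, tol=1.0):
    """
    Iteratively refine the plasma temperature and the self-absorption correction.

    nl_scale is an effective parameter proportional to the product N*l; constant
    factors, the line profile, and the partition function are absorbed into it
    here for brevity. This is an illustrative sketch under those assumptions.
    """
    i_corr = np.asarray(i_meas, dtype=float).copy()
    t_old = boltzmann_temperature(i_corr, wavelength_nm, a_ki, g_k, e_k_ev)
    for _ in range(max_iter):
        # Relative line-centre optical depth from the lower-level Boltzmann
        # population and the absorption oscillator strength f_ik.
        tau = nl_scale * f_ik * wavelength_nm**2 * g_i * np.exp(-e_i_ev / (K_B_EV * t_old))
        sa = self_absorption_factor(tau)
        i_corr = i_meas / sa                    # recover optically thin intensities
        t_new = boltzmann_temperature(i_corr, wavelength_nm, a_ki, g_k, e_k_ev)
        if abs(t_new - t_old) < tol:            # temperature converged (within tol kelvin)
            break
        t_old = t_new
    return t_new, i_corr, sa

In this sketch the temperature and the corrected intensities feed back into each other until the Boltzmann-plot temperature stabilises, which mirrors the "continuously calculate and correct" idea in the abstract; the published method additionally determines Nl itself rather than taking it as an input.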
