Abstract

The light emitted from a plasma created by the interaction of a laser beam with a sample surface is always subject to self-absorption (SA) in its cold outer region. This phenomenon degrades the precision and increases the uncertainty of quantitative analysis by laser-induced breakdown spectroscopy (LIBS). To overcome this inherent limitation, different methods of SA correction (SAC) have been proposed. In this work, LIBS was performed on a set of certified reference metal-alloy samples, and the corresponding quantitative calibration curves were plotted. The SAC was carried out using a method based on the ratio of the electron densities calculated from the Stark broadening of the absorbed line and of a non-absorbed line emitted by a low-concentration element present in the ambient air. The quality of the linear relationship between the normalized spectral line intensity and the analyte concentration was assessed by the coefficient of determination and the quality coefficient of the regression lines. Validation of this procedure by the measurement of two further standards confirms that the SAC improves the quantitative analysis, especially for singly ionized emission lines in LIBS.
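The density-ratio correction described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: it assumes the standard linear Stark relation FWHM = 2·w_s·(n_e/n_ref) and the commonly used power law SA = (n_e,apparent / n_e,true)^(1/α) with α ≈ −0.54 for a Lorentzian profile; the function names, the Stark width value, and the numerical inputs are all placeholders.

```python
def electron_density(fwhm_nm, stark_width_nm, n_ref=1e16):
    """Apparent electron density (cm^-3) from the Stark FWHM of a line.

    Assumes the linear relation FWHM = 2 * w_s * (n_e / n_ref), where
    w_s is the tabulated Stark half-width (nm) at the reference density.
    """
    return n_ref * fwhm_nm / (2.0 * stark_width_nm)


def sa_coefficient(ne_absorbed, ne_reference, alpha=-0.54):
    """Self-absorption coefficient (SA <= 1) from the density ratio.

    ne_absorbed:  apparent density from the (possibly) self-absorbed line.
    ne_reference: true density from a non-absorbed reference line,
                  e.g. one emitted by a trace element in the ambient air.
    """
    return (ne_absorbed / ne_reference) ** (1.0 / alpha)


def corrected_intensity(i_measured, sa):
    """Line intensity expected in the absence of self-absorption."""
    return i_measured / sa


# Illustrative numbers only: an absorbed line appears twice as broad
# as the non-absorbed reference, so its apparent density is doubled
# and SA comes out below 1, boosting the corrected intensity.
ne_abs = electron_density(0.20, 0.05)   # apparent, from absorbed line
ne_ref = electron_density(0.10, 0.05)   # true, from reference line
sa = sa_coefficient(ne_abs, ne_ref)
i0 = corrected_intensity(10.0, sa)
```

In this scheme the corrected intensities, rather than the raw ones, are what would be regressed against analyte concentration to build the calibration curves.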
