Abstract
The light emitted from a plasma created by the interaction of a laser beam with a sample surface is always subject to self-absorption (SA) in its cold outer region. This phenomenon degrades the precision and increases the uncertainty of quantitative analysis by laser-induced breakdown spectroscopy (LIBS). To overcome this inherent flaw, different methods of SA correction (SAC) have been proposed. In this work, LIBS was performed on a set of certified reference metal-alloy samples, and the corresponding quantitative calibration curves were plotted. The SAC was performed using a method based on the ratio of the electron densities calculated from the Stark broadening of an absorbed line and of a non-absorbed line emitted by a low-concentration element present in the ambient air. The quality of the linear relationship between the normalized spectral line intensity and the analyte concentration was assessed by the coefficient of determination and the quality coefficient of the regression lines. Validation of this procedure by the measurement of two additional standards confirms that the SAC improves the quantitative analysis, especially for singly ionized emission lines.
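The electron-density-ratio approach summarized above can be sketched as follows. This is an illustrative outline only, not the paper's implementation: the linear relation between Stark FWHM and electron density, the empirical exponent α ≈ −0.54 in Δλ_obs/Δλ_0 = SA^α (commonly used in SA-coefficient work), and all numerical values are assumptions introduced here for demonstration.

```python
# Hedged sketch of a Stark-broadening-based self-absorption correction.
# Assumptions (not from the abstract): linear Stark relation
# n_e = (FWHM / 2*w_s) * N_ref, and the empirical exponent alpha = -0.54
# relating observed line width to the SA coefficient.

def electron_density(fwhm_nm, stark_width_nm, n_ref=1e17):
    """Electron density (cm^-3) from a line's Stark FWHM, using the
    linear approximation n_e = (FWHM / 2*w_s) * N_ref, where w_s is
    the Stark half-width parameter tabulated at density N_ref."""
    return fwhm_nm / (2.0 * stark_width_nm) * n_ref

def sa_coefficient(ne_absorbed, ne_reference, alpha=-0.54):
    """SA coefficient (<= 1) from the ratio of the apparent density
    given by the absorbed analyte line to the density given by an
    optically thin reference line (e.g. from a trace element in air)."""
    return (ne_absorbed / ne_reference) ** (1.0 / alpha)

def corrected_intensity(i_measured, sa):
    """Recover the peak intensity expected without self-absorption."""
    return i_measured / sa

if __name__ == "__main__":
    # Hypothetical widths: the absorbed analyte line appears broader
    # than the optically thin reference line predicts.
    ne_ref = electron_density(fwhm_nm=0.050, stark_width_nm=0.030)
    ne_abs = electron_density(fwhm_nm=0.065, stark_width_nm=0.030)
    sa = sa_coefficient(ne_abs, ne_ref)
    print(f"SA = {sa:.3f}")                               # SA < 1: line is self-absorbed
    print(f"I0 = {corrected_intensity(100.0, sa):.1f}")   # corrected peak intensity
```

In this scheme, SA = 1 means the line is optically thin; the smaller SA is, the stronger the absorption, and dividing the measured intensity by SA restores the linearity of the calibration curve.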