Abstract

Laser-induced breakdown spectroscopy (LIBS) is a technique that uses atomic emission for element identification and quantification. While the potential of the technology is vast, it still struggles with obstacles such as shot-to-shot signal variability. In recent years, several methods have exploited modifications to the standard implementation to work around this problem, mostly on the laser side, with the aim of increasing the signal-to-noise ratio of the emission. In this paper, we explore the effect of pulse duration on the detected signal intensity using a tunable LIBS system built around a versatile fiber laser capable of emitting square-shaped pulses with durations ranging from 10 to 100 ns. Our results show that, by tuning the pulse duration, it is possible to increase the signal-to-noise ratio of relevant elemental emission lines, an effect that we relate to the computed plasma temperature and the associated density of the ion species. Despite the limitations imposed by the low resolution and narrow spectral range of the spectrometer used, these preliminary results pave an interesting path towards the design of controllable LIBS systems that can be tailored to increase the signal-to-noise ratio, enabling more sensitive instruments for both qualitative and quantitative purposes.
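The two quantities the abstract relies on, the signal-to-noise ratio of an emission line and the plasma temperature inferred from line intensities (via a Boltzmann plot), can be illustrated with a minimal sketch. This is not the authors' analysis pipeline; the function names, window choices, and the assumption of optically thin lines in local thermodynamic equilibrium are illustrative assumptions.

```python
import numpy as np

def line_snr(wavelengths, intensities, line_center, line_halfwidth, noise_window):
    """Estimate SNR of one emission line: peak height above background
    divided by the standard deviation of a line-free noise window.
    (Illustrative definition; other SNR conventions exist.)"""
    in_line = np.abs(wavelengths - line_center) <= line_halfwidth
    in_noise = (wavelengths >= noise_window[0]) & (wavelengths <= noise_window[1])
    background = intensities[in_noise].mean()
    noise_std = intensities[in_noise].std()
    peak = intensities[in_line].max() - background
    return peak / noise_std

def boltzmann_temperature(upper_energies_eV, line_intensities, gA):
    """Boltzmann-plot temperature: for optically thin lines of one species
    in LTE, ln(I / gA) vs. upper-level energy is linear with
    slope = -1 / (k_B * T)."""
    k_B = 8.617333e-5  # Boltzmann constant in eV/K
    y = np.log(np.asarray(line_intensities, float) / np.asarray(gA, float))
    slope, _intercept = np.polyfit(np.asarray(upper_energies_eV, float), y, 1)
    return -1.0 / (k_B * slope)
```

For example, synthesizing lines from a known 10 000 K Boltzmann distribution and feeding them back through `boltzmann_temperature` recovers the input temperature, which is a useful sanity check before applying the method to measured spectra.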

