Abstract

The time control of signal acquisition in laser-induced breakdown spectroscopy (LIBS) is critical to the quantification of elements in samples. After the laser pulse interacts with the sample, the plasma expands and then cools rapidly over tens of microseconds, undergoing temperature variations of over 50,000 K. In the LIBS technique, signal acquisition is controlled by the delay time, which is the time between the emission of the laser pulse and the start of signal acquisition, and the gate width, which is the integration time of the spectrometer. Although the relationship of the delay time with the measured plasma temperature and electron density is well known in the literature, little is known about how these parameters vary with different gate widths. Thus, the objective of this work was to evaluate how the measured plasma parameters and the signal-to-noise ratio (SNR) of an atomic line change with different values of delay time and gate width. For this study, two sets of samples were prepared: i) pure NaCl and ii) NaCl with 20 wt% H₃BO₃, both containing Ca at concentrations of 1 and 3 wt%. TiO₂ and CuSO₄ were also added to these samples to facilitate the plasma temperature and electron density calculations, which were carried out using a Saha-Boltzmann plot corrected with the one-point calibration method. The results indicated that the measured plasma parameters did not vary with the gate width, even though the SNR of the Ca I line at 643.9 nm increased with it. Furthermore, the assumption of local thermodynamic equilibrium was verified even for long gate widths (≥ 20 μs), except at the 4 μs delay time. Studies with other samples and other combinations of LIBS system parameters, e.g., at sub-atmospheric pressure, should be conducted to confirm and expand these findings.
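
For readers unfamiliar with the plasma diagnostics mentioned above, the sketch below illustrates the general idea behind a Boltzmann-type plot: the plasma temperature is obtained from the slope of ln(Iλ/gA) versus the upper-level energy of several emission lines. This is only a minimal illustration, not the authors' code; the line data are hypothetical placeholders, and the Saha correction and one-point calibration used in the paper are not included.

```python
import numpy as np

# Illustrative sketch of a conventional Boltzmann plot (hypothetical data,
# not values from the paper). Temperature follows from the fitted slope,
# since ln(I*lambda/(g*A)) = -E_upper/(k_B*T) + constant.

K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K

# Per emission line: integrated intensity I, wavelength (nm), statistical
# weight g and transition probability A (s^-1) of the upper level, and
# upper-level energy E (eV). All values below are hypothetical.
intensity     = np.array([1.00, 0.45, 0.20, 0.08])
wavelength_nm = np.array([430.3, 442.5, 445.5, 452.7])
g_upper       = np.array([5, 7, 9, 11])
a_ki          = np.array([1.4e8, 5.0e7, 2.1e7, 8.7e6])
e_upper_ev    = np.array([2.88, 3.12, 3.35, 3.60])

# Boltzmann plot coordinates: y = ln(I*lambda/(g*A)), x = E_upper.
y = np.log(intensity * wavelength_nm / (g_upper * a_ki))
x = e_upper_ev

# Linear fit: slope = -1/(k_B*T), so T = -1/(k_B*slope).
slope, intercept = np.polyfit(x, y, 1)
temperature_k = -1.0 / (K_B_EV * slope)
print(f"Estimated plasma temperature: {temperature_k:.0f} K")
```

In the Saha-Boltzmann variant used in the paper, points from ionic lines are shifted along both axes by terms involving the ionization energy and the electron density, so temperature and electron density are typically obtained together in an iterative fit rather than from a single linear regression.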
