Abstract

The emission of seven self-absorbed Fe(I) spectral lines from a laser-induced plasma has been studied to investigate the influence of the optical depth on the line intensity. The plasma was generated with an infrared Q-switched Nd:YAG laser in air at atmospheric pressure. The plasma emission was detected with temporal resolution, using a delay of 5 μs from the laser pulse and a gate width of 1 μs. Experimental curves of growth (COGs) were obtained by measuring the line intensity for Fe–Ni alloy samples with Fe concentrations in the range 0.2–95%. Using a simple self-absorption model, based on a homogeneous plasma and the absence of matrix effects, theoretical COGs have been calculated that fit the experimental data with good correlation. The method allows the COG of a given spectral line to be predicted from its transition parameters (oscillator strength, energy levels and degeneracy of the lower level), the plasma temperature and the damping constant of the line. The plasma temperature (8200±100 K) was determined using the Boltzmann plot method. The existence of local thermodynamic equilibrium was verified by estimating the plasma electron density (2.6×10¹⁶ cm⁻³) from the Stark broadening of an emission line. The damping constant (0.9±0.2) was estimated by determining the Lorentzian line width from measured line profiles. The density of Fe atoms in the plasma for the sample with 100% Fe (7.3×10¹⁵ cm⁻³) was estimated using all the COGs of the lines studied. The experimental results indicate that matrix effects are not present in the ablation process of the Fe–Ni samples.
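As a purely illustrative note (not part of the original abstract), the Boltzmann plot method mentioned above is commonly based on a relation of the form

\[
\ln\!\left(\frac{I\,\lambda}{g_k\,A_{ki}}\right) = -\frac{E_k}{k_B T} + C ,
\]

where I is the integrated line intensity, λ the wavelength, g_k and E_k the degeneracy and energy of the upper level, A_{ki} the transition probability, k_B the Boltzmann constant, and C a constant common to all lines of the same species in the same ionization stage; the temperature T follows from the slope of a linear fit over several lines. The notation here is generic spectroscopic convention used for illustration under the assumption of local thermodynamic equilibrium, not symbols or values taken from this study.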
