Abstract

• The influence of various factors on the relative error of determining the H content of a sample by LIBS using the Saha-Boltzmann (SB) plot is investigated.
• For typical LIBS pulse parameters (10⁹ W/cm², 12 ns duration), the error can be quite large, about 70%.
• Even in the case of the largest error, the SB plot remains a straight line, making it difficult to anticipate the error from the experimental results.
• Increasing the pulse power tends to reduce the relative error.

Laser-induced breakdown spectroscopy (LIBS) is an in situ method for determining hydrogen (H) content in plasma-facing materials of tokamak fusion reactors. The sample composition is characterized by observing radiation from the plasma plume produced when a powerful laser pulse strikes the target. This is typically done with the Saha-Boltzmann (SB) plot technique under the assumption of local thermodynamic equilibrium (LTE). Despite the many experimental studies applying LIBS to determine H-isotope retention in fusion-reactor materials, the intrinsic accuracy of the method remains poorly understood. In this report, we use numerical calculations to estimate the relative error of determining the H content of a sample by LIBS. As an example, we consider LIBS applied to a W sample loaded with H in vacuum. For typical LIBS pulse parameters (10⁹ W/cm² and 12 ns duration), the error can be quite large, approximately 70%. We show that the error tends to decrease as the laser pulse intensity increases. The various factors contributing to the relative error are examined, and their dependence on the LIBS plasma parameters is discussed. The SB plot remains a straight line even when LTE is violated, making it difficult to anticipate the error from the experimental results.
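Under LTE, the SB plot technique reduces to a linear fit: neutral lines are plotted at ordinary Boltzmann-plot coordinates, while ionic lines receive a Saha correction to the ordinate and have the ionization energy added to the abscissa, so that all points fall on one line of slope −1/(k_B·T). The sketch below illustrates this with synthetic, noiseless line data (the intensities, wavelengths, gA values, and plasma parameters are illustrative assumptions, not taken from the paper):

```python
import numpy as np

# Illustrative constants; line data below are synthetic, not from the paper.
K_B = 8.617333e-5      # Boltzmann constant [eV/K]
SAHA_CONST = 6.04e21   # 2*(2*pi*m_e*k/h^2)^(3/2) in cm^-3 K^(-3/2)

def sb_point(E_k, intensity, wavelength, gA, ion_stage, T, n_e, E_ion):
    """Saha-Boltzmann coordinates (x, y) for one emission line.

    E_k, E_ion in eV; wavelength in nm; n_e in cm^-3; T in K.
    Neutral lines (ion_stage == 0) get plain Boltzmann-plot coordinates;
    singly ionized lines get the Saha correction, so under LTE all points
    lie on one straight line of slope -1/(K_B * T).
    """
    x, y = E_k, np.log(intensity * wavelength / gA)
    if ion_stage == 1:
        y -= np.log(SAHA_CONST * T**1.5 / n_e)   # subtract ln(Saha factor)
        x += E_ion
    return x, y

# Synthetic demonstration: build noiseless lines at a known temperature
# and recover it from the linear fit.
T_TRUE, N_E, E_ION = 12000.0, 1e17, 13.6   # K, cm^-3, eV (H-like example)
C = 30.0                                    # arbitrary plot intercept
saha = SAHA_CONST * T_TRUE**1.5 / N_E

lines = []  # (E_k, intensity, wavelength, gA, ion_stage)
for E_k in (2.0, 3.0, 4.0):                 # neutral lines
    I = 1e8 / 500.0 * np.exp(C - E_k / (K_B * T_TRUE))
    lines.append((E_k, I, 500.0, 1e8, 0))
for E_k in (1.0, 2.0):                      # singly ionized lines
    I = 1e8 / 400.0 * saha * np.exp(C - (E_k + E_ION) / (K_B * T_TRUE))
    lines.append((E_k, I, 400.0, 1e8, 1))

xs, ys = zip(*(sb_point(E, I, w, gA, s, T_TRUE, N_E, E_ION)
               for E, I, w, gA, s in lines))
slope, intercept = np.polyfit(xs, ys, 1)
T_fit = -1.0 / (K_B * slope)
print(f"fitted T = {T_fit:.0f} K (true {T_TRUE:.0f} K)")
```

With noiseless synthetic data the fit recovers the input temperature; the paper's point is precisely that real, non-LTE plasma data can still look this linear while hiding a large error in the inferred composition.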
