Abstract

Remote, on-line measurement of chromium on structural steel surfaces in nuclear power plants is critical for protection against flow-accelerated corrosion. To improve the insufficient sensitivity of fiber-optic laser-induced breakdown spectroscopy for trace-element detection, a dual-pulse spectral enhancement system is set up. The effects of key parameters are investigated with the aims of improving the sensitivity of trace chromium analysis in an iron matrix and reducing the self-absorption of iron. The optimal values are found to be an inter-pulse delay of 450 ns, a gate delay of 700 ns, a pulse energy ratio of 30 mJ/6 mJ, and a lens-to-sample distance of 19.8 mm (corresponding to a focused laser spot size of 799 μm). Compared with the single-pulse system, the number of dual-pulse ablation shots is limited to reduce surface damage. After optimization of the dual-pulse system, the signal-to-noise ratio of the trace chromium emission line is improved by a factor of 3.5 relative to the single-pulse system, and the self-absorption coefficient of the iron matrix is significantly reduced, with self-reversal eliminated. The number of detectable lines of trace elements more than doubles, increasing the input for spectral calibration without significantly increasing the ablated mass. Three calibration methods, namely internal standardization, partial least squares regression, and random forest regression, are employed to determine the chromium and manganese concentrations in standard low-alloy steel samples, and the limits of detection are calculated as 36 and 515 ppm, respectively. Leave-one-out cross-validation is used to evaluate the accuracy of the chromium quantification, and concentration mapping of chromium is performed on the surface of a 16MND5 steel sample with a relative error of 0.02 wt%.
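
As a minimal illustrative sketch (not the authors' implementation), the calibration and validation workflow summarized above could be expressed in Python with scikit-learn, using PLSRegression and RandomForestRegressor evaluated by leave-one-out cross-validation; the spectra X, reference concentrations y, and model parameters below are hypothetical placeholders.

```python
# Hypothetical sketch of calibrating Cr concentration from LIBS spectra with
# partial least squares and random forest regression, assessed by
# leave-one-out cross-validation. Data and hyperparameters are illustrative.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import LeaveOneOut, cross_val_predict

# X: emission spectra of the standard low-alloy steel samples
#    (n_samples x n_wavelengths); y: certified Cr concentrations in wt%.
# Random placeholder data stands in for the measured spectra.
rng = np.random.default_rng(0)
X = rng.random((10, 2048))
y = rng.random(10)

loo = LeaveOneOut()
models = [
    ("PLS", PLSRegression(n_components=5)),
    ("RF", RandomForestRegressor(n_estimators=200, random_state=0)),
]
for name, model in models:
    # Predict each sample's concentration from a model trained on the others.
    y_pred = cross_val_predict(model, X, y, cv=loo).ravel()
    rmse = np.sqrt(np.mean((y_pred - y) ** 2))
    print(f"{name}: LOO-CV RMSE = {rmse:.3f} wt%")
```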
