Abstract

A laboratory-scale laser-induced thermal desorption spectroscopy system is developed and tested on tungsten-deuterium and titanium-deuterium codeposits, and its feasibility as a hydrogenic inventory measurement diagnostic is demonstrated over a range of retention values from 5 × 10¹⁹ m⁻² to 7 × 10²³ m⁻² for absorbed laser power densities as low as 8 MW m⁻². Codeposit layer samples are grown by magnetron sputtering and immersed in a weak argon rf plasma. A 1 kW fiber laser (λ = 1100 nm) heats the samples to peak surface temperatures ranging from 900 to 1500 K using pulse widths of 0.5 and 1 s. Balmer-series Dα and Hα line emission from thermally desorbed deuterium and hydrogen, as well as argon line emission, is monitored as a function of time using an optical spectrometer with a maximum temporal resolution of 1 ms. To correct for wall recycling and pumping speed, and to accurately measure the time evolution of the laser-induced thermal desorption, the raw Dα signal is deconvolved with the system response function, which is obtained by injecting a short burst of D₂ to approximate an impulse. Calibration is performed with a standard D₂ leak, and deuterium retention values obtained from laser-induced desorption spectroscopy are found to be in good agreement with companion measurements made using conventional temperature programmed desorption on samples from the same codeposit batch.
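
The deconvolution of the raw Dα trace with the measured system response can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it assumes both traces are sampled on the same uniform time grid, uses a Wiener-style regularized FFT division (the abstract does not state which deconvolution algorithm is used), and all names, the regularization constant, and the calibration step are illustrative assumptions.

```python
import numpy as np

def deconvolve_dalpha(raw_dalpha, response, eps=1e-3):
    """Recover the desorption time trace from the raw D-alpha signal.

    raw_dalpha : measured D-alpha intensity vs. time (uniform sampling)
    response   : system response to an approximate impulse (short D2 burst),
                 sampled on the same time grid and normalized to unit sum
    eps        : regularization strength to keep noise from being amplified
    """
    n = len(raw_dalpha) + len(response) - 1           # zero-pad to avoid circular wrap-around
    S = np.fft.rfft(raw_dalpha, n)
    H = np.fft.rfft(response, n)
    # Wiener-style regularized spectral division
    G = S * np.conj(H) / (np.abs(H) ** 2 + eps * np.max(np.abs(H)) ** 2)
    return np.fft.irfft(G, n)[: len(raw_dalpha)]

# Illustrative use with synthetic traces (1 ms frames, matching the spectrometer resolution):
dt = 1e-3
t = np.arange(0.0, 5.0, dt)
true_trace = np.exp(-((t - 1.0) / 0.2) ** 2)          # hypothetical desorption pulse
response = np.exp(-t / 0.8)                           # hypothetical wall-recycling/pumping response
response /= response.sum()
raw = np.convolve(true_trace, response)[: len(t)]     # what the spectrometer would record
recovered = deconvolve_dalpha(raw, response)
# A retention value would then follow from the calibrated, time-integrated trace, e.g.
# retention = calibration_factor * recovered.sum() * dt   (calibration_factor from the D2 leak)
```
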
