Abstract

With advancements in machine learning technologies, artificial neural networks (ANNs) are widely used to improve the performance of gamma-ray spectroscopy based on NaI(Tl) scintillation detectors. Typically, ANN performance is evaluated using test datasets composed of actual measured spectra. However, generating test datasets that encompass a wide range of actual spectra representing various scenarios is often inefficient and time-consuming. Thus, instead of measuring actual spectra, we generated virtual spectra with diverse spectral features by sampling from categorical distribution functions derived from the base spectra of six radioactive isotopes: 54Mn, 57Co, 60Co, 134Cs, 137Cs, and 241Am. For practical applications, we determined the optimum counting time (OCT) as the point at which the change in the Kullback–Leibler divergence (ΔKLDV) between the synthetic spectra used for training the ANN and the virtual spectra approaches zero. The identification accuracies for actual spectra improved significantly when the spectra were measured up to their respective OCTs. These outcomes demonstrate that the proposed method can effectively determine OCTs for ANN-based gamma-ray spectroscopy without the need to measure actual spectra.
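The core idea of the abstract can be illustrated with a minimal sketch: a virtual spectrum is drawn by categorical sampling over the channels of a normalized base spectrum, and the KL divergence between a reference spectrum and virtual spectra of increasing counting time is tracked until its change falls below a threshold. The function names, the count-rate parameterization, and the tolerance are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def sample_virtual_spectrum(base_spectrum, n_counts, rng=None):
    """Draw a virtual spectrum by categorical sampling from a base spectrum.

    Each of the n_counts events selects a channel with probability
    proportional to that channel's content in the base spectrum.
    """
    rng = np.random.default_rng() if rng is None else rng
    p = np.asarray(base_spectrum, dtype=float)
    p = p / p.sum()  # categorical distribution over channels
    # A multinomial draw of n_counts events is equivalent to n_counts
    # independent categorical draws, histogrammed over the channels.
    return rng.multinomial(n_counts, p)

def kl_divergence(p_counts, q_counts, eps=1e-12):
    """KL divergence D(P||Q) between two spectra given as count arrays."""
    p = np.asarray(p_counts, dtype=float) + eps
    q = np.asarray(q_counts, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

def optimum_counting_time(base_spectrum, reference_spectrum,
                          count_rate, times, tol=1e-4, rng=None):
    """Return the first time at which the change in KLD falls below tol.

    count_rate (counts/s) and tol are hypothetical parameters used only
    to make the stopping criterion concrete.
    """
    prev_kld = None
    for t in times:
        virtual = sample_virtual_spectrum(base_spectrum,
                                          int(count_rate * t), rng)
        kld = kl_divergence(reference_spectrum, virtual)
        if prev_kld is not None and abs(kld - prev_kld) < tol:
            return t  # change in KLD has effectively reached zero
        prev_kld = kld
    return times[-1]
```

In practice, longer counting times yield virtual spectra whose shape converges to the base distribution, so the KLD (and its change between successive times) shrinks toward zero; the OCT is simply the earliest time at which further counting no longer changes the divergence appreciably.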
