Abstract

Within the scope of determining the uranium concentration in ore samples by gamma-ray spectrometry, we tested a series of machine-learning (ML) algorithms on a database of 1288 HPGe gamma spectra measured by Orano Mining. Instead of detecting and identifying individual peaks, the spectra are interpreted globally. Two different approaches were used. First, we reduced the size and dimensionality of the dataset by selecting 728 spectra acquired with the same experimental setup and by resampling their 8192 channels into 168 energy bands aligned with the main peaks of natural uranium, thorium, and potassium activity. Classical ML algorithms were trained on this reduced dataset, and the best uranium concentration predictions show a Symmetric Mean Absolute Percentage Error (SMAPE) lower than 6%. In a second step, the complete dataset of 1288 gamma spectra, covering six different measurement setups, was used to train deep neural networks after resampling the spectra into 2048 channels. Despite the small dataset, a Convolutional Neural Network (CNN) model provides satisfactory results, with mean errors lower than 15% on this broader dataset, which is more complex in terms of uranium concentrations and experimental setups. These outcomes show that ML methods can predict uranium concentration with uncertainties similar to those of classical gamma-ray spectroscopy (10% to 20%), but without requiring expert knowledge to interpret the spectra.
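Two ingredients of the workflow above are simple to sketch: rebinning a high-resolution spectrum into coarser energy bands, and scoring predictions with SMAPE. The snippet below is a minimal illustration, not the authors' implementation; the band edges are equal-width placeholders, whereas the paper uses 168 bands chosen around the natural uranium, thorium, and potassium peaks.

```python
import numpy as np

def rebin(spectrum, band_edges):
    """Sum the counts of a channel spectrum into contiguous energy bands."""
    return np.array([spectrum[lo:hi].sum()
                     for lo, hi in zip(band_edges[:-1], band_edges[1:])])

def smape(y_true, y_pred):
    """Symmetric Mean Absolute Percentage Error, in percent."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return 100.0 * np.mean(2.0 * np.abs(y_pred - y_true)
                           / (np.abs(y_true) + np.abs(y_pred)))

# Illustrative 8192-channel spectrum (Poisson counts) rebinned into
# 168 equal-width bands; the real bands follow known gamma lines.
spectrum = np.random.default_rng(0).poisson(5.0, size=8192)
edges = np.linspace(0, 8192, 169).astype(int)
bands = rebin(spectrum, edges)
assert bands.shape == (168,)          # one value per energy band
assert bands.sum() == spectrum.sum()  # rebinning conserves total counts
```

With this definition, a prediction of 110 against a true value of 100 gives a SMAPE of about 9.5%, so the reported sub-6% figure corresponds to predictions typically within a few percent of the measured concentration.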
