Abstract

Gamma radiation has been classified by the International Agency for Research on Cancer (IARC) as a carcinogenic agent with sufficient evidence in humans. Previous studies show that some weather data are cross-correlated with gamma exposure rates; hence, we hypothesize that the gamma exposure rate could be predicted from certain weather data. In this study, we collected weather and radiation data from an automatic weather system (AWS) and an environmental radiation monitoring system (ERMS) over a specific period, and we trained and tested two time-series learning algorithms—long short-term memory (LSTM) and the light gradient boosting machine (LightGBM)—with two preprocessing methods, standardization and normalization. The experimental results show that standardization is superior to normalization for data preprocessing, yielding smaller deviations, and that LightGBM outperforms LSTM in both prediction accuracy and running time. The prediction capability of LightGBM makes it possible to determine, in environmental radiation monitoring, whether an increase in the gamma exposure rate is caused by a change in the weather or by an actual gamma-ray source.
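The two preprocessing methods compared above can be sketched as follows. This is a minimal illustration, not the paper's pipeline: the function names and the toy exposure-rate values are our own, chosen only to show how standardization (z-score) and min-max normalization rescale the same series differently.

```python
import statistics

def standardize(values):
    """Standardization (z-score): subtract the mean, divide by the
    population standard deviation, so the result has mean 0 and unit
    variance."""
    mean = statistics.mean(values)
    std = statistics.pstdev(values)
    return [(v - mean) / std for v in values]

def normalize(values):
    """Min-max normalization: linearly rescale the series into [0, 1],
    mapping the minimum to 0 and the maximum to 1."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

# Toy hourly gamma exposure rates (hypothetical values, arbitrary units);
# the spike at 150.0 stands in for a weather-driven or source-driven rise.
rates = [120.0, 118.5, 121.2, 150.0, 119.3]

z = standardize(rates)  # centered at 0 with unit variance
n = normalize(rates)    # squeezed into the [0, 1] range
```

A spike dominates the min-max range, compressing the ordinary readings toward 0, whereas standardization expresses every reading in standard deviations from the mean; this difference in how outliers are handled is one reason the two schemes can lead a learner to different results.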

