Net radiation (Rn), a critical component of the land surface energy balance, is the difference between net shortwave radiation and net longwave radiation at the Earth's surface, and it plays an important role in crop models for precision agriculture management. In this study, we examined the performance of four machine learning models, namely the extreme learning machine (ELM), a hybrid artificial neural network optimized by a genetic algorithm (GANN), the generalized regression neural network (GRNN), and random forests (RF), in estimating daily Rn at four representative sites across different climatic zones of China. The input variables were common meteorological factors: minimum and maximum temperature, relative humidity, sunshine duration, and shortwave solar radiation. Model performance was assessed and compared using the coefficient of determination (R2), root mean square error (RMSE), mean absolute error (MAE), and Nash–Sutcliffe coefficient (NS). The results indicated that all models slightly underestimated observed Rn, with linear regression slopes ranging from 0.810 to 0.870 across the zones, although the estimated Rn reproduced the distribution characteristics of the observations. Among the models, ELM and GANN agreed most closely with the observed values, with R2 values of 0.838–0.963 and 0.836–0.963, respectively, across the climatic zones, exceeding those of RF (0.809–0.959) and GRNN (0.812–0.949). ELM and GANN also yielded smaller simulation errors (lower RMSE and MAE) and better NS values than RF and GRNN across the four climatic zones. Overall, the ELM and GANN models outperformed the RF and GRNN models, and the ELM model's faster computational speed makes it the recommended choice for Rn estimation across the different climatic zones of China.
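To make the evaluation setup concrete, the sketch below shows how a comparable daily Rn estimation experiment could be built and scored with the same four statistics; it is illustrative only, not the authors' implementation. It uses a scikit-learn random forest as a stand-in (ELM, GANN, and GRNN are not part of that library), and the file and column names are hypothetical.

```python
# Illustrative sketch (not the authors' code): estimate daily Rn from common
# meteorological inputs with a random forest, then score with R2, RMSE, MAE,
# and the Nash–Sutcliffe coefficient (NS). File and column names are hypothetical.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_error

# Hypothetical daily records: Tmin, Tmax, relative humidity, sunshine duration,
# shortwave solar radiation, and observed net radiation.
df = pd.read_csv("daily_meteo.csv")  # placeholder file name
features = ["tmin", "tmax", "rh", "sunshine_hours", "rs"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["rn_obs"], test_size=0.2, random_state=42
)

model = RandomForestRegressor(n_estimators=500, random_state=42)
model.fit(X_train, y_train)
y_pred = model.predict(X_test)

def nash_sutcliffe(obs, sim):
    """NS = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    obs, sim = np.asarray(obs), np.asarray(sim)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

print("R2  :", r2_score(y_test, y_pred))
print("RMSE:", np.sqrt(mean_squared_error(y_test, y_pred)))
print("MAE :", mean_absolute_error(y_test, y_pred))
print("NS  :", nash_sutcliffe(y_test, y_pred))
```

The same scoring functions would apply unchanged to predictions from ELM, GANN, or GRNN models, which is what allows the per-zone metric ranges reported above to be compared directly across the four methods.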