Abstract

The accuracy of temperature and relative humidity (RH) profiles retrieved by ground-based microwave radiometers (MWRs) is crucial for meteorological research. In this study, four years (2018–2021) of brightness temperature measurements from the MWR at Huangpu meteorological station in Guangzhou, China, are compared against radiosonde data from Qingyuan meteorological station (70 km northwest of Huangpu station). To compare in detail the performance of machine learning models in retrieving temperature and RH profiles, four machine learning algorithms, namely Deep Learning (DL), Gradient Boosting Machine (GBM), Extreme Gradient Boosting (XGBoost) and Random Forest (RF), are employed and verified. The results show that the DL model performs best in temperature retrieval (with a root-mean-square error (RMSE) and correlation coefficient of 2.36 and 0.98, respectively), while for RH the four machine learning methods each excel at different altitude levels. An integrated machine learning (ML) RH method is therefore proposed, in which the method with the minimum RMSE among DL, GBM, XGBoost and RF is selected at each altitude level. Two cases, on 29 January 2021 and 10 February 2021, are used for illustration. The case on 29 January 2021 shows that the DL model is suitable for temperature retrieval and the ML method for RH retrieval in Guangzhou. The case on 10 February 2021 shows that the RH retrieved by the ML method reaches over 85% before precipitation, suggesting the applicability of the ML RH method to pre-precipitation warnings.
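The integrated ML RH selection rule described above (pick, at each altitude level, whichever of the four models has the lowest RMSE against the radiosonde) can be sketched as follows. This is a minimal illustration with synthetic data; the array names, shapes, and error magnitudes are assumptions, not values from the paper.

```python
import numpy as np

# Synthetic stand-ins for radiosonde "truth" and the four models'
# RH retrievals: shape (n_samples, n_levels). Illustrative only.
rng = np.random.default_rng(0)
n_samples, n_levels = 200, 10
truth = np.clip(rng.normal(60, 20, (n_samples, n_levels)), 0, 100)
preds = {
    "DL":      truth + rng.normal(0, 8,  (n_samples, n_levels)),
    "GBM":     truth + rng.normal(0, 10, (n_samples, n_levels)),
    "XGBoost": truth + rng.normal(0, 9,  (n_samples, n_levels)),
    "RF":      truth + rng.normal(0, 11, (n_samples, n_levels)),
}

def integrated_rh(preds, truth):
    """At each altitude level, keep the model with the lowest RMSE."""
    names = list(preds)
    # RMSE per model per level: shape (n_models, n_levels)
    rmse = np.array([np.sqrt(np.mean((p - truth) ** 2, axis=0))
                     for p in preds.values()])
    best = rmse.argmin(axis=0)  # index of the best model at each level
    combined = np.stack([preds[names[best[k]]][:, k]
                         for k in range(truth.shape[1])], axis=1)
    return combined, [names[i] for i in best]

combined, chosen = integrated_rh(preds, truth)
```

Because the selection minimizes RMSE level by level, the combined profile's overall RMSE can never exceed that of any single model on the data used for selection.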
