Abstract

Commercial microwave links (CMLs) have proven useful for providing rainfall information close to the ground surface. However, large uncertainties are associated with these retrievals, partly due to challenges in how the data are collected and processed. In particular, the most common case is when only minimum and maximum received signal levels (RSLs) over a given time interval (hereafter 15 min) are stored by mobile network operators. The average attenuation and the corresponding rainfall rate are then calculated with a weighted average method applied to the minimum and maximum attenuation. In this study, an alternative to using a constant weighted average is explored, based on a machine learning model trained to produce actual attenuation from minimum/maximum values. A rainfall retrieval deep learning model was designed based on a long short-term memory (LSTM) architecture and trained with disdrometer data in a form that is comparable to the data provided by mobile network operators. A first evaluation used only disdrometer data to mimic both attenuation from a CML and the corresponding rainfall rates. For the test data set, the relative bias was reduced from 5.99% to 2.84% and the coefficient of determination (R2) increased from 0.86 to 0.97. The second evaluation used this disdrometer-trained LSTM to retrieve rainfall rates from an actual CML located near the disdrometer. A significant improvement in overall rainfall estimation compared to existing microwave link attenuation models was observed: the relative bias was reduced from 7.39% to −1.14% and the R2 improved from 0.71 to 0.82.

