Abstract

To improve the accuracy of quantitative precipitation estimation (QPE), numerous models have been developed to merge satellite and gauge precipitation. However, most established merging methods consider the spatial or temporal correlation between satellite data and rain gauge data separately, and the resulting merged precipitation is still limited by low spatial resolution and regional inaccuracy. In this study, a deep fusion model is proposed to merge TRMM 3B42 V7 satellite data, rain gauge data, and thermal infrared images by exploiting their spatial and temporal correlations simultaneously. Specifically, a convolutional neural network (CNN) is combined with a long short-term memory (LSTM) network: the spatial characteristics of the satellite, rain gauge, and thermal infrared data are extracted by the CNN, while their temporal dependence is captured by the LSTM. Experimental results on 796 rain gauges in China show that the proposed CNN-LSTM model outperforms the comparative models (CNN, LSTM, and Multi-Layer Perceptron). It improves the accuracy of the original TRMM data over China, reducing the root mean square error (RMSE) and mean absolute error (MAE) by 17.0% and 14.7%, respectively, and increasing the correlation coefficient to 0.72, even across different precipitation intensities and gauge-sparse regions. Finally, a merged daily precipitation dataset covering 2001 to 2005 at a higher resolution of 0.05° and with higher accuracy over China is produced. This study provides a useful tool and a valuable dataset for QPE in China, which would benefit water research and water resources management.
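The following is a minimal sketch of how a CNN-LSTM fusion of this kind could be wired together, assuming a PyTorch implementation; the layer sizes, channel count (satellite, thermal infrared, gridded gauge inputs), patch size, and sequence length are illustrative placeholders, not the authors' published configuration.

```python
# Minimal illustrative CNN-LSTM fusion sketch (assumed PyTorch implementation;
# all hyperparameters are placeholders, not the paper's configuration).
import torch
import torch.nn as nn


class CNNLSTMFusion(nn.Module):
    def __init__(self, in_channels=3, cnn_features=64, lstm_hidden=128):
        super().__init__()
        # CNN encoder: extracts spatial features from each time step's stacked
        # inputs (e.g. TRMM 3B42, thermal infrared, gridded gauge observations).
        self.cnn = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, cnn_features, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # collapse spatial dims to one feature vector
        )
        # LSTM: captures temporal dependence across the sequence of CNN features.
        self.lstm = nn.LSTM(cnn_features, lstm_hidden, batch_first=True)
        # Regression head: maps the last hidden state to a precipitation estimate.
        self.head = nn.Linear(lstm_hidden, 1)

    def forward(self, x):
        # x: (batch, time, channels, height, width)
        b, t, c, h, w = x.shape
        feats = self.cnn(x.view(b * t, c, h, w)).view(b, t, -1)
        out, _ = self.lstm(feats)
        return self.head(out[:, -1])  # estimate for the final time step


if __name__ == "__main__":
    # Toy forward pass: 4 samples, 5 daily time steps, 3 input channels
    # on a hypothetical 16x16 patch around a gauge location.
    model = CNNLSTMFusion()
    dummy = torch.randn(4, 5, 3, 16, 16)
    print(model(dummy).shape)  # torch.Size([4, 1])
```

In such a design the CNN handles the spatial correlation within each daily snapshot, while the LSTM handles the temporal correlation across days, which mirrors the division of labor described in the abstract.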
