Abstract

In this paper, we present a state-of-the-art precipitation estimation framework that leverages advances in satellite remote sensing and Deep Learning (DL). The framework takes advantage of the improved spatial, spectral, and temporal resolution of the Advanced Baseline Imager (ABI) onboard the GOES-16 platform, along with elevation information, to improve precipitation estimates. The procedure first derives a Rain/No Rain (R/NR) binary mask by classifying pixels and then applies regression to estimate the rainfall amount for rainy pixels. A Fully Convolutional Network is used as the regressor to produce precipitation estimates. The network is trained with a combination of the non-saturating conditional Generative Adversarial Network (cGAN) and Mean Squared Error (MSE) loss terms so that it better learns the complex distribution of precipitation in the observed data. Common verification metrics, including Probability Of Detection (POD), False Alarm Ratio (FAR), Critical Success Index (CSI), Bias, Correlation, and MSE, are used to evaluate the accuracy of both the R/NR classification and the real-valued precipitation estimates. Statistics and visualizations of these evaluation measures show that the proposed framework improves precipitation retrieval accuracy over baseline models trained with a conventional MSE loss. The framework is proposed as an augmentation to the PERSIANN-CCS (Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks - Cloud Classification System) algorithm for estimating global precipitation.
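For reference, the categorical verification metrics named above (POD, FAR, CSI) follow the standard definitions from a 2x2 contingency table comparing the estimated R/NR mask against observations. The sketch below shows one way they could be computed; the variable names and the NumPy implementation are illustrative and not taken from the paper's code.

```python
import numpy as np

def categorical_scores(pred_rain, obs_rain):
    """Compute POD, FAR and CSI from binary Rain/No-Rain masks.

    pred_rain, obs_rain: boolean arrays of the same shape,
    True where a pixel is classified/observed as rainy.
    """
    hits = np.sum(pred_rain & obs_rain)           # rain predicted and observed
    misses = np.sum(~pred_rain & obs_rain)        # rain observed but not predicted
    false_alarms = np.sum(pred_rain & ~obs_rain)  # rain predicted but not observed

    pod = hits / (hits + misses)                  # Probability Of Detection
    far = false_alarms / (hits + false_alarms)    # False Alarm Ratio
    csi = hits / (hits + misses + false_alarms)   # Critical Success Index
    return pod, far, csi
```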

Highlights

  • Near-real-time satellite-based precipitation estimation is of great importance for hydrological and meteorological applications due to its high spatiotemporal resolution and global coverage

  • This study explores the application of the conditional Generative Adversarial Network (cGAN), a type of generative neural network, to estimate precipitation from multiple input sources, including multispectral geostationary satellite imagery

  • The objectives of this study are to report on: (1) the application of Convolutional Neural Networks (CNNs), rather than fully connected networks, to extract useful features from GEO satellite imagery and better capture the spatial and temporal dependencies in the images; (2) the advantage of a more sophisticated loss function in capturing the complex structure of precipitation (see the sketch after this list); (3) the performance of the proposed algorithm under different scenarios of multiple channel combinations and elevation data as input; and (4) the effectiveness of the proposed algorithm compared with PERSIANN-CCS, an operational product, and a baseline model trained with a conventional loss function
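As a point of reference for objective (2), the sketch below shows one way the non-saturating cGAN term described in the abstract can be combined with an MSE term to form the generator (regressor) loss. It is a minimal PyTorch-style illustration; the conditioning setup, the discriminator interface, and the weighting factor `lam` are assumptions, since the paper's exact configuration is not given here.

```python
import torch
import torch.nn.functional as F

def generator_loss(discriminator, sat_input, fake_precip, true_precip, lam=100.0):
    """Non-saturating conditional GAN loss plus an MSE term for the generator.

    discriminator: maps (conditioning input, precipitation field) -> logits.
    sat_input:     satellite/elevation channels used as the condition.
    fake_precip:   precipitation field produced by the generator (FCN).
    true_precip:   observed precipitation field.
    lam:           weight of the MSE term (assumed value, not from the paper).
    """
    logits_fake = discriminator(sat_input, fake_precip)
    # Non-saturating form: maximize log D(x, G(x)) instead of minimizing log(1 - D(x, G(x))).
    adv_loss = F.binary_cross_entropy_with_logits(
        logits_fake, torch.ones_like(logits_fake))
    mse_loss = F.mse_loss(fake_precip, true_precip)
    return adv_loss + lam * mse_loss
```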

Introduction

Near-real-time satellite-based precipitation estimation is of great importance for hydrological and meteorological applications due to its high spatiotemporal resolution and global coverage. The accuracy of precipitation estimates can likely be enhanced by exploiting recent developments in sensing technology and data with higher temporal, spatial, and spectral resolution. Another important factor in characterizing these natural phenomena and their future behavior more efficiently and accurately is the use of proper methodologies to extract applicable information and exploit it in the precipitation estimation task [1]. The combination of multiple channels of data has been shown to be valuable for cloud detection and for improving precipitation estimation [6,7,8,9]. Another popular source of satellite-based information is passive microwave (PMW) imagery from sensors onboard Low-Earth-Orbiting (LEO) satellites. Data from geostationary (GEO) satellites, however, are a unique means of providing cloud-rain information continuously over space and time for weather forecasting and precipitation nowcasting.
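To make the multi-channel idea concrete, the following sketch stacks a few geostationary infrared channels with a digital elevation model into a single multi-channel input array for a convolutional network. The channel choice, array shapes, and variable names are purely illustrative assumptions, not the configuration used in this paper.

```python
import numpy as np

# Hypothetical co-registered fields on the same grid (random placeholder values).
H, W = 256, 256
ir_longwave = np.random.rand(H, W).astype(np.float32)   # e.g. a longwave IR brightness-temperature channel
ir_watervapor = np.random.rand(H, W).astype(np.float32)  # e.g. a water-vapor channel
elevation = np.random.rand(H, W).astype(np.float32)      # digital elevation model

def normalize(x):
    """Scale a field to zero mean and unit variance before feeding the CNN."""
    return (x - x.mean()) / (x.std() + 1e-6)

# Stack channels along a leading axis -> shape (channels, height, width),
# the layout expected by most CNN frameworks.
model_input = np.stack(
    [normalize(ir_longwave), normalize(ir_watervapor), normalize(elevation)], axis=0)
print(model_input.shape)  # (3, 256, 256)
```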
