Abstract

Floods are sudden, destructive natural disasters, and synthetic aperture radar (SAR) can image the Earth's surface almost independently of time of day and weather, making it particularly suitable for the timely extraction of flood extents. Platforms such as Google Earth Engine (GEE) provide large volumes of preprocessed SAR data, offering powerful support for near real-time flood monitoring and time-series analysis. However, current research has lacked applications that combine long time-series data with recurrent neural networks (RNNs) to monitor floods, and the accuracy of flood extraction over open water surfaces remains unsatisfactory. In this study, we propose a new method for near real-time flood monitoring with higher accuracy. The method uses a SAR image time series to train a gated recurrent unit (GRU) neural network model that predicts normal, flood-free surface conditions. Flood extraction is then achieved by comparing the observed surface conditions during a flood with the predicted conditions, using a parameter called Scores. Our method demonstrated significant improvements in accuracy over existing algorithms such as the OTSU algorithm, the Sentinel-1 Dual-Polarized Water Index (SDWI) algorithm, and the Z-score algorithm. Its overall accuracy was 99.20%, outperforming the Copernicus Emergency Management Service (EMS) map. Importantly, the method exhibited high stability because it tolerates fluctuation within the normal range, enabling extraction of the complete flood extent, especially over open water surfaces. This stability makes the method suitable for flood monitoring with future open-access SAR data, including data from upcoming Sentinel-1 missions.
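The core idea described above can be sketched in code: a GRU cell steps through a per-pixel SAR backscatter time series and maintains a hidden state from which the next flood-free value would be predicted. The sketch below is a minimal NumPy implementation of a single GRU cell, not the authors' model; the hidden size, random weights, and toy VV backscatter values are all illustrative assumptions.

```python
import numpy as np

def gru_step(x, h, W, U, b):
    """One GRU step. x: input (d,), h: hidden state (n,);
    W, U, b hold stacked parameters for the update, reset,
    and candidate gates."""
    Wz, Wr, Wn = W              # input weights, each (n, d)
    Uz, Ur, Un = U              # recurrent weights, each (n, n)
    bz, br, bn = b              # biases, each (n,)
    sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))
    z = sigmoid(Wz @ x + Uz @ h + bz)            # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)            # reset gate
    cand = np.tanh(Wn @ x + Un @ (r * h) + bn)   # candidate state
    return (1 - z) * cand + z * h                # new hidden state

rng = np.random.default_rng(0)
d, n = 1, 8                     # one backscatter value per date, 8 hidden units (assumed)
W = rng.normal(scale=0.1, size=(3, n, d))
U = rng.normal(scale=0.1, size=(3, n, n))
b = np.zeros((3, n))

# Toy flood-free VV backscatter series in dB (hypothetical values).
series = np.array([-15.2, -14.8, -15.0, -15.3, -14.9])
h = np.zeros(n)
for x in series:
    h = gru_step(np.array([x]), h, W, U, b)

# In the full model, a trained output layer would map h to the predicted
# next backscatter value; a large observed-vs-predicted deviation
# (quantified by the Scores parameter in the paper) flags flooding.
```

In practice the weights would be learned from archived flood-free Sentinel-1 acquisitions, and the comparison between observed and predicted backscatter would be run per pixel at each new acquisition.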
