Abstract

Floods are among the most devastating natural disasters worldwide. They are often accompanied by heavy precipitation and other adverse weather, which makes affected areas difficult to identify with optical sensors. Synthetic aperture radar (SAR), in contrast, can image around the clock and penetrate clouds and fog. Change detection (CD) based on SAR images is therefore widely used to locate disaster-stricken areas by analyzing the differences between pre- and post-disaster images. However, this approach faces two main challenges: speckle noise, which reduces the accuracy of difference detection, and the lack of a suitable SAR data set for flood-disaster CD. This study therefore proposes a novel two-stage approach for locating flood disaster areas, the denoising-change detection approach (D-CDA). The first stage is a nine-layer denoising network with an encoder-decoder structure, the SAR denoising network (SDNet), which uses a multiresidual block and a parallel convolutional block attention module to extract features during encoding and suppress the noise component. In the second stage, a novel convolutional neural network, the coordinate attention fused network, detects changes between bitemporal SAR images; it combines a Siamese network with UNet++ as the backbone and fuses in coordinate attention modules to enhance the change features. In addition, a CD data set (the Zhengzhou flood data set) was constructed from Sentinel-1 SAR images of the 2021 flood disaster in Zhengzhou, China. Experiments verify the effectiveness of the proposed method: D-CDA achieves favorable detection performance in locating flood disaster areas.
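
For readers who want a concrete picture of the two-stage design, the following is a minimal sketch, assuming PyTorch; it is not the authors' implementation. `ToyDenoiser` and `ToyChangeDetector` are illustrative stand-ins for SDNet and the coordinate attention fused network (the multiresidual blocks, parallel CBAM, and the UNet++ decoder are omitted), and all layer counts and channel widths are placeholders. Only the coordinate attention module follows the published design (Hou et al., 2021), in simplified form.

```python
# Minimal sketch of a D-CDA-style two-stage pipeline (illustrative only).
import torch
import torch.nn as nn

class ToyDenoiser(nn.Module):
    """Stand-in for SDNet: a tiny encoder-decoder that predicts a denoised image."""
    def __init__(self, ch=32):
        super().__init__()
        self.enc = nn.Sequential(nn.Conv2d(1, ch, 3, padding=1), nn.ReLU(),
                                 nn.Conv2d(ch, ch, 3, padding=1), nn.ReLU())
        self.dec = nn.Conv2d(ch, 1, 3, padding=1)
    def forward(self, x):
        return self.dec(self.enc(x))  # residual connections and CBAM omitted

class CoordAttention(nn.Module):
    """Simplified coordinate attention: pool along H and along W, build
    direction-aware channel weights, and rescale the feature map."""
    def __init__(self, ch, reduction=8):
        super().__init__()
        mid = max(ch // reduction, 4)
        self.shared = nn.Sequential(nn.Conv2d(ch, mid, 1), nn.ReLU())
        self.to_h = nn.Conv2d(mid, ch, 1)
        self.to_w = nn.Conv2d(mid, ch, 1)
    def forward(self, x):
        n, c, h, w = x.shape
        ph = x.mean(dim=3, keepdim=True)                      # N,C,H,1
        pw = x.mean(dim=2, keepdim=True).permute(0, 1, 3, 2)  # N,C,W,1
        y = self.shared(torch.cat([ph, pw], dim=2))           # N,mid,H+W,1
        yh, yw = torch.split(y, [h, w], dim=2)
        ah = torch.sigmoid(self.to_h(yh))                     # N,C,H,1
        aw = torch.sigmoid(self.to_w(yw)).permute(0, 1, 3, 2) # N,C,1,W
        return x * ah * aw

class ToyChangeDetector(nn.Module):
    """Stand-in for the CD stage: one weight-shared (Siamese) encoder applied
    to both dates, coordinate attention on the fused features, 2-class map."""
    def __init__(self, ch=32):
        super().__init__()
        self.encoder = nn.Sequential(nn.Conv2d(1, ch, 3, padding=1), nn.ReLU())
        self.attn = CoordAttention(2 * ch)
        self.head = nn.Conv2d(2 * ch, 2, 1)  # change / no-change logits
    def forward(self, pre, post):
        f = torch.cat([self.encoder(pre), self.encoder(post)], dim=1)
        return self.head(self.attn(f))

# Usage: denoise each acquisition, then detect changes between the cleaned images.
denoiser, detector = ToyDenoiser(), ToyChangeDetector()
pre, post = torch.rand(1, 1, 64, 64), torch.rand(1, 1, 64, 64)
change_logits = detector(denoiser(pre), denoiser(post))
print(change_logits.shape)  # torch.Size([1, 2, 64, 64])
```

The sketch preserves the two structural points the abstract emphasizes: speckle suppression happens before change detection, and the two dates pass through one weight-shared encoder so that their features are directly comparable.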
