Abstract

Multi-temporal interferometric synthetic aperture radar (InSAR) is an effective tool for measuring large-scale land subsidence. However, the measurement points generated by InSAR are too numerous to analyze manually, and automatic subsidence detection and classification methods are still lacking. In this study, we developed an oriented R-CNN deep learning network to automatically detect and classify subsidence bowls using InSAR measurements and multi-source ancillary data. We used 541 Sentinel-1 images acquired during 2015–2021 to map land subsidence of the Guangdong-Hong Kong-Macao Greater Bay Area by resolving persistent and distributed scatterers. Multi-source data related to land subsidence, including geological and lithological, land cover, topographic, and climatic data, were incorporated into the deep learning model, allowing local subsidence to be classified into seven categories. The results showed that the oriented R-CNN achieved an average precision (AP) of 0.847 for subsidence detection and a mean AP (mAP) of 0.798 for subsidence classification, outperforming three other state-of-the-art methods (Rotated RetinaNet, R3Det, and ReDet). An independent effect analysis showed that incorporating all datasets improved the AP by 11.2% for detection and the mAP by 73.9% for classification, compared with using InSAR measurements only. Combining InSAR measurements with globally available land cover and digital elevation model data improved the AP for subsidence detection to 0.822, suggesting that our methods can potentially be transferred to other regions, which we further validated using a new dataset in Shanghai. These results improve the understanding of deltaic subsidence and facilitate geohazard assessment and management for sustainable environments.
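As a brief illustration of the reported metrics, the snippet below is a minimal sketch of how average precision (AP) is typically computed from ranked detections as the area under the precision-recall curve; the function name and toy data are illustrative, not taken from the paper's evaluation code. The mAP for classification would then be the mean of per-category APs.

```python
def average_precision(scores, labels, n_positive):
    """AP as the area under the precision-recall curve (all-point form).

    scores: detection confidences; labels: 1 if a detection matches a
    ground-truth subsidence bowl (e.g. by oriented-box IoU), else 0;
    n_positive: total number of ground-truth bowls.
    """
    # Rank detections by descending confidence.
    order = sorted(range(len(scores)), key=lambda i: -scores[i])
    tp = fp = 0
    ap, prev_recall = 0.0, 0.0
    for i in order:
        if labels[i]:
            tp += 1
        else:
            fp += 1
        recall = tp / n_positive
        precision = tp / (tp + fp)
        # Accumulate the rectangle under the P-R curve for this recall step.
        ap += (recall - prev_recall) * precision
        prev_recall = recall
    return ap

# Toy example: 4 ranked detections against 3 ground-truth bowls.
print(average_precision([0.9, 0.8, 0.7, 0.6], [1, 0, 1, 1], 3))  # ≈ 0.806
```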
