Abstract

Synthetic aperture radar (SAR) remote sensing is a promising technique for long-term monitoring of landslide-prone areas. Pixel offset tracking methods work well for fast-moving landslides (rates above tens of cm/yr). However, existing methods suffer from several limitations: (i) estimation window design that depends heavily on operator experience, (ii) a tradeoff between single-point accuracy and overall efficiency, and (iii) low confidence in the results caused by heterogeneous samples within the window. In this paper, an improved offset tracking method is proposed to address these problems. First, the workflow is optimized by adding a "preseparation" step before offset estimation to distinguish feature matching from speckle pattern matching. The optimized workflow is more efficient for natural scenes containing both feature and non-saliency regions. Second, an improved algorithm called adaptive incoherence speckle offset tracking based on homogeneous samples (AISOT-HS) is proposed for non-saliency regions. Its two key elements are (i) adaptive design of optimal estimation windows guided by a coherence map and (ii) offset estimation that excludes heterogeneous samples. We apply the proposed method to study the evolution of the 2018 Jinsha River landslide (Tibet, China) using SAR data from the Gaofen 3 (GF-3) satellite and the Phased Array type L-band Synthetic Aperture Radar-2 (PALSAR-2) system onboard the Advanced Land Observing Satellite 2 (ALOS-2). Compared with the traditional method, the proposed method improves efficiency and reduces uncertainty. We also analyze the spatiotemporal displacement pattern of this landslide, which shows that the Jinsha River landslide was most likely a thrust load-caused landslide. This study demonstrates the role of SAR remote sensing in global landslide monitoring, especially where ground truth data are scarce.
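To make the two AISOT-HS ideas concrete, the sketch below illustrates coherence-guided adaptive window sizing and offset estimation restricted to homogeneous samples. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names, thresholds, and masked normalized cross-correlation shown here are hypothetical choices, and only NumPy is assumed.

```python
# Illustrative sketch of coherence-guided adaptive-window offset tracking.
# All names and thresholds are hypothetical; the paper's AISOT-HS may differ.
import numpy as np

def adaptive_window(coherence, row, col, min_size=16, max_size=128,
                    target_samples=200, coh_thresh=0.4):
    """Grow the estimation window until it holds enough coherent samples."""
    size = min_size
    while size < max_size:
        half = size // 2
        patch = coherence[row - half:row + half, col - half:col + half]
        if np.count_nonzero(patch > coh_thresh) >= target_samples:
            break
        size *= 2
    return min(size, max_size)

def estimate_offset(master, slave, coherence, row, col, search=8,
                    coh_thresh=0.4):
    """Masked cross-correlation: low-coherence (heterogeneous) pixels are
    excluded from the similarity score so they cannot bias the offset.
    Assumes (row, col) is far enough from the image edges."""
    half = adaptive_window(coherence, row, col) // 2
    ref = master[row - half:row + half, col - half:col + half]
    mask = coherence[row - half:row + half, col - half:col + half] > coh_thresh
    if np.count_nonzero(mask) < 16:   # fall back if too few samples survive
        mask[:] = True
    best, best_score = (0, 0), -np.inf
    for dr in range(-search, search + 1):      # exhaustive integer search
        for dc in range(-search, search + 1):
            tst = slave[row - half + dr:row + half + dr,
                        col - half + dc:col + half + dc]
            score = np.corrcoef(ref[mask], tst[mask])[0, 1]
            if score > best_score:
                best, best_score = (dr, dc), score
    return best, best_score           # integer pixel offset and its score
```

Growing the window until it contains a fixed number of coherent samples, rather than fixing the window size in advance, is one way to balance single-point accuracy against per-pixel efficiency; in practice a subpixel refinement step (e.g., fitting the oversampled correlation peak) would follow the integer search.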
