Abstract

Landslides are among the most alarming natural hazards, and demand for databases and inventories of these events is growing worldwide, since they are a vital resource for landslide risk assessment. Given recent advances in image processing, the objective of this study is to evaluate the performance of a deep convolutional neural network architecture, U-Net, for mapping landslide scars from satellite imagery. The question that drives the study is: can fully convolutional neural networks serve as the backbone of automatic frameworks for building landslide inventories while matching or improving on the identification accuracy and speed of other methods? To answer it, Landsat-8 scenes of a region of Nepal were obtained and processed to compose a landslide image database that served as the basis for training, validating and testing deep convolutional neural networks. The U-Net architecture was applied, and the results indicate that it can identify landslide scars, improving over previously published research for the same study region. Validation yielded recall, precision and F1-score values of 0.74, 0.61 and 0.67, respectively, higher than those reported by previous studies using different methodologies. The results indicate the method's potential for use in dynamic mapping systems for landslide scar identification, paving the way for the composition and updating of landslide scar databases. These, in turn, can support the many quantitative landslide susceptibility mapping methods that rely heavily on data to provide accurate results.
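As a quick sanity check on the reported metrics: the F1-score is the harmonic mean of precision and recall, so the three values quoted above should be mutually consistent. The sketch below verifies this; the precision and recall values are taken from the abstract, and the function name is illustrative, not from the study's code.

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Validation-set values reported in the abstract for the U-Net model.
precision, recall = 0.61, 0.74
print(round(f1_score(precision, recall), 2))  # → 0.67, matching the reported F1-score
```

The agreement (2 × 0.61 × 0.74 / (0.61 + 0.74) ≈ 0.67) confirms the reported figures are internally consistent.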

