Abstract
Damage identification soon after a large-magnitude earthquake is a major challenge for early disaster response: the faster damaged areas are identified, the higher the survival chances of inhabitants. Current methods for damage identification apply artificial intelligence techniques to remote sensing data. Such methods require a large amount of high-quality labeled data for calibration and/or fine-tuning, and these data are expensive to obtain in the aftermath of large-scale disasters. In this paper, we propose a novel semi-supervised classification approach for identifying earthquake-induced urban changes between images recorded at different times. We integrate information from a small set of labeled data with ground-motion and fragility-function information computed on a large set of unlabeled data. Importantly, ground motion and fragility functions can be computed in real time. We evaluate the proposed method on the urban changes induced by the 2023 Turkey earthquake sequence, applying it to interferometric coherence computed from Sentinel-1 C-band synthetic aperture radar images. We use only 39 samples labeled as changed and 9000 unlabeled samples. The results show that our method identifies earthquake-related changes between images with an accuracy of about 81%. We conclude that the proposed method can rapidly identify affected areas in the aftermath of a large-magnitude earthquake.
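The abstract only outlines the workflow, so the following is a minimal illustrative sketch, not the authors' implementation: a tiny labeled set of coherence-change samples is combined with pseudo-labels derived from ground motion passed through a lognormal fragility function, and a simple classifier is then fitted to the merged set. All data values, the fragility parameters (median, beta), and the logistic-regression classifier are assumptions made for illustration.

```python
# Illustrative sketch of the semi-supervised idea described in the abstract.
# Synthetic data and assumed parameters only; not the authors' code.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

# Unlabeled data: a coherence-drop feature and co-located ground motion (PGA, in g).
n_unlabeled = 9000
coh_drop_u = rng.uniform(0.0, 1.0, n_unlabeled)   # illustrative coherence decrease
pga_u = rng.uniform(0.05, 1.2, n_unlabeled)       # illustrative peak ground acceleration

# Fragility function: P(damage | PGA) as a lognormal CDF (assumed parameters).
def fragility(pga, median=0.45, beta=0.6):
    return norm.cdf(np.log(pga / median) / beta)

p_damage_u = fragility(pga_u)

# Pseudo-labels: keep only high-confidence unlabeled samples.
hi = p_damage_u > 0.9
lo = p_damage_u < 0.1
X_pseudo = coh_drop_u[hi | lo].reshape(-1, 1)
y_pseudo = (p_damage_u[hi | lo] > 0.5).astype(int)

# Small labeled set (39 "changed" samples, as in the abstract; values are synthetic).
X_lab = rng.uniform(0.5, 1.0, 39).reshape(-1, 1)
y_lab = np.ones(39, dtype=int)

# Fit a simple logistic-regression classifier on labeled + pseudo-labeled data.
X = np.vstack([X_lab, X_pseudo])
y = np.concatenate([y_lab, y_pseudo])

w, b = 0.0, 0.0
for _ in range(2000):                              # plain gradient descent
    p = 1.0 / (1.0 + np.exp(-(w * X[:, 0] + b)))
    w -= 0.5 * np.mean((p - y) * X[:, 0])
    b -= 0.5 * np.mean(p - y)

# Classify all unlabeled coherence samples as changed / unchanged.
p_changed = 1.0 / (1.0 + np.exp(-(w * coh_drop_u + b)))
print(f"fraction flagged as changed: {(p_changed > 0.5).mean():.2f}")
```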