Abstract

Following the devastating earthquakes in Kahramanmaraş province, Türkiye, on February 6, 2023, which resulted in the loss of over 50,000 lives and damage to more than 84,000 buildings, the pressing need for efficient damage assessment and response became apparent. Traditional on-site assessments are time-consuming and perilous. Most existing research leans towards segmentation and object detection methods for earthquake damage assessment using remotely sensed data, which demand substantial training data, computational resources, and time. A more practical approach is to classify image patches in order to identify earthquake-affected areas with damaged buildings. This scene classification method categorizes image patches based on their content; remote sensing scene classification assigns labels to such images using various deep learning algorithms. In this work, we developed a fully automated system utilizing Maxar’s very high-resolution post-earthquake satellite imagery to classify and map scenes (image patches) containing collapsed and non-collapsed buildings in the Antakya and Iskenderun city centers. Our approach involved two key scene classifiers. Classifier #1, employing deep learning models such as ResNet-101, detected building presence within an image scene with remarkable accuracy (99.17%). This classifier served as the foundation for identifying building scenes across each entire city, filtering out non-urban land use. Classifier #2 then classified building scenes into collapsed and non-collapsed categories; the DenseNet-121 model excelled, achieving an accuracy of 93.33% on this task. In the end, Classifier #2 categorized 2,429 non-collapsed and 449 collapsed scenes in Antakya, and 2,291 non-collapsed and 290 collapsed scenes in Iskenderun.
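The abstract describes a two-stage scene classification pipeline: a building/non-building filter followed by a collapsed/non-collapsed classifier. The following is a minimal sketch of that inference flow, assuming PyTorch and torchvision, 224x224 RGB image patches, fine-tuned two-class heads on ResNet-101 and DenseNet-121, and hypothetical checkpoint file names and class indices; it is not the authors' implementation.

```python
# Minimal sketch of the two-stage scene classification pipeline (assumptions noted above).
import torch
import torch.nn as nn
from torchvision import models, transforms
from PIL import Image

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Classifier #1: building vs. non-building scenes (ResNet-101 backbone, 2-class head).
clf1 = models.resnet101(weights=None)
clf1.fc = nn.Linear(clf1.fc.in_features, 2)
clf1.load_state_dict(torch.load("classifier1_resnet101.pth", map_location=device))  # hypothetical checkpoint
clf1.eval().to(device)

# Classifier #2: collapsed vs. non-collapsed buildings (DenseNet-121 backbone, 2-class head).
clf2 = models.densenet121(weights=None)
clf2.classifier = nn.Linear(clf2.classifier.in_features, 2)
clf2.load_state_dict(torch.load("classifier2_densenet121.pth", map_location=device))  # hypothetical checkpoint
clf2.eval().to(device)

# Standard ImageNet-style preprocessing for the satellite image patches.
preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def classify_patch(path: str) -> str:
    """Run one image patch through both classifiers and return a scene label."""
    x = preprocess(Image.open(path).convert("RGB")).unsqueeze(0).to(device)
    # Stage 1: discard non-urban patches that contain no buildings.
    if clf1(x).argmax(dim=1).item() == 0:  # assumed class index 0 = "no building"
        return "no-building"
    # Stage 2: label building patches as collapsed or non-collapsed.
    return "collapsed" if clf2(x).argmax(dim=1).item() == 1 else "non-collapsed"  # assumed index 1 = "collapsed"
```

In practice, such a cascade would be applied patch by patch across the tiled city-wide imagery, so that only patches passing the building filter are sent to the damage classifier and then mapped to their geographic tiles.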
