Abstract

Cyber-physical disaster systems (CPDS) are an emerging class of cyber-physical application that collects physical-realm measurements from IoT devices and sends them to the edge for damage-severity analysis of impacted sites in the aftermath of a large-scale disaster. However, the lack of effective machine learning paradigms, together with the data and device heterogeneity of edge devices, poses significant challenges for disaster damage assessment (DDA). To address these issues, we propose a generative adversarial network (GAN)-based, lightweight, deep transfer learning-enabled, fine-tuned machine learning pipeline that reduces overall sensing error and improves model performance. In this paper, we apply several GANs (i.e., DCGAN, DiscoGAN, ProGAN, and Cycle-GAN) to generate synthetic disaster images. We then apply three pre-trained models, VGG19, ResNet18, and DenseNet121, with deep transfer learning to classify the disaster images. We observe that ResNet18 is the best-performing model, achieving a test accuracy of 88.81%. In experiments on real-world DDA applications, we visualize the damage severity of disaster-impacted sites using several Class Activation Mapping (CAM) techniques, namely Grad-CAM++, Guided Grad-CAM, and Score-CAM. Finally, using k-means clustering on the generated heat maps, we obtain scatter plots that group damage severity into no-damage, mild-damage, and severe-damage categories.
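As a rough illustration of the transfer-learning step summarized above, the sketch below fine-tunes an ImageNet-pre-trained ResNet18 for three damage-severity classes. It assumes PyTorch and torchvision; the dataset path, class folder names, and hyperparameters are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch (assumed setup): fine-tune an ImageNet-pre-trained ResNet18
# for three damage-severity classes. Paths, class names, and hyperparameters
# are hypothetical, for illustration only.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
])

# Assumed folder layout: disaster_images/train/{no_damage,mild_damage,severe_damage}/
train_set = datasets.ImageFolder("disaster_images/train", transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Transfer learning: freeze the pre-trained backbone and replace the
# final fully connected layer with a 3-class damage-severity head.
for param in model.parameters():
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 3)
model = model.to(device)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)

for epoch in range(10):
    model.train()
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    print(f"epoch {epoch + 1}: loss {loss.item():.4f}")
```

In this sketch only the new classification head is trained, which keeps the fine-tuning lightweight; unfreezing deeper layers is a common variant when more labeled disaster imagery is available.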
