Evolving hurricane patterns intensified by climate change are expected to exacerbate economic hardships for coastal communities. Climate resilience for these communities requires both the capability to recover rapidly from devastating storms and the ability to develop an accurate, actionable understanding of vulnerabilities to reduce the impact of future storms. Available data from past storms can provide invaluable insight toward addressing both requirements. Post-disaster preliminary damage assessments (PDAs) are a crucial first step toward rapid recovery, and they also provide the most accurate information on how various types of dwellings performed during the storm. Traditional door-to-door inspection methods are time-consuming and can hinder efficient resource allocation by governments in the aftermath. To address this, researchers have proposed automated PDA frameworks, often utilizing satellite data combined with deep convolutional neural networks. However, before such frameworks can be adopted in practice, the accuracy and fidelity of damage-level predictions at the scale of an entire building must be comparable to human assessments. To bridge this gap, we present a PDA framework that leverages Ultra-High-Resolution Aerial (UHRA) images alongside state-of-the-art transformer models for multi-class damage predictions across entire buildings. Our approach exploits vast amounts of unlabeled data to enhance prediction accuracy and generalization capability. Through a series of experiments, we evaluate the influence of incorporating unlabeled data and of using transformer models.
By integrating UHRA images and semi-supervised transformer models, our framework overcomes critical limitations of satellite imagery and traditional CNN models, achieving 88% multi-class accuracy. The result is more precise, efficient, and reliable damage assessments, a first step toward building more climate-resilient societies.