Abstract

Space-borne synthetic aperture radar (SAR) and optical sensors are important tools for building damage detection, and fusion of SAR and optical images improves detection performance. However, when the resolutions of the two kinds of images differ, the performance of existing pixel-level fusion methods deteriorates significantly due to interpolation-induced distortion. To solve this problem, this paper presents a new superpixel-based belief fusion (SBBF) model for building damage detection. Superpixels on the SAR and optical images are identified by segmenting the pre-earthquake optical image, so that fusion is performed at the superpixel level rather than at the pixel level as in existing methods. In the fusion stage, unlike commonly used direct fusion methods that do not account for reliability, a belief fusion method is proposed that employs a basic belief assignment (BBA) to incorporate the differing reliabilities of superpixels, improving the accuracy of building damage detection. For each superpixel, the BBA is assigned according to the influence of noise and resolution. The United Nations Operational Satellite Applications Programme (UNOSAT) datasets corresponding to the 2010 Haiti earthquake and the 2011 Tōhoku earthquake are used to evaluate the performance of the proposed method. The experimental results show that the proposed method significantly outperforms existing methods based on SAR or optical images alone, as well as existing pixel-level fusion methods.
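To illustrate the kind of superpixel-level belief fusion the abstract describes, the sketch below combines a SAR-derived and an optical-derived BBA for a single superpixel over the frame {damaged, intact}, using reliability discounting followed by Dempster's rule of combination. All masses, reliability factors, and names here are illustrative assumptions, not the paper's actual BBA construction.

```python
# Hypothetical sketch of superpixel-level belief fusion over {damaged, intact}.
# The masses and reliability factors below are placeholders for illustration.

THETA = frozenset({"damaged", "intact"})  # frame of discernment

def discount(bba, alpha):
    """Shafer discounting: scale masses by reliability alpha and move the
    remaining mass to total ignorance (the full frame THETA)."""
    out = {focal: alpha * m for focal, m in bba.items() if focal != THETA}
    out[THETA] = alpha * bba.get(THETA, 0.0) + (1.0 - alpha)
    return out

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two BBAs on the same frame."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb  # mass assigned to the empty set
    norm = 1.0 - conflict
    return {focal: m / norm for focal, m in combined.items()}

# Example evidence for one superpixel from SAR and optical change features.
sar_bba = {frozenset({"damaged"}): 0.6, frozenset({"intact"}): 0.1, THETA: 0.3}
opt_bba = {frozenset({"damaged"}): 0.3, frozenset({"intact"}): 0.4, THETA: 0.3}

# Hypothetical reliability factors reflecting speckle noise and the resolution gap.
fused = dempster_combine(discount(sar_bba, 0.8), discount(opt_bba, 0.6))

# Decide by the singleton hypothesis with the largest fused mass.
label = max((f for f in fused if len(f) == 1), key=lambda f: fused[f])
print(sorted(label)[0], {tuple(sorted(k)): round(v, 3) for k, v in fused.items()})
```

Discounting before combination is one standard way to encode per-source (or per-superpixel) reliability in evidence theory; the paper's specific assignment of BBAs from noise and resolution may differ.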
