Abstract

In urban change detection, coregistration between bi-temporal Very High Resolution (VHR) images taken from different viewing angles, especially from high off-nadir angles, is very challenging. The relief displacements of elevated objects in such images usually lead to significant misregistration that negatively affects the accuracy of change detection. This paper presents a novel solution, called Patch-Wise CoRegistration (PWCR), that can overcome the misregistration problem caused by viewing angle differences and thereby improve the accuracy of urban change detection. The PWCR method utilizes a Digital Surface Model (DSM) and the Rational Polynomial Coefficients (RPCs) of the images to find corresponding points in a bi-temporal image set. The corresponding points are then used to generate corresponding patches in the image set. To demonstrate that the PWCR method can overcome the misregistration problem and help achieve accurate change detection, two change detection criteria are tested and incorporated into a change detection framework. Experiments on four bi-temporal image sets acquired by the IKONOS, GeoEye-1, and WorldView-2 satellites from different viewing angles show that the PWCR method achieves highly accurate image patch coregistration (up to 80 percent higher accuracy than traditional coregistration for elevated objects), allowing the change detection framework to produce accurate urban change detection results (over 90 percent accuracy).

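The core operation behind such a DSM- and RPC-based correspondence search is the standard ground-to-image RPC projection: each DSM cell provides a 3D ground point (latitude, longitude, height) that can be projected into both images, and the resulting pixel pairs anchor the corresponding patches. The Python sketch below illustrates this projection only; it is not the paper's implementation, and the rpc dictionary layout (field names, coefficient ordering per the common RPC00B convention) is an assumption.

    import numpy as np

    def rpc_polynomial(coeffs, P, L, H):
        # Evaluate the 20-term cubic RPC polynomial at normalized
        # latitude P, longitude L, height H (RPC00B term order assumed).
        terms = np.array([
            1.0, L, P, H, L * P, L * H, P * H, L * L, P * P, H * H,
            P * L * H, L ** 3, L * P * P, L * H * H, L * L * P,
            P ** 3, P * H * H, L * L * H, P * P * H, H ** 3,
        ])
        return np.dot(np.asarray(coeffs), terms)

    def ground_to_image(lat, lon, height, rpc):
        # Project a ground point (lat/lon in degrees, height in metres)
        # to image coordinates (row, col) with the RPC forward model.
        # `rpc` is a hypothetical dict of offsets, scales, and the four
        # 20-element coefficient arrays.
        P = (lat - rpc["lat_off"]) / rpc["lat_scale"]
        L = (lon - rpc["lon_off"]) / rpc["lon_scale"]
        H = (height - rpc["height_off"]) / rpc["height_scale"]
        row_n = (rpc_polynomial(rpc["line_num"], P, L, H)
                 / rpc_polynomial(rpc["line_den"], P, L, H))
        col_n = (rpc_polynomial(rpc["samp_num"], P, L, H)
                 / rpc_polynomial(rpc["samp_den"], P, L, H))
        return (row_n * rpc["line_scale"] + rpc["line_off"],
                col_n * rpc["samp_scale"] + rpc["samp_off"])

    # Projecting the same DSM point through the RPCs of both acquisitions
    # yields a pair of corresponding image coordinates, around which
    # corresponding patches can then be cropped and compared.

In this sketch, patch correspondence follows directly from geometry rather than image matching, which is why relief displacement from differing viewing angles does not break the alignment.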