Abstract

Entropy-based measures, such as mutual information (the Kullback-Leibler divergence between the joint intensity distribution and the product of the marginal distributions), quantify the statistical dependence between two signals and are widely used as similarity measures for image registration. Although they have proven superior to many classical statistical measures, entropy-based measures such as mutual information may fail to yield the optimal registration when the multimodal image pair has an insufficient scene-overlap region. To overcome this limitation, we propose using the symmetric form of the Kullback-Leibler divergence, namely Jeffrey's divergence, as the similarity measure in practical multimodal image registration tasks. A mathematical analysis was performed to identify the causes of the failure of mutual information on image pairs with insufficient scene overlap. Experimental registrations of SPOT imagery, Landsat TM imagery, ALOS PALSAR imagery, and DEM data were carried out to compare the performance of Jeffrey's divergence with that of mutual information. The results indicate that Jeffrey's divergence provides a larger feasible search space, which is favorable for exploring optimal transformation parameters over a wider range. This advantage of Jeffrey's divergence was further confirmed by a series of paradigms. The proposed measure is therefore more applicable to registering image pairs that are severely misaligned or have an insufficient scene-overlap region.
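To make the two measures concrete, the following is a minimal sketch (not the authors' implementation) of mutual information and its symmetric counterpart, Jeffrey's divergence, computed from the joint intensity histogram of an image pair. The bin count, the epsilon threshold, and the zero-masking strategy are illustrative assumptions; a practical registration pipeline would evaluate one of these measures at each candidate transformation and pick the transformation that maximizes it.

```python
import numpy as np

def joint_pmf(a, b, bins=32):
    # Joint intensity histogram of two equally sized images,
    # normalized to a joint probability mass function p(x, y).
    h, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    return h / h.sum()

def mutual_information(a, b, bins=32, eps=1e-12):
    # MI(A, B) = KL( p(x,y) || p(x) p(y) ):
    # the (asymmetric) Kullback-Leibler divergence between the joint
    # distribution and the product of the marginals.
    pxy = joint_pmf(a, b, bins)
    px = pxy.sum(axis=1, keepdims=True)   # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)   # marginal p(y)
    pxpy = px * py
    mask = pxy > eps                      # skip empty bins (they contribute 0)
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / pxpy[mask])))

def jeffreys_divergence(a, b, bins=32, eps=1e-12):
    # Jeffrey's divergence is the symmetrized KL divergence:
    # J = KL(pxy || pxpy) + KL(pxpy || pxy).
    pxy = joint_pmf(a, b, bins)
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    pxpy = px * py
    # Restrict both terms to bins where both pmfs are nonzero,
    # a common practical workaround for the unbounded reverse term.
    mask = (pxy > eps) & (pxpy > eps)
    kl_fwd = np.sum(pxy[mask] * np.log(pxy[mask] / pxpy[mask]))
    kl_rev = np.sum(pxpy[mask] * np.log(pxpy[mask] / pxy[mask]))
    return float(kl_fwd + kl_rev)
```

Both functions peak when the two images are statistically dependent (well aligned) and fall toward zero when they are effectively independent; the symmetric form additionally accumulates the reverse-direction term, which is one way to obtain the broader feasible search space discussed in the abstract.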
