Abstract

Information-theoretic divergence measures, especially mutual information, have been widely employed in multimodal image registration. Mutual information is the Kullback-Leibler divergence between the joint distribution of the two images' grey values and the product of their marginal distributions. Beyond these metrics, however, there are other measures that could be considered for image registration. In this paper, the connections between mutual information, the Kullback-Leibler divergence, and Shannon's inequality are investigated. In information theory, inequalities connect information divergence with other measures of discrimination between probability distributions. Based on these connections and on inequality theory, a novel definition of generalized divergence measures is proposed. Three classical inequalities, namely the arithmetic-geometric mean inequality, the Cauchy-Schwarz inequality, and the Minkowski inequality, are analyzed, and the corresponding divergence measures are derived and used as registration measures. These measures are applied to the rigid registration of multimodal medical images, and their performance is compared with that of mutual information. Simulation results show that the new measures are effective and efficient, may be more robust to noise, and in some cases require less execution time than mutual information.
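
For reference, for images A and B with grey-value marginals p_A, p_B and joint distribution p_{AB}, the mutual information named above can be written as (standard notation, chosen here for illustration, not taken from the paper):

I(A;B) = D_{KL}(p_{AB} \,\|\, p_A p_B) = \sum_{a,b} p_{AB}(a,b) \log \frac{p_{AB}(a,b)}{p_A(a)\, p_B(b)}.

To illustrate how a classical inequality induces a divergence, note that the Cauchy-Schwarz inequality \left(\sum_x p(x) q(x)\right)^2 \le \sum_x p(x)^2 \sum_x q(x)^2 holds with equality exactly when p and q are proportional, which for probability distributions means p = q. Hence

D_{CS}(p,q) = -\log \frac{\left(\sum_x p(x) q(x)\right)^2}{\sum_x p(x)^2 \, \sum_x q(x)^2}

is nonnegative and vanishes only when the two distributions coincide. This is the widely used Cauchy-Schwarz divergence, given as a representative example of the construction rather than as the paper's exact definition.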
