Abstract

Non-rigid multi-modal image registration plays an important role in medical image analysis. It remains a challenging problem due to the significant intensity distortion and non-rigid transformations between images. Existing registration methods based on information-theoretic measures and image representations cannot address this problem effectively. In this paper, we propose a novel self-similarity-inspired local descriptor to determine the similarity metric, a key component in image registration. The self-similarity is computed from the Zernike moments of image patches in a local neighborhood and is used to generate the local descriptor. The Euclidean distance between the local descriptors computed in the reference and moving images serves as the similarity metric. Notably, the proposed local descriptor provides an effective representation of complicated image features owing to its robustness to noise and its rotational invariance, which gives the proposed method good registration performance. Extensive experiments on both simulated and real multi-modal image datasets demonstrate that the proposed method achieves higher registration accuracy, as measured by the target registration error (TRE), than state-of-the-art registration methods based on normalized mutual information (NMI), the sum of squared differences on entropy images (ESSD), the Weber local descriptor (WLD), and the modality independent neighborhood descriptor (MIND).
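The pipeline described above can be illustrated with a minimal sketch: rotation-invariant Zernike-moment magnitudes are computed for image patches, the self-similarity descriptor at a pixel collects the distances between the center patch's moments and those of surrounding patches, and the similarity metric is the Euclidean distance between descriptors from the two images. The patch size, the set of moment orders, and the square search neighborhood below are illustrative assumptions, not the paper's exact settings.

```python
import math
import numpy as np

def zernike_magnitudes(patch, orders=((0, 0), (1, 1), (2, 0), (2, 2))):
    """Rotation-invariant magnitudes |Z_nm| of a square patch sampled on the unit disk.
    Discrete approximation of Z_nm = (n+1)/pi * sum f(x,y) V*_nm(x,y); only
    magnitudes are compared, so the constant pixel-area factor is omitted."""
    h, w = patch.shape
    y, x = np.mgrid[-1:1:h * 1j, -1:1:w * 1j]
    r = np.hypot(x, y)
    theta = np.arctan2(y, x)
    mask = r <= 1.0  # restrict to the unit disk
    mags = []
    for n, m in orders:
        # Radial polynomial R_nm(r)
        R = np.zeros_like(r)
        for s in range((n - abs(m)) // 2 + 1):
            c = ((-1) ** s * math.factorial(n - s)
                 / (math.factorial(s)
                    * math.factorial((n + abs(m)) // 2 - s)
                    * math.factorial((n - abs(m)) // 2 - s)))
            R += c * r ** (n - 2 * s)
        V = R * np.exp(-1j * m * theta)              # Zernike basis function
        Z = np.sum(patch[mask].astype(float) * np.conj(V[mask]))
        mags.append(abs(Z))                          # |Z_nm| is rotation invariant
    return np.array(mags)

def self_similarity_descriptor(img, cy, cx, patch_radius=3, search_radius=2):
    """Descriptor at (cy, cx): distances between the center patch's Zernike
    magnitudes and those of patches at every offset in a square search window
    (a hypothetical neighborhood layout for illustration)."""
    def patch(y, x):
        return img[y - patch_radius:y + patch_radius + 1,
                   x - patch_radius:x + patch_radius + 1]
    center = zernike_magnitudes(patch(cy, cx))
    desc = []
    for dy in range(-search_radius, search_radius + 1):
        for dx in range(-search_radius, search_radius + 1):
            if dy == 0 and dx == 0:
                continue
            desc.append(np.linalg.norm(
                center - zernike_magnitudes(patch(cy + dy, cx + dx))))
    d = np.array(desc)
    return d / (np.linalg.norm(d) + 1e-12)           # normalize for comparability

def similarity_metric(desc_ref, desc_mov):
    """Euclidean distance between descriptors (lower means more similar)."""
    return np.linalg.norm(desc_ref - desc_mov)
```

Because only the magnitudes of the complex moments enter the descriptor, rotating a patch leaves its feature vector unchanged, which is the property the abstract credits for robustness on complicated image features.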

