Abstract
Non-rigid multi-modal image registration plays an important role in medical image analysis, yet it remains challenging due to the significant intensity distortion and the non-rigid transformation between images. Existing registration methods based on information-theoretic measures and image representations cannot address this problem effectively. In this paper, we propose a novel self-similarity-inspired local descriptor for constructing the similarity metric, a key component of image registration. The self-similarity is computed from the Zernike moments of image patches in a local neighborhood and is used to generate the local descriptor. The Euclidean distance between the local descriptors computed in the reference and moving images serves as the similarity metric. Distinctively, the proposed local descriptor provides an effective representation of complicated image features owing to its robustness to noise and its rotational invariance, which gives the proposed method good registration performance. Extensive experiments on both simulated and real multi-modal image datasets demonstrate that the proposed method achieves higher registration accuracy, as measured by the target registration error (TRE), than state-of-the-art registration methods based on normalized mutual information (NMI), the sum of squared differences on entropy images (ESSD), the Weber local descriptor (WLD), and the modality independent neighborhood descriptor (MIND).
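The pipeline described above can be sketched as follows. This is a minimal illustrative implementation, not the paper's exact formulation: the moment order, patch and neighborhood sizes, the Gaussian mapping from patch distance to similarity, and all function names (`zernike_magnitudes`, `self_similarity_descriptor`, `similarity_metric`) are assumptions chosen for clarity. Taking the magnitudes of the Zernike moments is what yields the rotational invariance mentioned in the abstract.

```python
import numpy as np
from math import factorial

def zernike_magnitudes(patch, max_order=4):
    """Rotation-invariant Zernike moment magnitudes |Z_nm| of a square
    patch, sampled over the disk inscribed in the patch.
    max_order=4 is an illustrative choice, not the paper's setting."""
    h = patch.shape[0]
    ys, xs = np.mgrid[0:h, 0:h]
    c = (h - 1) / 2.0
    x = (xs - c) / c                    # map pixel grid to [-1, 1]^2
    y = (ys - c) / c
    rho = np.hypot(x, y)
    theta = np.arctan2(y, x)
    inside = rho <= 1.0                 # restrict to the unit disk
    f = patch * inside
    feats = []
    for n in range(max_order + 1):
        for m in range(n + 1):
            if (n - m) % 2:             # Z_nm defined only for n-|m| even
                continue
            # radial polynomial R_nm(rho)
            R = np.zeros_like(rho)
            for s in range((n - m) // 2 + 1):
                coef = ((-1) ** s * factorial(n - s)
                        / (factorial(s) * factorial((n + m) // 2 - s)
                           * factorial((n - m) // 2 - s)))
                R += coef * rho ** (n - 2 * s)
            V = R * np.exp(-1j * m * theta) * inside
            Z = (n + 1) / np.pi * np.sum(f * V)
            feats.append(abs(Z))        # magnitude discards orientation
    return np.array(feats)

def self_similarity_descriptor(img, pt, patch=5, hood=3, sigma=1.0):
    """Local descriptor at pixel `pt`: Gaussian-mapped similarities between
    the Zernike vector of the central patch and those of its neighbors."""
    r = patch // 2
    y0, x0 = pt
    centre = zernike_magnitudes(img[y0-r:y0+r+1, x0-r:x0+r+1])
    desc = []
    for dy in range(-hood, hood + 1):
        for dx in range(-hood, hood + 1):
            if dy == 0 and dx == 0:
                continue
            y, x = y0 + dy, x0 + dx
            nb = zernike_magnitudes(img[y-r:y+r+1, x-r:x+r+1])
            d2 = np.sum((centre - nb) ** 2)
            desc.append(np.exp(-d2 / (2 * sigma ** 2)))
    desc = np.array(desc)
    return desc / (np.linalg.norm(desc) + 1e-12)

def similarity_metric(desc_ref, desc_mov):
    """Euclidean distance between descriptors of corresponding points
    in the reference and moving images (lower = more similar)."""
    return np.linalg.norm(desc_ref - desc_mov)
```

A 90-degree rotation of a patch permutes the sampled disk points, so the moment magnitudes are preserved up to floating-point error; this makes the rotational-invariance claim easy to check numerically with `np.rot90`.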