Abstract
Image registration, which aims to establish reliable feature correspondences between images, is a critical problem in image processing. To improve the accuracy of color and depth image registration, this paper proposes a novel image registration algorithm based on multi-vector-field constraints. We first extract edge features from the color and depth images and establish putative correspondences based on this edge information. Considering the correlation between the images, we formulate the functional relationships of the multi-vector-field constraints from these correspondences. In a reproducing kernel Hilbert space (RKHS), this constraint is incorporated into a probability model whose parameters are optimized with the EM algorithm, yielding the probability that corresponding edge points of the images match. To further improve registration accuracy, the input is extended from one image pair to two pairs, and the feature transformation between images is iteratively estimated with the parametric model. Using publicly available RGB-D images as experimental data, the results show that for single-object image registration the proposed algorithm improves accuracy by about 5% over the SC, ICP, and CPD algorithms. In addition, artificial noise was added to test the proposed algorithm's robustness; the results show that it is more robust to noise than the SC, ICP, and CPD algorithms.
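To make the EM-in-RKHS idea summarized above concrete, the following is a minimal, illustrative sketch of a single vector-field-consensus-style iteration: putative edge correspondences are treated as a displacement field, the field is interpolated in an RKHS with a Gaussian kernel, and the EM loop alternately updates inlier probabilities (E-step) and the regularized field coefficients (M-step). All names, parameter values, and the single-field simplification are assumptions for illustration; the paper's actual multi-vector-field constraints and two-pair input are not reproduced here.

```python
import numpy as np

def gaussian_kernel(X, Y, beta=0.1):
    """Gaussian RKHS kernel matrix between point sets X (N,2) and Y (M,2)."""
    d2 = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)
    return np.exp(-beta * d2)

def vfc_em(x, y, beta=0.1, lam=3.0, gamma_init=0.9, n_iter=50):
    """Hypothetical sketch: estimate inlier probabilities p for putative
    correspondences x[n] -> y[n] by fitting a vector field f in an RKHS
    with an EM loop. x, y: (N, 2) arrays of matched edge points."""
    N = x.shape[0]
    v = y - x                                  # observed displacement vectors
    K = gaussian_kernel(x, x, beta)            # RKHS Gram matrix
    C = np.zeros((N, 2))                       # kernel coefficients of f
    gamma = gamma_init                         # inlier ratio (assumed prior)
    sigma2 = np.mean(np.sum(v ** 2, axis=1))   # inlier noise variance
    a = 0.1                                    # uniform outlier density (assumed)

    for _ in range(n_iter):
        # E-step: posterior probability that each correspondence is an inlier
        f = K @ C
        r2 = np.sum((v - f) ** 2, axis=1)
        inlier = gamma * np.exp(-r2 / (2 * sigma2)) / (2 * np.pi * sigma2)
        p = inlier / (inlier + (1 - gamma) * a)

        # M-step: p-weighted, Tikhonov-regularized fit of f in the RKHS
        P = np.diag(p)
        C = np.linalg.solve(P @ K + lam * sigma2 * np.eye(N), P @ v)
        f = K @ C
        r2 = np.sum((v - f) ** 2, axis=1)
        sigma2 = max(np.sum(p * r2) / (2 * np.sum(p)), 1e-8)
        gamma = np.mean(p)

    return p, C  # inlier probabilities and field coefficients
```

In this sketch, correspondences with high `p` would be kept as reliable matches between the color and depth edge maps; the paper's method additionally couples multiple such fields through the multi-vector-field constraints and refines the transformation over two image pairs.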