Abstract
When different target surfaces in three-dimensional space are mapped onto an image plane, they produce different projections, and these projections vary with the viewpoint. Such local differences affect the accuracy of image stitching. Most existing image stitching methods divide an input image into fixed-size cells and warp all pixels within the same cell using a single local transformation model for alignment. These methods rest on the hypothesis that the transformation model is consistent within each cell; however, this hypothesis does not hold in general. In this paper, we propose a novel projective-consistent plane based image stitching method (termed PCPS). It divides the overlapping regions of an input image into projective-consistent planes according to the orientations of the normal vectors of local regions and the reprojection errors of the aligned images. A local projective transformation model is estimated for each projective-consistent plane, and a hybrid warping model is then constructed: pixels in the overlapping regions are warped with the local projective transformation models to achieve better alignment, while pixels in the non-overlapping regions are warped with a global projective transformation model, estimated from inliers uniformly distributed across the projective-consistent planes, to avoid distortion. Experimental results on a number of challenging image sequences show that, compared with state-of-the-art image stitching methods, the projective transformation model estimated by the proposed PCPS method for each projective-consistent plane is more accurate, and the resulting stitches exhibit fewer seams and less projective distortion.
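The hybrid warping rule described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: all function names, the plane-lookup interface, and the example homographies are assumptions introduced here to show how a pixel would be routed to either a plane-local or a global projective model.

```python
def apply_homography(H, x, y):
    """Map point (x, y) through a 3x3 projective transformation H
    (given as nested lists) and dehomogenize the result."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / w, yh / w

def hybrid_warp(x, y, in_overlap, plane_of, plane_homographies, H_global):
    """Hypothetical hybrid warp: inside the overlapping region, use the
    homography of the pixel's projective-consistent plane; outside it,
    fall back to the single global homography (as the abstract describes,
    to reduce distortion in non-overlapping regions)."""
    if in_overlap(x, y):
        H = plane_homographies[plane_of(x, y)]
    else:
        H = H_global
    return apply_homography(H, x, y)

# Illustrative example with made-up models: one plane-local translation
# and an identity global model; the overlap is assumed to be x < 10.
H_plane = [[1.0, 0.0, 5.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
H_identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
in_overlap = lambda x, y: x < 10
plane_of = lambda x, y: 0  # single plane in this toy example

p_in = hybrid_warp(2, 3, in_overlap, plane_of, [H_plane], H_identity)
p_out = hybrid_warp(20, 3, in_overlap, plane_of, [H_plane], H_identity)
```

Here `p_in` is shifted by the plane-local model while `p_out` is left unchanged by the global identity model, mirroring the per-region choice of transformation.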