Image stitching with large parallax poses a significant challenge in computer vision. Existing seam-based approaches attempt to mitigate parallax artifacts by stitching images along seams. However, issues such as object mismatches, disappearances, and duplications still arise occasionally, primarily due to inaccurate dense-pixel alignment or inappropriate seam estimation. In this paper, we propose a robust seam-based parallax-tolerant image stitching method that leverages dense flow estimation from state-of-the-art approaches. First, we develop a seam estimation method that does not require a pre-estimated image warping model; instead, it estimates the seam directly by measuring the local smoothness of the optical flow field and incorporating a penalty term for duplications. We then design an iterative algorithm that uses the location of the estimated seam to solve a spatially smooth warping model and to eliminate outlier correspondence pairs. In this way, we effectively address the intertwined problems of estimating the warping model and the seam. Experiments on real-world images show that our method achieves superior local alignment accuracy near the stitching seam and outperforms other state-of-the-art techniques in visual stitching quality. Code is available at https://github.com/zhihao0512/dense-matching-image-stitching.
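To make the flow-smoothness idea concrete, the following is a minimal Python sketch, not the authors' implementation: it builds a per-pixel cost from the local variation of a dense flow field and extracts a minimum-cost seam with a seam-carving style dynamic program. The function names (`flow_smoothness_cost`, `find_vertical_seam`) are hypothetical, and the paper's actual energy additionally includes a duplication penalty and is optimized jointly with the warping model.

```python
# Illustrative sketch only (assumed interface, not the paper's code):
# low cost where the optical flow is locally smooth, high cost where
# neighboring pixels move inconsistently, so the seam avoids misaligned regions.
import numpy as np

def flow_smoothness_cost(flow):
    """Per-pixel cost from local flow variation.

    flow: (H, W, 2) dense optical flow (u, v).
    Returns an (H, W) map that is small where the flow field is locally smooth.
    """
    du_y, du_x = np.gradient(flow[..., 0])
    dv_y, dv_x = np.gradient(flow[..., 1])
    return np.sqrt(du_x**2 + du_y**2 + dv_x**2 + dv_y**2)

def find_vertical_seam(cost):
    """Minimum-cost top-to-bottom seam via dynamic programming."""
    h, w = cost.shape
    acc = cost.copy()
    for y in range(1, h):
        left = np.r_[np.inf, acc[y - 1, :-1]]    # cost of coming from the upper-left pixel
        right = np.r_[acc[y - 1, 1:], np.inf]    # cost of coming from the upper-right pixel
        acc[y] += np.minimum(np.minimum(left, acc[y - 1]), right)
    # Backtrack from the cheapest pixel in the bottom row.
    seam = np.empty(h, dtype=int)
    seam[-1] = int(np.argmin(acc[-1]))
    for y in range(h - 2, -1, -1):
        x = seam[y + 1]
        lo, hi = max(x - 1, 0), min(x + 2, w)
        seam[y] = lo + int(np.argmin(acc[y, lo:hi]))
    return seam

if __name__ == "__main__":
    # Synthetic flow field just to exercise the sketch.
    flow = np.random.randn(120, 160, 2).astype(np.float32)
    seam = find_vertical_seam(flow_smoothness_cost(flow))
    print(seam[:10])
```

In the full method described in the abstract, this seam location would then feed back into the warping step: the seam and the spatially smooth warping model are refined alternately rather than computed once as in this standalone sketch.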