Abstract

This paper is concerned with the robust estimation of optical flow from time-varying images. Most existing methods for estimating image motion fall into two general classes. The gradient-based method uses a relationship between the motion of surfaces and the spatial/temporal derivatives of image brightness. The feature-matching approach examines the dynamic variation of image structures such as contours. Each motion estimation technique has its strengths and weaknesses. The goal of this paper is to devise a model that combines the feature-matching and gradient-based methods using multi-resolution images so that a more accurate optical flow field is produced. Our optical flow estimation algorithm is basically a coarse-to-fine multi-resolution scheme with iterative registration at each resolution. First, the optical flow component along the direction of the spatial gradient, i.e., the normal flow, is estimated. Based upon a confidence measure for the normal flow, which represents the accuracy of the estimated normal flow, the full flow is obtained by an iterative weighted least squares estimation. To improve the quality of the full flow, iterative registration is applied to reduce the displaced frame difference based on the Gaussian and Laplacian-of-Gaussian images. With the proposed fusion technique of feature matching using the band-pass filtered image and the gradient-based method using the low-pass filtered image, we pursue the possibility of combining two independent optical flow estimation methods based on weighted multi-constraints.

Keywords: dynamic scene analysis, image flow, optical flow, fusion, confidence measure, gradient-based, feature-matching, iterative registration
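The core gradient-based step described above, combining normal-flow constraints into a full flow vector via confidence-weighted least squares, can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function name `full_flow_wls` and the use of a generic weight vector `w` as the confidence measure are assumptions for the example.

```python
import numpy as np

def full_flow_wls(Ix, Iy, It, w):
    """Estimate a single full-flow vector (u, v) from the gradient
    constraints Ix*u + Iy*v + It = 0 over a neighborhood of N pixels,
    weighting each constraint by a per-pixel confidence w (illustrative)."""
    A = np.stack([Ix, Iy], axis=1)   # (N, 2) constraint normals
    b = -It                          # (N,) right-hand sides
    W = np.diag(w)                   # confidence weights on each constraint
    # Weighted normal equations: (A^T W A) [u, v]^T = A^T W b
    return np.linalg.solve(A.T @ W @ A, A.T @ W @ b)
```

Each row of `A` is the spatial gradient direction at one pixel, so each constraint fixes only the flow component along that direction (the normal flow); combining many weighted constraints over a neighborhood resolves the full flow. In the paper this solve is applied iteratively, with the weights derived from the normal-flow confidence measure.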
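The coarse-to-fine scheme relies on low-pass (Gaussian) and band-pass (Laplacian-of-Gaussian) image pyramids. A minimal NumPy-only sketch is shown below; the 5-tap binomial kernel, the `pyramid` function name, and the use of a difference-of-Gaussians as a Laplacian-of-Gaussian approximation are assumptions of this example, not details taken from the paper.

```python
import numpy as np

# 5-tap binomial kernel, a standard low-pass approximation to a Gaussian
_KERNEL = np.array([1., 4., 6., 4., 1.]) / 16.

def _smooth(img):
    """Separable low-pass filtering with edge padding (shape-preserving)."""
    pad = len(_KERNEL) // 2
    rows = np.apply_along_axis(
        lambda r: np.convolve(np.pad(r, pad, mode='edge'), _KERNEL, 'valid'),
        1, img)
    return np.apply_along_axis(
        lambda c: np.convolve(np.pad(c, pad, mode='edge'), _KERNEL, 'valid'),
        0, rows)

def pyramid(image, levels=3):
    """Build Gaussian (low-pass) and band-pass pyramids, finest level first.
    The band-pass level is a difference-of-Gaussians, which approximates
    the Laplacian-of-Gaussian used in the paper."""
    gauss, band = [], []
    cur = image.astype(float)
    for _ in range(levels):
        g = _smooth(cur)
        gauss.append(g)
        band.append(cur - g)   # DoG ~ Laplacian-of-Gaussian response
        cur = g[::2, ::2]      # subsample for the next, coarser level
    return gauss, band
```

In a coarse-to-fine scheme, flow estimated at the coarsest level initializes the registration at the next finer level; the band-pass levels supply the contour-like structures used by the feature-matching side of the fusion.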
