Abstract

In this paper, we show how the translational motion of a stereo vision system relative to, and its distance from, the scene can be recovered in closed form directly from measurements of image gradients and time derivatives. There is no need to estimate image motion or to establish correspondences between features across images. The direction of translational motion is recovered by minimizing the sum squared error of a linear constraint equation over the image; the solution is given by the eigenvector corresponding to the smallest eigenvalue of a 3 x 3 positive semi-definite matrix. Using the average disparity, which maximizes the cross-correlation between the left and right images, we estimate the scale factor needed to compute the magnitude of the translational motion, and consequently the distance to the scene.

1 Introduction

An important problem in dynamic scene analysis is the recovery of the motion of a camera relative to the scene, as well as the distance to the scene, usually expressed as a depth map. Cues from stereo and motion can be used to produce such information. In binocular stereo, two images captured from different viewpoints, the so-called left and right images, are used to determine the distances to points on the surfaces of the scene by triangulation. This requires matching identifiable features, typically points, lines, and contours, between the left and right images. In motion vision, an image sequence is used to recover the motion of the camera relative to, and a depth map of, the scene up to a scale factor. To do this, one typically exploits either the correspondences of prominent features across images or the differential motion of the whole, or a large portion, of the image. Each of the motion and stereo problems has its shortcomings.

Feature-based methods require solving the feature detection and correspondence problems, which have proven computationally and conceptually difficult. Furthermore, these methods are sensitive to noise, since information from only a small portion of the image is used. To reduce sensitivity to noise in motion vision, flow-based methods have been proposed that use an estimate of image motion, the so-called optical flow, over a large portion of the image to recover depth and/or camera motion. These methods achieve more robustness at the expense of the additional computation needed to determine an optical flow accurate enough for motion estimation. Recently, direct methods have been proposed; these are more robust, since information over the whole image is employed, and require less computation, since readily computable data (image brightness gradients and time derivatives) are used directly to extract depth and motion information. There is no need to compute optical flow or to establish point correspondences.

The current interest is in the development of robust and computationally efficient methods to recover motion and depth information. Theoretically, motion and stereo information can be combined to obtain more accurate estimates of depth and motion; this is referred to as the motion stereo problem. Some earlier work on this problem is based on extracting depth information from stereo cues, which is then used to estimate motion ([1,3,5,10,11]). Other work assumes that optical flow information for the left and right image sequences is known ([4,6,9]); depth and/or motion information is then estimated from the flow disparity.
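To make the closed-form step described in the abstract concrete: if each image point contributes a linear constraint c_i . t ~= 0 on the translation direction t, with coefficients built from the image gradients and time derivatives, then minimizing the sum of squared constraint errors over unit vectors is a standard Rayleigh-quotient problem whose solution is the smallest-eigenvalue eigenvector. The sketch below shows only this generic structure; the function name and the constraints input are illustrative, and the actual per-pixel coefficients depend on the paper's derivation, which is not reproduced here.

```python
import numpy as np

def translation_direction(constraints):
    """Recover the unit direction of translation t from per-pixel linear
    constraints c_i . t ~= 0 by minimizing sum_i (c_i . t)^2 over ||t|| = 1.

    constraints: (N, 3) array; row i holds the coefficient vector c_i
    derived from the image gradients and time derivatives at pixel i.
    """
    # Accumulate the 3x3 positive semi-definite matrix M = sum_i c_i c_i^T.
    M = constraints.T @ constraints
    # The minimizer of t^T M t over unit vectors is the eigenvector of M
    # with the smallest eigenvalue; eigh returns eigenvalues in ascending order.
    _, eigvecs = np.linalg.eigh(M)
    return eigvecs[:, 0]  # unit norm; the sign is inherently ambiguous
```

Since M is only positive semi-definite, the recovered direction is defined up to sign; the sign would have to be fixed by an additional check such as requiring positive depths, as is common in motion estimation.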
While motion stereo provides more constraints for establishing correspondences between features or for computing better estimates of optical flow, this has been achieved at the expense of more computation. It is desirable to develop a direct method that avoids the computational complexity of these intermediate steps (feature detection, correspondence, and optical flow), yet is more robust than motion vision or stereo techniques used alone. This is the motivation for pursuing the direct approach.
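The second step in the abstract, estimating the scale factor from the average disparity that maximizes the cross-correlation between the left and right images, can be sketched as follows. This is a minimal sketch assuming rectified images and a simple normalized cross-correlation over integer horizontal shifts; the window, normalization, and search range used in the paper may differ, and all names here are illustrative.

```python
import numpy as np

def average_disparity(left, right, max_shift=32):
    """Estimate the average disparity between rectified left/right images
    as the horizontal shift that maximizes normalized cross-correlation."""
    best_d, best_score = 0, -np.inf
    for d in range(max_shift + 1):
        # Overlap the left image with the right image shifted by d pixels.
        a = left[:, d:].astype(np.float64)
        b = right[:, :right.shape[1] - d].astype(np.float64)
        a -= a.mean()
        b -= b.mean()
        score = np.sum(a * b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)
        if score > best_score:
            best_d, best_score = d, score
    return best_d
```

With focal length f and stereo baseline B, an average disparity d corresponds roughly to an average depth of f B / d under a fronto-parallel approximation of the scene; this is the standard stereo relation that fixes the overall scale, and hence the magnitude, of the recovered translation.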
