Abstract
In this paper, we address the problem of ego-motion estimation for a monocular camera undergoing arbitrary translation and rotation. This is one of the central problems in the application of computer vision to mobile robots, and it is equivalent to determining the 3D motion parameters of a camera from the image sequence it captures over time. A robotic system can be guided reliably only if it has a valid means of obtaining information about its own motion, so an accurate estimate of ego-motion is highly useful for robot navigation. The method we propose is based solely on the spatio-temporal image derivatives of an image sequence, that is, on the normal flow: the projection of the optical flow onto the direction of the image gradient. The normal flow is the component of the image flow that can be estimated from local measurements alone, and its computation requires no special assumptions about the scene structure. The method is therefore less demanding than optical-flow-based methods with respect to the observed scene and, because it imposes no additional assumptions on the scene, can be applied widely in the real world. First, we coarsely determine a range for each rotational parameter that is guaranteed to contain its true value. Second, we search for the true value of each rotational parameter within the corresponding range. Once the rotational parameters are determined, the location of the Focus of Expansion (FOE), which gives the direction of the camera's linear velocity, is obtained simultaneously. We have conducted extensive experiments with synthetic data and real images to verify the accuracy and robustness of the proposed method.
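To make the input quantity concrete, the sketch below shows the standard way normal flow is obtained from spatio-temporal image derivatives: under brightness constancy, only the flow component along the image gradient is locally observable, with signed magnitude -I_t/|∇I|. This is an illustrative implementation using NumPy finite differences, not the authors' exact pipeline; the function name, derivative scheme, and threshold are assumptions made for the example.

```python
import numpy as np

def normal_flow(frame0, frame1, eps=1e-6):
    """Illustrative normal-flow estimate from two consecutive grayscale frames.

    The brightness constancy constraint I_x*u + I_y*v + I_t = 0 determines
    only the flow component along the image gradient (the aperture problem);
    that component is the normal flow, with signed magnitude -I_t / |grad I|
    along the unit gradient direction.
    """
    I0 = frame0.astype(np.float64)
    I1 = frame1.astype(np.float64)

    # Spatial derivatives (central differences) and temporal derivative.
    Iy, Ix = np.gradient(I0)
    It = I1 - I0

    grad_mag = np.sqrt(Ix**2 + Iy**2)
    valid = grad_mag > eps          # keep only points with a usable gradient

    # Signed magnitude of the flow along the gradient direction.
    u_n = np.zeros_like(I0)
    u_n[valid] = -It[valid] / grad_mag[valid]

    # Normal-flow vector field: magnitude times the unit gradient direction.
    nx = np.zeros_like(I0)
    ny = np.zeros_like(I0)
    nx[valid] = u_n[valid] * Ix[valid] / grad_mag[valid]
    ny[valid] = u_n[valid] * Iy[valid] / grad_mag[valid]
    return nx, ny, valid
```

Because each measurement comes from a single pixel neighborhood, no smoothness or rigidity assumption about the scene is needed at this stage; the valid mask simply discards pixels whose gradient is too weak to define a direction.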