Abstract

A single downward‐looking camera can be used as a high‐precision visual odometry sensor in a wide range of real‐world mobile robotics applications. In particular, a simple and computationally efficient dense alignment approach can take full advantage of the local planarity of floor surfaces to make use of the whole texture available rather than sparse feature points. In this paper, we present and analyze highly practical solutions for autocalibration of such a camera's extrinsic orientation and position relative to a mobile robot's coordinate frame. We show that two degrees of freedom, the out‐of‐plane camera angles, can be autocalibrated in any conditions, and that bringing in a small amount of information from wheel odometry or another independent motion source allows rapid, full, and accurate six degree‐of‐freedom calibration. Of particular practical interest is the result that this can be achieved to almost the same level even without wheel odometry and based only on widely applicable assumptions about nonholonomic robot motion and the forward/backward direction of its movement. We show the accurate, rapid, and robust performance of our autocalibration techniques for varied camera positions over a range of low‐textured real surfaces, both indoors and outdoors.
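As illustrative background (the notation below is standard multi-view geometry and not quoted from the paper), the local planarity assumption means that two camera views of the floor are related by a plane-induced homography, and dense alignment estimates the inter-frame motion, and during autocalibration the camera-to-robot extrinsics that generate it, by minimising a whole-image photometric cost over that warp:

% Sketch only: plane-induced homography commonly used for dense ground-plane alignment.
% Symbols (K, R, t, n, d, I_1, I_2) are illustrative notation, not the paper's own.
\[
  \mathbf{H} \;=\; \mathbf{K}\!\left(\mathbf{R} - \frac{\mathbf{t}\,\mathbf{n}^{\top}}{d}\right)\!\mathbf{K}^{-1},
  \qquad
  E \;=\; \sum_{\mathbf{u}} \bigl( I_2(\pi(\mathbf{H}\,\tilde{\mathbf{u}})) - I_1(\mathbf{u}) \bigr)^{2},
\]
where \(\mathbf{K}\) is the camera intrinsic matrix, \((\mathbf{R},\mathbf{t})\) the relative camera motion, \(\mathbf{n}\) the unit normal of the floor plane, \(d\) its distance from the first camera centre, \(\tilde{\mathbf{u}}\) a homogeneous pixel coordinate, and \(\pi(\cdot)\) dehomogenisation. Because the cost uses every pixel of the floor texture rather than sparse feature points, it remains well conditioned on the low-textured surfaces discussed in the abstract.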
