Abstract
We present a robust strategy for docking a mobile robot in close proximity to an upright surface using optical flow field divergence and proportional feedback control. Unlike previous approaches, we achieve this without explicit segmentation of image features, and we use full gradient-based optical flow estimation (i.e., no affine flow models) in the flow computation. A key contribution is an algorithm that computes the flow field divergence, or time-to-contact, in a manner that is robust to small rotations of the robot during ego-motion. This is achieved by tracking the focus of expansion of the flow field and using it to compensate for the image rotation induced by ego-motion. The control law is a simple proportional feedback on the unfiltered flow field divergence, applied to a dynamic vehicle model. A closed-loop stability analysis of docking under the proposed feedback is provided. The performance of the flow field divergence algorithm is demonstrated on offboard natural image sequences, and the performance of the closed-loop system is demonstrated experimentally by controlling a mobile robot approaching a wall.
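To make the quantities in the abstract concrete, the sketch below is a minimal illustration (not the authors' implementation) of how a mean flow field divergence can be estimated from a dense optical flow field and fed into a proportional docking command. The function names, the gain, the reference divergence, and the assumption that the focus of expansion lies at the image centre are all hypothetical; the paper's rotation-compensation step via focus-of-expansion tracking is not reproduced here.

```python
import numpy as np

def flow_divergence(u, v, dx=1.0, dy=1.0):
    """Estimate the mean divergence of a dense optical flow field.

    u, v : 2-D arrays of horizontal/vertical flow (pixels/frame).
    For pure translation toward a fronto-parallel surface, the
    divergence is 2 / time-to-contact (in frames).
    """
    du_dx = np.gradient(u, dx, axis=1)   # partial of u w.r.t. x
    dv_dy = np.gradient(v, dy, axis=0)   # partial of v w.r.t. y
    return float(np.mean(du_dx + dv_dy))

def proportional_docking_command(div_measured, div_ref, k_p=0.5):
    """Proportional feedback on the (unfiltered) divergence signal.

    Returns a forward-velocity correction that slows the vehicle as the
    measured divergence rises above the reference, i.e. as the surface
    approaches. k_p and div_ref are illustrative values only.
    """
    return -k_p * (div_measured - div_ref)

# Synthetic expanding flow field with a time-to-contact of 20 frames,
# focus of expansion assumed at the image centre.
h, w = 120, 160
ys, xs = np.mgrid[0:h, 0:w]
xc, yc = (w - 1) / 2.0, (h - 1) / 2.0
tau = 20.0
u = (xs - xc) / tau
v = (ys - yc) / tau

div = flow_divergence(u, v)              # approximately 2 / tau = 0.1
ttc = 2.0 / div if div > 1e-9 else np.inf
cmd = proportional_docking_command(div, div_ref=0.05)
print(f"divergence={div:.4f}, time-to-contact~{ttc:.1f} frames, command={cmd:+.3f}")
```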