Abstract

In this paper, a visual servoing approach is developed for the trajectory tracking control and depth estimation of a mobile robot without a priori knowledge of the desired velocities. By exploiting the multiple images captured by the on-board camera, the current and desired poses (i.e., scaled translation and orientation) of the mobile robot are reconstructed to define the system errors. Then, an adaptive time-varying controller is proposed to achieve the trajectory tracking task in the presence of the nonholonomic constraint and unknown depth parameters. Most previous works require measurement of the desired velocity information to facilitate the controller design, leading to tedious offline computation. In this paper, to eliminate this requirement, the desired velocities are estimated in real time by a reduced-order observer. Moreover, an augmented update law is designed to compensate for the unknown depth parameters and to identify the inverse depth constant. A Lyapunov-based method is employed to prove that the proposed controller achieves asymptotic tracking and that the inverse depth estimate converges to its actual value provided that a persistent excitation condition is satisfied. Subsequently, a robust data-driven algorithm is introduced to ensure convergence of the inverse depth estimate under a relaxed finite excitation condition. Simulation and experimental results are provided to demonstrate the effectiveness of the proposed approach.
