Abstract

In most visual servoing methods for wheeled mobile robots, the camera is assumed to be located at the center of the robot, whereas mounting the camera some distance away from the robot center is often more convenient in practice. In this paper, to handle the challenge incurred by uncalibrated translational camera-to-robot parameters, a novel visual servoing scheme is designed for wheeled mobile robots to track a given trajectory, which is defined by a sequence of previously recorded images. The trajectory tracking errors are defined with respect to both the current and desired poses of the robot, and the open-loop error system is obtained by analyzing the kinematic model of the vision-robot system. Subsequently, an adaptive visual servo tracking controller is carefully designed to compensate for both the unknown extrinsic parameters and the unknown visual depth, while respecting the nonholonomic motion constraint of the mobile robot. The tracking errors are rigorously proven to converge to zero asymptotically by employing Lyapunov techniques and Barbalat's lemma. Both simulations and comparative experiments are conducted to demonstrate that the proposed strategy drives the robot to track the desired trajectory efficiently, even when the translational camera-to-robot parameters are unknown.
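
As background for the kinematic setup the abstract refers to, the sketch below shows how trajectory tracking errors for a nonholonomic unicycle-type robot are commonly expressed in the robot frame and driven to zero by a classical kinematic law (Kanayama-style). This is a generic illustration under standard assumptions, not the paper's adaptive uncalibrated controller: the gains K_X, K_Y, K_THETA and the circular reference trajectory are hypothetical, and the unknown camera-to-robot translation and visual depth that the paper estimates online are not modeled here.

```python
import numpy as np

# Hypothetical gains and simulation settings (not from the paper).
K_X, K_Y, K_THETA = 1.0, 8.0, 4.0
DT, STEPS = 0.02, 1500

def tracking_errors(pose, pose_d):
    """Express the error between desired and current pose in the robot frame."""
    x, y, th = pose
    xd, yd, thd = pose_d
    c, s = np.cos(th), np.sin(th)
    ex = c * (xd - x) + s * (yd - y)
    ey = -s * (xd - x) + c * (yd - y)
    eth = np.arctan2(np.sin(thd - th), np.cos(thd - th))  # wrap to (-pi, pi]
    return ex, ey, eth

def kanayama_control(errors, vd, wd):
    """Classical kinematic tracking law; shown only to illustrate the error
    definition, it is not the adaptive law proposed in the paper."""
    ex, ey, eth = errors
    v = vd * np.cos(eth) + K_X * ex
    w = wd + vd * (K_Y * ey + K_THETA * np.sin(eth))
    return v, w

def step(pose, v, w, dt):
    """Integrate the nonholonomic unicycle kinematics by one Euler step."""
    x, y, th = pose
    return np.array([x + v * np.cos(th) * dt,
                     y + v * np.sin(th) * dt,
                     th + w * dt])

# Desired trajectory: a circle traversed at constant linear/angular speed.
vd, wd = 0.5, 0.2
pose_d = np.array([0.0, 0.0, 0.0])
pose = np.array([-0.5, 0.8, 0.6])   # robot starts off the trajectory

for _ in range(STEPS):
    e = tracking_errors(pose, pose_d)
    v, w = kanayama_control(e, vd, wd)
    pose = step(pose, v, w, DT)
    pose_d = step(pose_d, vd, wd, DT)

print("final tracking errors:", tracking_errors(pose, pose_d))
```

Running the loop shows the frame-based error coordinates decaying toward zero; the paper's contribution replaces the known-parameter control law above with an adaptive one that also estimates the camera-to-robot translation and visual depth.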
