Abstract

In this paper, we present and implement a hybrid approach to robust visual servoing for autonomous ground vehicles. A vision-integrated model for nonholonomic mobile robots is derived that, by assuming planar motion, obviates the need for actual depth data in image-based visual servoing. A fractional-order sliding-mode controller is designed in which the parameters of the sliding surface are adapted in real time; the adaptive laws are derived so as to ensure finite-time convergence. An optical-flow-based heading restoration law is designed to handle severe external perturbations, i.e., those that cause the visual marker to leave the camera's field of view. This restoration law is combined with reinforcement learning to solve the visibility problem, enabling the robot to reach the home location even when the visual marker momentarily disappears from the camera's field of view due to external disturbances. The proposed algorithm is validated in real time on Pioneer P3-DX robots through perturbation studies, and both simulation and experimental results demonstrate its efficacy.
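The abstract does not give the controller's equations, but the general shape of a fractional-order sliding-mode law can be sketched. The snippet below is an illustrative sketch only, not the authors' controller: it approximates the fractional derivative with a Grünwald–Letnikov sum and forms a hypothetical surface s = ė + λ·D^α(e) with a boundary-layer reaching law; the names `lam`, `alpha`, `k_gain`, and `eps` are assumed parameters, and the real-time adaptation of the surface parameters described in the paper is not modeled here.

```python
import numpy as np

def gl_weights(alpha, n):
    # Grünwald–Letnikov binomial weights w_k = (-1)^k * C(alpha, k),
    # computed with the standard recursion w_k = w_{k-1} * (k - 1 - alpha) / k.
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    return w

def gl_derivative(x_hist, alpha, dt):
    # Approximate D^alpha x at the latest sample from the stored history
    # x_hist (oldest first), using the short-memory GL sum.
    w = gl_weights(alpha, len(x_hist))
    return float(np.dot(w, x_hist[::-1])) / dt**alpha

def sliding_surface(e_hist, e_dot, lam, alpha, dt):
    # Hypothetical fractional-order surface: s = e_dot + lam * D^alpha(e).
    return e_dot + lam * gl_derivative(e_hist, alpha, dt)

def smc_control(s, k_gain, eps=0.05):
    # Saturated (boundary-layer) reaching law to attenuate chattering;
    # a pure sign(s) term would be the classical discontinuous choice.
    return -k_gain * float(np.clip(s / eps, -1.0, 1.0))
```

For `alpha = 1` the GL weights reduce to (1, -1, 0, ...), so `gl_derivative` collapses to the ordinary backward difference, which is a convenient sanity check on the recursion.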
