Abstract

The robustness of a visual servoing task depends mainly on the quality of the visual features extracted from the sensor at each position of the robot. The task function can be described as the regulation, through a control law, of these feature values by acting on the camera velocities. In this paper we propose a new approach that does not depend on matching and tracking results: we replace the classical cost function to be minimized with a new one based on probability distributions and the Bhattacharyya distance. To further improve robustness, the information contained in the observed images is expressed as a combination of orientation features. The new visual features are computed from the distribution of Histogram of Oriented Gradients (HOG) bins; to each bin we assign a random variable representing the gradient vectors along a particular direction. These new features are not used to establish visual motion equations; instead, they are inserted directly into the control loop. A new formulation of the interaction matrix is derived from the optical flow constraint using an interpolation function, leading to more efficient control behaviour and higher positioning accuracy. Experiments demonstrate the robustness of the proposed approach under varying workspace conditions.
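
To make the feature construction concrete, the sketch below shows how a HOG-style orientation histogram can be normalized into a probability distribution and compared with the Bhattacharyya distance D_B(p, q) = -ln(sum_i sqrt(p_i q_i)). This is a minimal illustration assuming grayscale NumPy images and a single global histogram rather than the paper's per-bin random-variable formulation; the names `hog_distribution`, `bhattacharyya_distance`, and the parameter `n_bins` are illustrative, not the authors' implementation.

```python
import numpy as np

def hog_distribution(image, n_bins=9):
    """Bin gradient orientations into a normalized histogram.

    Each bin approximates a probability of observing gradient
    vectors in a given direction, weighted by gradient magnitude.
    """
    gy, gx = np.gradient(image.astype(float))
    magnitude = np.hypot(gx, gy)
    # Unsigned orientations in [0, pi)
    orientation = np.mod(np.arctan2(gy, gx), np.pi)
    hist, _ = np.histogram(orientation, bins=n_bins,
                           range=(0.0, np.pi), weights=magnitude)
    total = hist.sum()
    return hist / total if total > 0 else hist

def bhattacharyya_distance(p, q, eps=1e-12):
    """D_B(p, q) = -ln(sum_i sqrt(p_i * q_i))."""
    bc = np.sum(np.sqrt(p * q))
    return -np.log(max(bc, eps))

# Illustrative usage: the servoing error fed to the control law
# is the distance between the current and desired distributions,
# which vanishes when the camera reaches the desired pose.
# d = bhattacharyya_distance(hog_distribution(current_img),
#                            hog_distribution(desired_img))
```

Because the distance is computed between whole distributions rather than matched point pairs, no explicit feature matching or tracking step is required, which is the property the proposed approach exploits.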
