Abstract

A visual servoing algorithm is proposed for a robot with a camera in the hand to track a moving object in terms of image features and their variations, where fuzzy logic and fuzzy-neural networks are employed to learn a feature-Jacobian-based kinematic control law. Specifically, novel image features are proposed by employing a perspective-projection viewing model to estimate the relative pitching and yawing angles. Such perspective-projection-based features do not depend on the relative distance between the object and the camera. Desired feature trajectories for learning the visually guided line-of-sight robot motion are obtained by measuring the features with the camera in the hand, not over the entire workspace, but along a single linear path traversed by the robot under a commercially provided linear-motion function. Control actions of the camera that follow these desired feature trajectories are then approximated by fuzzy-neural networks. To show the validity of the proposed algorithm, experimental results are presented using a four-axis SCARA robot equipped with a B/W CCD camera.
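
For illustration only, the sketch below shows the two ingredients the abstract describes: line-of-sight pitching and yawing angles computed from image-plane coordinates under a pinhole perspective model (these depend only on the viewing direction, not on the object-camera distance), and a simple zero-order Takagi-Sugeno fuzzy map from a feature error to a camera rate command. The focal length, membership parameters, and rule consequents here are placeholder assumptions; in the paper the control mapping is learned by fuzzy-neural networks from the desired feature trajectories, which is elided here.

```python
import numpy as np

def los_angles(u, v, f):
    """Pitch/yaw of the line of sight toward an image point (u, v),
    assuming a centered principal point and focal length f in pixels.
    These angles depend only on the image-plane direction, not on the
    object-camera distance (the depth-invariance noted in the abstract)."""
    yaw = np.arctan2(u, f)
    pitch = np.arctan2(v, f)
    return pitch, yaw

class TSFuzzyController:
    """Zero-order Takagi-Sugeno fuzzy map from a scalar feature error to a
    camera rate command. Centers, widths, and consequents are fixed
    placeholders; a fuzzy-neural network would tune them from data."""
    def __init__(self, centers, widths, consequents):
        self.c = np.asarray(centers, float)
        self.s = np.asarray(widths, float)
        self.w = np.asarray(consequents, float)

    def __call__(self, e):
        mu = np.exp(-0.5 * ((e - self.c) / self.s) ** 2)  # Gaussian memberships
        return float(mu @ self.w / (mu.sum() + 1e-12))    # weighted-average defuzzification

if __name__ == "__main__":
    f = 800.0  # hypothetical focal length in pixels
    pitch, yaw = los_angles(40.0, -25.0, f)
    ctrl = TSFuzzyController(centers=[-0.2, 0.0, 0.2],
                             widths=[0.1, 0.1, 0.1],
                             consequents=[-1.0, 0.0, 1.0])
    # Drive the yaw error toward zero, i.e., keep the target on the optical axis.
    yaw_rate_cmd = ctrl(yaw - 0.0)
    print(f"pitch={pitch:.4f} rad, yaw={yaw:.4f} rad, yaw-rate cmd={yaw_rate_cmd:.4f}")
```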
