Abstract

The authors present a vision module that can guide an eye-in-hand robot through general servoing and tracking problems using off-the-shelf image-processing equipment. The vision module uses the location of binary image features from a camera on the robot's end-effector to control the position and one degree of orientation of the robot manipulator. A unique feature-based trajectory generator provides smooth motion between the actual image features and the desired image features even with asynchronous and discontinuous vision updates. By performing the trajectory generation in image feature space, image-processing constraints such as the feature extraction time can be accounted for when determining the appropriate segment and acceleration times of the trajectory. Experimental results of a PUMA robot tracking objects with vision feedback are discussed.
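The central idea, generating the reference trajectory directly in image-feature space so that vision constraints such as the feature-extraction time bound the segment timing, can be illustrated with a minimal sketch. This is not the authors' implementation; the function name, the cosine-blended profile, and the parameters (`t_extract`, `t_accel`, `rate_limit`) are illustrative assumptions only.

```python
import numpy as np

def feature_space_trajectory(f_current, f_desired, t_extract, t_accel, rate_limit):
    """Hypothetical sketch of trajectory generation in image-feature space.

    f_current, f_desired : current and desired image-feature vectors (pixels).
    t_extract            : feature-extraction time per vision update (s).
    t_accel              : acceleration (blend) time of the profile (s).
    rate_limit           : maximum allowed feature velocity (pixels/s).
    Returns sample times and interpolated feature setpoints.
    """
    f_current = np.asarray(f_current, dtype=float)
    f_desired = np.asarray(f_desired, dtype=float)

    # Segment time: long enough to respect the feature-rate limit and never
    # shorter than the time the vision system needs to deliver a new update.
    distance = np.linalg.norm(f_desired - f_current)
    t_segment = max(distance / rate_limit + t_accel, t_extract)

    # Smooth (cosine-blended) interpolation between current and desired features,
    # so motion stays continuous even if a vision update arrives late.
    ts = np.linspace(0.0, t_segment, num=max(int(t_segment * 100), 2))
    s = 0.5 * (1.0 - np.cos(np.pi * np.clip(ts / t_segment, 0.0, 1.0)))
    setpoints = f_current + s[:, None] * (f_desired - f_current)
    return ts, setpoints


# Example: drive two binary-image feature points toward their desired locations.
ts, fs = feature_space_trajectory(
    f_current=[120.0, 80.0, 200.0, 95.0],    # (u1, v1, u2, v2) in pixels
    f_desired=[128.0, 128.0, 192.0, 128.0],
    t_extract=0.05,                          # assumed 50 ms per vision update
    t_accel=0.1,
    rate_limit=100.0,
)
print(fs[0], fs[-1])                         # starts at current, ends at desired
```

In a full system, the interpolated feature setpoints would be mapped to manipulator motion through an image-feature Jacobian or an equivalent camera-robot model; that step is omitted here.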
