Abstract

Visual servoing has been widely employed in robotic control to increase the flexibility and precision of a robotic arm. When the end-effector of the robotic arm must be moved to a spatial point whose coordinates are unknown, conventional visual servoing control methods have difficulty performing the task. The present work characterizes the space-constraint challenges in a visual servoing system by introducing an assembly node, and then presents a two-stage visual servoing control approach based on perspective transformation. First, a virtual image plane is constructed using a homography matrix derived from calibration, and the assembly node, along with the other objects, is projected into that plane. Second, the controller drives the robotic arm by tracking the projections in the virtual image plane and adjusting the position and attitude of the workpiece accordingly. Three simple image features are combined into a composite image feature, and an active disturbance rejection controller (ADRC) is established to improve the robotic arm’s motion sensitivity. Real-time simulations and experiments employing a robotic vision system in an eye-to-hand configuration validate the effectiveness of the presented method. The results show that the robotic arm can move the workpiece to the desired position without using coordinates.
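The core geometric step in the first stage is projecting image points into the virtual image plane through a homography. The following sketch illustrates that operation with an illustrative, hypothetical homography matrix; in the actual method, the matrix would come from the calibration procedure described above.

```python
import numpy as np

# Hypothetical homography matrix mapping the camera's image plane to the
# virtual image plane. These values are illustrative only; in the paper's
# method the matrix is derived from calibration.
H = np.array([[1.02,  0.01,  5.0],
              [-0.02, 0.98, -3.0],
              [1e-4,  2e-4,  1.0]])

def project_to_virtual_plane(H, point):
    """Map a 2-D image point into the virtual image plane via homography."""
    x, y = point
    p = H @ np.array([x, y, 1.0])   # lift to homogeneous coordinates
    return p[:2] / p[2]             # dehomogenize back to 2-D

# Project a pixel location (e.g. a detected feature on the workpiece).
virtual_pt = project_to_virtual_plane(H, (320.0, 240.0))
```

The projected coordinates of the assembly node and the workpiece features, obtained this way, are what the second-stage controller tracks.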
