Abstract

The challenge in uncalibrated visual servoing (VS) control of robot manipulators in unstructured environments is to obtain an appropriate interaction matrix and keep the image features in the field of view (FOV), especially when non-Gaussian noise disturbances exist in the VS process. In this article, a hybrid control algorithm that combines a bidirectional extreme learning machine (B-ELM) with a smooth variable structure filter (SVSF) is proposed to estimate the interaction matrix and handle visibility constraints. For VS, the nonlinear mapping between image features and the interaction matrix is approximated by B-ELM learning. To increase robustness against interference, the SVSF is employed to re-estimate the interaction matrix. A constraint function relating feature coordinates to region boundaries is formulated and added to the velocity controller, which drives image features away from the restricted region and ensures smooth velocities. Since camera and robot model parameters are not required in the control strategy, the servoing task can be fulfilled flexibly and simply. Simulation and experimental results on a conventional 6-degree-of-freedom manipulator verify the effectiveness of the proposed method.
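To make the abstract concrete, below is a minimal sketch of one control step under stated assumptions: a generic caller-supplied regressor stands in for the B-ELM interaction-matrix predictor, the SVSF re-estimation is abstracted as a single correction function, and the visibility constraint is approximated by a quadratic repulsion that activates in a margin band near the image border. All names here (predict_interaction_matrix, refine_estimate, fov_repulsion, margin) are illustrative assumptions, not the paper's implementation.

import numpy as np

def fov_repulsion(features, width, height, margin=40.0, gain=0.5):
    """Push each (u, v) feature back toward the image centre once it enters
    a margin band near the border (illustrative stand-in for the paper's
    constraint function)."""
    features = np.asarray(features, dtype=float)
    push = np.zeros_like(features)
    for i, (u, v) in enumerate(features):
        if u < margin:
            push[i, 0] = gain * (margin - u)                # push right
        elif u > width - margin:
            push[i, 0] = -gain * (u - (width - margin))     # push left
        if v < margin:
            push[i, 1] = gain * (margin - v)                # push down
        elif v > height - margin:
            push[i, 1] = -gain * (v - (height - margin))    # push up
    return push.reshape(-1)

def servo_step(features, desired, predict_interaction_matrix,
               refine_estimate, lam=0.8, width=640, height=480):
    """One uncalibrated IBVS step: learned interaction-matrix prediction,
    a filtering-style refinement, and a constraint-augmented velocity law."""
    features = np.asarray(features, dtype=float)
    desired = np.asarray(desired, dtype=float)
    L_hat = predict_interaction_matrix(features)   # e.g., a trained regressor
    L_hat = refine_estimate(L_hat, features)       # e.g., an SVSF-like correction
    error = (features - desired).reshape(-1)
    # Subtracting the repulsion biases the commanded feature motion toward
    # the image centre whenever a feature enters the margin band.
    error_c = error - fov_repulsion(features, width, height)
    v = -lam * np.linalg.pinv(L_hat) @ error_c
    return v  # 6-DOF camera/end-effector velocity command

Passing a trained B-ELM model's prediction routine as predict_interaction_matrix and the SVSF update as refine_estimate would recover the structure described in the abstract; both are left as caller-supplied functions here because their exact forms are specific to the paper.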

Highlights

  • Visual feedback signals have been used as significant information for robots to tackle positioning or motion control in unstructured environments

  • Depending on whether the visual feedback signal is expressed in 3D Cartesian space coordinates or image-plane coordinates, visual servoing systems can be classified into position-based visual servoing (PBVS), image-based visual servoing (IBVS), and hybrid visual servoing control systems [1]–[3]

  • A method combining the bidirectional extreme learning machine (B-ELM) algorithm with the smooth variable structure filter (SVSF) algorithm is proposed to map the nonlinearity between the visual space of the robot manipulator and the motion space of the end-effector, and a field-of-view (FOV) constraint is adopted to ensure that no feature points are lost during the whole motion

Summary

INTRODUCTION

Visual feedback signals have been used as significant information for robots to tackle positioning or motion control in unstructured environments. In IBVS, the analytical form of the image Jacobian matrix used in the control loop requires accurate calibration of the camera model parameters. Uncalibrated IBVS, which requires neither a scene model nor camera/robot calibration, applies image feature information (e.g., point features, line features, and image moments) to estimate the unknown system dynamics, and the controller is designed based on the identified Jacobian matrix [4], [5]. A method combining the bidirectional extreme learning machine (B-ELM) algorithm with the smooth variable structure filter (SVSF) algorithm is proposed to map the nonlinearity between the visual space of the robot manipulator and the motion space of the end-effector, and the FOV constraint is adopted to ensure that no feature points are lost during the whole motion. Assumption 2: The image feature points observed by the camera are coplanar.
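For context, the standard analytical interaction matrix for a point feature, which calibrated IBVS relies on and which the uncalibrated approach replaces with an online estimate, can be sketched as follows. This is a minimal illustration from the general IBVS literature using normalized image coordinates and an assumed depth Z; the function names are illustrative, not from the paper.

import numpy as np

def point_interaction_matrix(x, y, Z):
    """Classical 2x6 interaction matrix for a normalized image point (x, y)
    at depth Z, relating image-feature velocity to the camera twist
    (vx, vy, vz, wx, wy, wz)."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x**2), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y**2, -x * y, -x],
    ])

def ibvs_velocity(points, desired, depths, lam=0.5):
    """Basic calibrated IBVS law v = -lambda * pinv(L) * (s - s*).
    Uncalibrated schemes replace the stacked L with an estimated matrix."""
    L = np.vstack([point_interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(points, depths)])
    error = (np.asarray(points) - np.asarray(desired)).reshape(-1)
    return -lam * np.linalg.pinv(L) @ error

# Example: four coplanar points at 0.5 m depth driven toward a larger square.
points = [(0.1, 0.1), (-0.1, 0.1), (-0.1, -0.1), (0.1, -0.1)]
desired = [(0.2, 0.2), (-0.2, 0.2), (-0.2, -0.2), (0.2, -0.2)]
v = ibvs_velocity(points, desired, depths=[0.5] * 4)
print(v)  # 6-vector camera velocity command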

B-ELM FUNCTION APPROXIMATION METHOD
SMOOTH VARIABLE STRUCTURE FILTER STATE ESTIMATION
VISIBILITY CONSTRAINTS
HYBRID ALGORITHM OF VISUAL SERVOING SYSTEM WITH FEATURE CONSTRAINTS
CONCLUSION