Abstract

A practical CTR kinematics model must account for torsion, shear, friction, inter-tube interaction forces, and nonlinear constitutive behavior, which makes the model extremely complicated and computationally expensive. This paper proposes a model-free, uncalibrated approach to eye-in-hand visual servoing (EiH-VS) for concentric-tube robots (CTRs) in minimally invasive surgery. An adaptive controller is designed for fast convergence. An image Jacobian is numerically estimated to map image deviations to robot joint variables. The image Jacobian is estimated online using an unscented Kalman filter (UKF), without prior knowledge of the CTR kinematic model or hand-eye calibration. A customized observation function is constructed to describe the mapping between the state and measurement vectors. Simulation and experimental results validate the effectiveness and advantages of the proposed method.
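
The sketch below illustrates the general idea of UKF-based online image-Jacobian estimation for uncalibrated visual servoing; it is not the authors' implementation. The UKF state is the flattened Jacobian, the observation model predicts the image-feature change produced by a joint increment, and a simple proportional-style law (standing in for the paper's adaptive controller) computes the next joint step. The library choice (filterpy), dimensions, gains, and noise levels are illustrative assumptions.

```python
import numpy as np
from filterpy.kalman import UnscentedKalmanFilter, MerweScaledSigmaPoints

N_FEAT, N_JOINT = 4, 3              # e.g. 2 image points (u, v) and 3 CTR joints (assumed)
DIM_X = N_FEAT * N_JOINT            # state = flattened image-Jacobian entries

def fx(x, dt):
    """Process model: the Jacobian is assumed to drift slowly (random walk)."""
    return x

def hx(x, dq=None):
    """Observation model: predicted image-feature change for joint step dq."""
    J = x.reshape(N_FEAT, N_JOINT)
    return J @ dq

points = MerweScaledSigmaPoints(n=DIM_X, alpha=1e-3, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=DIM_X, dim_z=N_FEAT, dt=1.0,
                            fx=fx, hx=hx, points=points)
ukf.x = 0.1 * np.random.randn(DIM_X)    # rough initial Jacobian guess
ukf.P *= 1.0                            # initial state uncertainty
ukf.Q = np.eye(DIM_X) * 1e-4            # slow Jacobian drift
ukf.R = np.eye(N_FEAT) * 1e-2           # image measurement noise

def servo_step(s, s_desired, gain=0.5):
    """One servoing step: joint increment that reduces the image error."""
    J = ukf.x.reshape(N_FEAT, N_JOINT)
    return -gain * np.linalg.pinv(J) @ (s - s_desired)

# After executing dq on the robot and measuring the feature change delta_s:
#   ukf.predict()
#   ukf.update(delta_s, dq=dq)
```

Because the Jacobian estimate is refined at every iteration from observed image-feature changes alone, no kinematic model or hand-eye calibration enters the loop, which is the property the abstract highlights.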
