Abstract

Accurate positioning of the flexible endoscope during clinical procedures is critical for visualization and for the success of surgical interventions. However, manual operation of the endoscope is challenging, especially for less experienced users, because of the visual reasoning and hand-eye coordination it demands. To achieve automatic steering of the endoscope, this work proposes a closed-loop robust control method that allows the endoscope to travel smoothly along the upper respiratory tract. First, an attention neural network is introduced to segment the lumen region in endoscopic images, which indicates the moving direction of the endoscope; the segmentation results are refined by a convolutional conditional random field (ConvCRF). In addition, to steer the flexible endoscope, an active disturbance rejection control (ADRC) strategy is proposed to adjust the orientation of the endoscope tip. An extended state observer (ESO) is designed to estimate environmental disturbances, enabling the controller to compensate for the positioning error. Finally, the segmentation method achieved mIoU, accuracy, and Dice scores of 85.4%, 97.5%, and 91.4%, respectively, outperforming state-of-the-art methods. The performance of the controller was validated on a real robot, demonstrating the ability to automatically steer the flexible endoscope along the upper respiratory tract.
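For context only (this sketch uses standard ADRC notation and is not taken from the paper itself): a generic linear extended state observer for a second-order plant $\ddot{y} = f(y, \dot{y}, w, t) + b_0 u$ treats the lumped disturbance $f$ as an additional state and estimates it as

\begin{aligned}
\dot{\hat{z}}_1 &= \hat{z}_2 + \beta_1 (y - \hat{z}_1), \\
\dot{\hat{z}}_2 &= \hat{z}_3 + \beta_2 (y - \hat{z}_1) + b_0 u, \\
\dot{\hat{z}}_3 &= \beta_3 (y - \hat{z}_1),
\end{aligned}

where $\hat{z}_1$ and $\hat{z}_2$ estimate the output and its derivative, $\hat{z}_3$ estimates the total disturbance $f$, and the observer gains are commonly parameterized by a single bandwidth $\omega_o$ as $\beta_1 = 3\omega_o$, $\beta_2 = 3\omega_o^2$, $\beta_3 = \omega_o^3$. Applying the control $u = (u_0 - \hat{z}_3)/b_0$ cancels the estimated disturbance, which is the general mechanism by which an ADRC controller can compensate for environmental disturbances acting on the endoscope tip; the paper's specific plant model and gain choices are not reproduced here.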
