Abstract

In Minimally Invasive Surgery (MIS), the surgeon reaches the internal organs through small skin incisions, and the operating area is visualized by an endoscopic camera. MIS can be traditional (manually performed) or Robot-Assisted (RAMIS). While the basics of these techniques are similar, the instruments and endoscopic cameras used can differ significantly. Semantic surgical tool segmentation in endoscopic images can be an important step toward pose estimation, task automation and skill assessment in MIS operations. The goal of automated skill assessment solutions is to replace time-consuming, expert-opinion-based assessment techniques. The most widely used dataset for skill assessment is JIGSAWS, which incorporates video and kinematic data. Tool segmentation in this dataset is challenged by varying illumination conditions, low resolution, the lack of ground truth labelling and a background that differs from typical training images, which are usually captured in front of organs. In this work, Deep Neural Network and traditional image processing solutions were examined with the aim of segmenting the surgical tools to derive information for automated technical skill assessment in RAMIS. Four different Deep Neural Network architectures were tested (UNet, TernausNet-11, TernausNet-16, LinkNet-34). First, pre-trained models were examined; then these models were also trained on the JIGSAWS dataset. The best overall result was achieved with TernausNet-11 trained on JIGSAWS, with Intersection over Union (IoU) = 70.96, Dice Coefficient = 79.91 and Accuracy = 97.35, but UNet and LinkNet-34 also achieved good results on videos of specific surgical tasks. Moreover, an efficient ground truth labelling method was proposed for the JIGSAWS dataset with the help of the Optical Flow algorithm. The ground truth dataset and source codes are publicly available on GitHub (https://github.com/dorapapp96/SurgToolSegJIGSAWS.git).
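
For reference, the reported metrics (IoU, Dice Coefficient and pixel Accuracy) can be computed from binary segmentation masks as in the following minimal sketch; this is a generic illustration of the standard definitions, not necessarily the authors' exact evaluation code.

```python
# Minimal sketch (assumed, generic): IoU, Dice and pixel accuracy
# for binary tool-segmentation masks.
import numpy as np

def iou_dice_accuracy(pred: np.ndarray, gt: np.ndarray, eps: float = 1e-7):
    """pred, gt: boolean masks of the same shape (True = tool pixel)."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    intersection = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    iou = intersection / (union + eps)
    dice = 2.0 * intersection / (pred.sum() + gt.sum() + eps)
    accuracy = (pred == gt).mean()  # fraction of correctly classified pixels
    return iou, dice, accuracy
```

The optical-flow-based ground truth labelling idea can be illustrated as propagating a manually annotated mask from one video frame to the next using dense optical flow. The sketch below uses OpenCV's Farnebäck algorithm and backward warping; it is an assumed, simplified version of such a pipeline and not the exact method described in the paper.

```python
# Hedged sketch (assumed pipeline): propagate an annotated binary mask
# from an annotated frame to the following frame via dense optical flow.
import cv2
import numpy as np

def propagate_mask(annotated_gray, next_gray, annotated_mask):
    """annotated_gray, next_gray: uint8 grayscale frames; annotated_mask: uint8 binary mask."""
    # Flow from the new frame back to the annotated frame, so each new-frame
    # pixel can look up where it came from (backward mapping).
    flow_back = cv2.calcOpticalFlowFarneback(
        next_gray, annotated_gray, None,
        pyr_scale=0.5, levels=3, winsize=15,
        iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
    h, w = next_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    map_x = (grid_x + flow_back[..., 0]).astype(np.float32)
    map_y = (grid_y + flow_back[..., 1]).astype(np.float32)
    # Sample the annotated mask at the displaced positions (nearest neighbour
    # keeps the labels binary).
    return cv2.remap(annotated_mask, map_x, map_y, interpolation=cv2.INTER_NEAREST)
```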
