Abstract

Recognizing surgical tasks is a crucial step toward automating training in robot-assisted surgery. In this work, we propose and develop a classification framework for surgical task recognition built on three components: Dynamic Time Warping (DTW), Procrustes analysis (PA), and the fuzzy k-nearest neighbor (FkNN) algorithm. First, DTW processes multi-channel motion trajectories of different lengths by stretching and compressing both signals so that their lengths become identical. Second, Procrustes analysis provides a distance measure between two aligned sequences based on shape similarity transformations: rotation, reflection, scaling, and translation. Finally, the fuzzy k-nearest neighbor algorithm distinguishes between tasks by assigning fuzzy class memberships based on these distances. We evaluated the framework on raw kinematic data from a real surgical robotic dataset and validated the proposed model using Leave-One-Supertrial-Out (LOSO) and Leave-One-User-Out (LOUO) cross-validation schemes. Our results show improvements in classifying three robot-assisted minimally invasive surgery (RMIS) tasks: suturing, needle passing, and knot tying.
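
The following is a minimal sketch of the pipeline described above, assuming a plain dynamic-programming DTW, SciPy's Procrustes disparity as the shape distance, and a Keller-style fuzzy k-NN with crisp training labels. The toy data, the neighborhood size k, and the fuzzifier m are illustrative assumptions, not values from the paper.

```python
# Sketch: DTW alignment -> Procrustes shape distance -> fuzzy k-NN classification.
import numpy as np
from scipy.spatial import procrustes


def dtw_align(a, b):
    """Align two multi-channel trajectories (T1 x D, T2 x D) with dynamic
    time warping and return equal-length, warped copies of both."""
    t1, t2 = len(a), len(b)
    cost = np.full((t1 + 1, t2 + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, t1 + 1):
        for j in range(1, t2 + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
    # Backtrack the optimal warping path.
    path, i, j = [], t1, t2
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    path.reverse()
    idx_a, idx_b = zip(*path)
    return a[list(idx_a)], b[list(idx_b)]


def trajectory_distance(a, b):
    """DTW-align the trajectories, then measure shape dissimilarity with
    Procrustes analysis (translation, scaling, rotation/reflection removed)."""
    wa, wb = dtw_align(a, b)
    _, _, disparity = procrustes(wa, wb)
    return disparity


def fuzzy_knn_memberships(distances, train_labels, n_classes, k=5, m=2.0):
    """Keller-style fuzzy k-NN: class memberships for one test sample from
    its distances to the training samples (crisp training labels assumed)."""
    nn = np.argsort(distances)[:k]
    weights = 1.0 / np.maximum(distances[nn], 1e-12) ** (2.0 / (m - 1.0))
    u = np.zeros(n_classes)
    for idx, w in zip(nn, weights):
        u[train_labels[idx]] += w
    return u / u.sum()


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy stand-ins for variable-length, multi-channel kinematic trajectories.
    train = [rng.standard_normal((int(rng.integers(80, 120)), 3)) for _ in range(6)]
    train_labels = np.array([0, 0, 0, 1, 1, 1])
    test = rng.standard_normal((100, 3))

    d = np.array([trajectory_distance(test, tr) for tr in train])
    memberships = fuzzy_knn_memberships(d, train_labels, n_classes=2, k=3)
    print("fuzzy memberships:", memberships, "-> predicted class", memberships.argmax())
```

The fuzzy memberships (rather than a hard vote) let the classifier express how strongly a trial resembles each task, which is where the reported gains over a crisp k-NN decision would come from.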
