Abstract

In minimally invasive surgery, the deployment of motion compensation, dynamic active constraints and adaptive intra-operative guidance requires accurate estimation of tissue deformation in 3D. To this end, the use of vision-based techniques is advantageous in that it does not require the integration of additional hardware into existing surgical settings. Deformation can be recovered by tracking features on the surface of the tissue. Existing methods are mostly based on ad hoc machine vision techniques that have generally been developed for rigid scenes, or use pre-determined models with parameters fine-tuned to specific surgical settings. In this work, we propose a novel tracking technique based on a context-specific feature descriptor. The descriptor can adapt to its surroundings and identify the most discriminative information for tracking. The feature descriptor is represented as a decision tree and the tracking process is formulated as a classification problem, for which log-likelihood ratios are used to improve classifier training. A recursive tracking algorithm obtains examples of tissue deformation used to train the classifier. Additional training data is generated by geometric and appearance modelling. Experimental results have shown that the proposed context-specific descriptor is robust to drift, occlusion, and changes in orientation and scale. The performance of the algorithm is compared with existing tracking algorithms and validated with both simulated and in vivo datasets.
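To make the abstract's formulation more concrete, the sketch below illustrates the general idea of tracking as classification with a decision-tree descriptor and log-likelihood ratio scoring. It is not the authors' implementation: the patch size, search window, negative-sampling offsets and the use of scikit-learn are illustrative assumptions, and the descriptor here is a plain intensity patch rather than the adaptive, context-specific descriptor proposed in the paper.

```python
# Minimal sketch (assumptions noted above): track a tissue feature by classifying
# candidate patches as target vs. background with a decision tree, scoring each
# candidate with a log-likelihood ratio. Assumes the target stays far enough from
# the image border that all patches are fully inside the frame.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

HALF = 15   # assumed patch half-size (31x31 patches); illustrative only
EPS = 1e-6

def extract_patch(frame, cx, cy, half=HALF):
    """Flatten a square intensity patch centred at (cx, cy) into a feature vector."""
    return frame[cy - half:cy + half + 1, cx - half:cx + half + 1].ravel().astype(np.float32)

def train_descriptor(frames, track):
    """Fit a decision tree on positive examples (the tracked patch under deformation,
    e.g. obtained by a recursive tracker) and negative examples (nearby background)."""
    X, y = [], []
    for frame, (cx, cy) in zip(frames, track):
        X.append(extract_patch(frame, cx, cy)); y.append(1)            # positive: target appearance
        for dx, dy in [(-3 * HALF, 0), (3 * HALF, 0), (0, -3 * HALF), (0, 3 * HALF)]:
            X.append(extract_patch(frame, cx + dx, cy + dy)); y.append(0)  # negatives: surrounding context
    clf = DecisionTreeClassifier(max_depth=8)
    clf.fit(np.asarray(X), np.asarray(y))
    return clf

def locate(clf, frame, prev_xy, search=20):
    """Score candidate positions in a search window around the previous location with
    log p(target) / p(background) and return the best-scoring position."""
    px, py = prev_xy
    best, best_xy = -np.inf, prev_xy
    for dy in range(-search, search + 1, 2):
        for dx in range(-search, search + 1, 2):
            x = extract_patch(frame, px + dx, py + dy)[None, :]
            p_target = clf.predict_proba(x)[0][1]
            llr = np.log(p_target + EPS) - np.log(1.0 - p_target + EPS)
            if llr > best:
                best, best_xy = llr, (px + dx, py + dy)
    return best_xy
```

In this toy version the negatives are drawn from fixed offsets around the target; the paper's context-specific descriptor instead selects the most discriminative information from the surroundings, and augments training with geometrically and photometrically modelled examples.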


