Abstract

Surgical tool detection has recently become an active research area. It underpins a range of advanced surgical support functions, such as image-guided surgical navigation and establishing safety zones between surgical tools and sensitive tissues. Previous methods rely on two types of information: tool-locating signals and visual features. Collecting tool-locating signals requires additional hardware, while vision-based methods train their detection models on strong annotations (e.g., bounding boxes), which are rare and expensive to acquire in the field of surgical image understanding. In this paper, we propose a Pseudo Supervised surgical Tool detection (PSTD) framework, which performs explicit detection refinement through three levels of associated measures (pseudo bounding box generation, real box regression, and weighted boxes fusion) in a weakly supervised manner. On top of PSTD, we develop a Bi-directional Adaption Weighting (BAW) mechanism in the tool classifier that mines contextual information by creating competition or cooperation relationships between channels. Using only image-level tool category labels, the proposed method achieves state-of-the-art results with 87.0% mAP on a mainstream surgical image dataset, Cholec80.
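As a rough illustration of the last refinement step named above, the sketch below shows a generic weighted boxes fusion: overlapping candidate boxes are clustered by IoU, and each cluster is replaced by a single box whose coordinates are a confidence-weighted average. This is a minimal, generic sketch, not the paper's exact procedure; the IoU threshold, the box layout `[x1, y1, x2, y2]`, and the fused-score rule are assumptions for illustration only.

```python
# Minimal, generic weighted-boxes-fusion sketch (not the paper's exact code).
# Boxes are [x1, y1, x2, y2]; scores are detection confidences in [0, 1].
import numpy as np

def iou(box_a, box_b):
    """Intersection-over-union of two [x1, y1, x2, y2] boxes."""
    x1 = max(box_a[0], box_b[0]); y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2]); y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def weighted_boxes_fusion(boxes, scores, iou_thr=0.55):
    """Cluster overlapping boxes and replace each cluster with a
    confidence-weighted average box (iou_thr is an assumed hyperparameter)."""
    order = np.argsort(scores)[::-1]           # process high-confidence boxes first
    clusters = []                              # each cluster holds its members and fused box
    for i in order:
        box, score = np.asarray(boxes[i], dtype=float), float(scores[i])
        for cluster in clusters:
            # Compare against the cluster's current fused box.
            if iou(cluster["fused"], box) >= iou_thr:
                cluster["members"].append((box, score))
                w = np.array([s for _, s in cluster["members"]])
                b = np.stack([b for b, _ in cluster["members"]])
                cluster["fused"] = (w[:, None] * b).sum(0) / w.sum()
                cluster["score"] = float(w.mean())
                break
        else:
            clusters.append({"fused": box, "score": score,
                             "members": [(box, score)]})
    fused_boxes = np.stack([c["fused"] for c in clusters])
    fused_scores = np.array([c["score"] for c in clusters])
    return fused_boxes, fused_scores

if __name__ == "__main__":
    boxes = [[10, 10, 50, 50], [12, 8, 52, 48], [200, 200, 240, 260]]
    scores = [0.9, 0.6, 0.8]
    fused, conf = weighted_boxes_fusion(boxes, scores)
    print(fused)   # first two boxes merge into one confidence-weighted box
    print(conf)
```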
