Abstract
Background and objective: Automatic needle tip detection is important for the real-time ultrasound (US) images used to guide interventional needle puncture procedures in clinical settings. However, tip detection in US images is challenging because of spatial indiscernibility: severe background interference combined with tip characteristics such as small size, grayscale appearance, and indistinctive appearance patterns.
Methods: To achieve precise tip detection in US images despite spatial indiscernibility, a novel multi-keyframe motion-aware framework called TipDet is proposed. It identifies tips from their short-term spatial-temporal pattern and long-term motion pattern. In TipDet, first, an adaptive keyframe model (AKM) decides whether a frame is informative enough to serve as a keyframe for long-term motion pattern learning. Second, candidate tips are detected by a two-stream backbone (TSB) based on their short-term spatial-temporal pattern. Third, to identify the true tip among the candidates, a method for learning the long-term motion pattern of tips is proposed, built on the proposed optical-flow-aware multi-head cross-attention (OFA-MHCA).
Results: On a clinical human puncture dataset of 4195 B-mode images, the experimental results show that TipDet achieves precise tip detection despite spatial indiscernibility, reaching 78.7% AP0.1:0.5, an 8.9% improvement over the base detector, at approximately 20 FPS. Moreover, a tip localization error of 1.3 ± 0.6% is achieved, outperforming the existing method.
Conclusions: The proposed TipDet can facilitate wider and easier application of US-guided interventional procedures by providing robust and precise needle tip localization. The code and data are available at https://github.com/ResonWang/TipDet.
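To make the three-stage pipeline described above concrete, the following PyTorch sketch outlines one possible arrangement of the adaptive keyframe rule, the two-stream backbone, and the optical-flow-aware cross-attention. All module names, tensor shapes, and the similarity-based keyframe test are illustrative assumptions, not the authors' released implementation; refer to the linked repository for the actual code.

```python
# Minimal sketch of a TipDet-style inference pipeline (assumptions throughout).
import torch
import torch.nn as nn


class OFAMultiHeadCrossAttention(nn.Module):
    """Candidate-tip features (queries) attend to keyframe features whose
    keys/values are modulated by optical-flow features (assumed design)."""

    def __init__(self, dim=64, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.flow_proj = nn.Linear(dim, dim)  # inject flow cues into keys/values

    def forward(self, cand_feat, key_feat, flow_feat):
        kv = key_feat + self.flow_proj(flow_feat)   # flow-aware keys/values
        out, _ = self.attn(cand_feat, kv, kv)       # long-term motion pattern
        return out


class TipDetSketch(nn.Module):
    def __init__(self, dim=64):
        super().__init__()
        # Two-stream backbone: one stream for the B-mode frame, one for a
        # short-term motion cue (e.g., frame difference or flow image).
        self.app_stream = nn.Conv2d(1, dim, 3, padding=1)
        self.motion_stream = nn.Conv2d(1, dim, 3, padding=1)
        self.ofa_mhca = OFAMultiHeadCrossAttention(dim)
        self.score_head = nn.Linear(dim, 1)         # true-tip confidence

    def is_keyframe(self, frame, prev_key, thresh=0.05):
        # Adaptive keyframe rule (assumed): keep a frame only if it differs
        # enough from the previous keyframe to be informative.
        return prev_key is None or (frame - prev_key).abs().mean() > thresh

    def forward(self, frame, motion, keyframe_feats, flow_feats):
        feat = self.app_stream(frame) + self.motion_stream(motion)  # B,C,H,W
        cand = feat.flatten(2).transpose(1, 2)                      # candidate tokens
        refined = self.ofa_mhca(cand, keyframe_feats, flow_feats)
        return self.score_head(refined).squeeze(-1)                 # per-token score


if __name__ == "__main__":
    model = TipDetSketch()
    frame = torch.randn(1, 1, 32, 32)
    motion = torch.randn(1, 1, 32, 32)
    key_mem = torch.randn(1, 32 * 32, 64)   # stored keyframe features
    flow_mem = torch.randn(1, 32 * 32, 64)  # optical-flow features of keyframes
    print(model(frame, motion, key_mem, flow_mem).shape)  # torch.Size([1, 1024])
```

The sketch only mirrors the data flow stated in the abstract: keyframes feed a feature memory, the two-stream backbone proposes candidates, and flow-aware cross-attention scores them; the actual AKM criterion, backbone, and detection head in TipDet may differ substantially.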