Abstract
We present a novel real-time technique for dynamic localization of the needle tip in 2D ultrasound during challenging interventions in which the tip is imperceptible or shaft information is unavailable. We first enhance the needle tip in time-series ultrasound data through digital subtraction of consecutive frames. The enhanced tip image is then fed to a cascade of two similar convolutional neural networks: a tip classifier and a tip location regressor. The classifier ascertains tip motion, and the regressor directly outputs the tip coordinates. Since the method does not require needle shaft information, it efficiently localizes both in-plane and out-of-plane needles. Our approach is trained and evaluated on an ex vivo dataset collected with two different ultrasound machines, using in-plane and out-of-plane insertions of 17G and 22G needles in bovine, porcine, and chicken tissue. We use 12,000 frames extracted from 40 video sequences for training and validation, and 500 frames from 20 sequences as test data. The framework achieves a tip localization error of \(0.55\,\pm\,0.07\) mm and an overall processing time of 0.015 s (67 fps). Validation against state-of-the-art methods shows improvements of \(29\%\) in accuracy and \(509\%\) in processing rate. Given its real-time performance and accurate tip localization, we believe our approach is a potential breakthrough for needle tip localization in challenging ultrasound-guided interventions.
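To illustrate the frame-subtraction enhancement step described above, the following is a minimal sketch, not the authors' implementation: it assumes grayscale B-mode frames delivered as NumPy arrays, and the min-max normalization and the synthetic example frames are illustrative choices of our own.

```python
import numpy as np

def enhance_tip(prev_frame: np.ndarray, curr_frame: np.ndarray) -> np.ndarray:
    """Highlight moving structures (e.g., the advancing needle tip) by
    digitally subtracting consecutive B-mode frames.

    Both frames are 2D grayscale arrays of equal shape. Static tissue
    largely cancels out; the residual motion map is normalized to [0, 1]
    (an assumed preprocessing choice) before it would be fed to the
    classifier/regressor cascade.
    """
    diff = np.abs(curr_frame.astype(np.float32) - prev_frame.astype(np.float32))
    rng = diff.max() - diff.min()
    return (diff - diff.min()) / rng if rng > 0 else diff

# Example: two synthetic 256x256 frames with a small bright "tip" that moves.
prev = np.zeros((256, 256), dtype=np.float32)
curr = np.zeros_like(prev)
prev[100:104, 100:104] = 1.0   # tip at its previous position
curr[102:106, 102:106] = 1.0   # tip after the needle advances
enhanced = enhance_tip(prev, curr)
print(enhanced.shape, enhanced.max())  # (256, 256) 1.0
```

In this sketch the stationary background subtracts to zero while the displaced tip leaves a bright residual, which is the signal the cascade of networks would classify and regress on.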