Abstract

A method to extract and track the position of a guide wire during endovascular interventions under X-ray fluoroscopy is presented and evaluated. The method can be used to improve guide wire visualization in low-quality fluoroscopic images and to estimate the position of the guide wire in world coordinates. A two-step procedure is used to track the guide wire in subsequent frames. First, a rough estimate of the displacement is obtained using a template matching procedure. Subsequently, the position of the guide wire is determined by fitting a spline to a feature image in which line-like structures are enhanced. In the optimization step, the influence of the scale at which the feature is calculated is investigated. The feature image is also calculated both on the original image and on a preprocessed image in which coherent structures are enhanced. Finally, the influence of explicit endpoint detection is studied. The method is evaluated on 267 frames from 10 sequences. Using the automatic method, the guide wire could be tracked in 96% of the frames, with a greater accuracy than three observers. Endpoint detection improved the accuracy of the tip assessment, which was better than 1.3 mm.

Keywords: Feature Image; Endovascular Intervention; Guide Wire; Motion Blur; Endpoint Detection
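The two-step procedure described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: step 1 estimates a coarse displacement between frames by normalized cross-correlation of a template cut from the previous frame, and step 2 builds a feature image that enhances dark line-like structures using the eigenvalues of the image Hessian computed at a chosen scale sigma (the scale whose influence the paper investigates). All function names and parameter values here are illustrative assumptions; the spline-fitting stage is omitted.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def coarse_displacement(prev_frame, frame, template_slice):
    """Step 1 (sketch): rough displacement via normalized
    cross-correlation of a template taken from the previous frame."""
    tpl = prev_frame[template_slice]
    tpl = (tpl - tpl.mean()) / (tpl.std() + 1e-8)
    th, tw = tpl.shape
    best, best_score = (0, 0), -np.inf
    # Exhaustive search; a real implementation would restrict this
    # to a small search window around the previous position.
    for dy in range(frame.shape[0] - th + 1):
        for dx in range(frame.shape[1] - tw + 1):
            win = frame[dy:dy + th, dx:dx + tw]
            win = (win - win.mean()) / (win.std() + 1e-8)
            score = float((tpl * win).mean())
            if score > best_score:
                best_score, best = score, (dy, dx)
    y0, x0 = template_slice[0].start, template_slice[1].start
    return best[0] - y0, best[1] - x0

def ridge_feature(image, sigma):
    """Step 2 (sketch): enhance dark line-like structures via the
    most positive eigenvalue of the Hessian at scale sigma."""
    # Second-order Gaussian derivatives at scale sigma.
    Iyy = gaussian_filter(image, sigma, order=(2, 0))
    Ixx = gaussian_filter(image, sigma, order=(0, 2))
    Ixy = gaussian_filter(image, sigma, order=(1, 1))
    # Per-pixel eigenvalues of the 2x2 Hessian [[Ixx, Ixy], [Ixy, Iyy]].
    root = np.sqrt(((Ixx - Iyy) / 2.0) ** 2 + Ixy ** 2)
    lam = (Ixx + Iyy) / 2.0 + root  # most positive eigenvalue
    # A dark wire on a bright background gives a large positive eigenvalue.
    return np.maximum(lam, 0.0)
```

A spline representing the guide wire would then be optimized so that it runs along the maxima of the feature image, starting from the template-matching estimate; the scale `sigma` trades off noise suppression against blurring of the thin wire.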


