Abstract

Purpose: The purpose of this work was to develop a new method of tracking a laparoscopic ultrasound (LUS) transducer in laparoscopic video by combining hardware-based [e.g., electromagnetic (EM)] and computer vision-based (e.g., ArUco) tracking methods.

Approach: We developed a special tracking mount for the imaging tip of the LUS transducer. The mount incorporated an EM sensor and an ArUco pattern registered to it. The hybrid method used ArUco tracking for ArUco-success frames (i.e., frames where ArUco succeeds in detecting the pattern) and corrected EM tracking for ArUco-failure frames. The corrected EM tracking result was obtained by applying correction matrices to the original EM tracking result. The correction matrices were calculated in previous ArUco-success frames by comparing the ArUco result with the original EM tracking result.

Results: We performed phantom and animal studies to evaluate the performance of our hybrid tracking method. The corrected EM tracking results showed significant improvements over the original EM tracking results. In the animal study, 59.2% of frames were ArUco-success frames. For the ArUco-failure frames, the mean reprojection errors for the original EM tracking method and for the corrected EM tracking method were 30.8 pixels and 10.3 pixels, respectively.

Conclusions: The new hybrid method is more reliable than using ArUco tracking alone and more accurate and practical than using EM tracking alone for tracking the LUS transducer in the laparoscope camera image. The proposed method has the potential to significantly improve tracking performance for LUS-based augmented reality applications.
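
The abstract does not give implementation details, but the per-frame switching and correction logic it describes can be illustrated with a minimal sketch. The sketch below is an assumption-laden illustration, not the authors' implementation: poses are taken to be 4x4 homogeneous camera-from-transducer transforms, the EM pose is assumed to have already been mapped into the laparoscope camera frame by upstream calibration/registration, and the function names (update_correction, hybrid_pose) are hypothetical.

import numpy as np

def update_correction(T_em, T_aruco):
    # ArUco-success frame: compute the correction matrix that maps the
    # original EM-derived pose onto the ArUco-derived pose (both 4x4
    # homogeneous transforms in the laparoscope camera frame).
    return T_aruco @ np.linalg.inv(T_em)

def hybrid_pose(T_em, T_aruco, correction):
    # Returns (pose_for_this_frame, updated_correction).
    if T_aruco is not None:
        # ArUco-success frame: use the ArUco pose directly and refresh
        # the correction from this frame's ArUco/EM comparison.
        return T_aruco, update_correction(T_em, T_aruco)
    if correction is None:
        # No ArUco-success frame seen yet: fall back to original EM tracking.
        return T_em, correction
    # ArUco-failure frame: corrected EM tracking.
    return correction @ T_em, correction

A hypothetical per-frame loop would carry the latest correction forward, passing T_aruco = None whenever pattern detection fails:

correction = None
for T_em, T_aruco in tracked_frames:
    pose, correction = hybrid_pose(T_em, T_aruco, correction)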

