Abstract

Navigation systems play an increasingly important role in minimally invasive surgery (MIS) by mitigating the problems arising from the decoupling of the surgeon's hand-eye coordination. Many of these systems depend heavily on external optical tracking systems that require a constant line of sight to the optical markers being tracked. Simultaneous localization and mapping (SLAM) algorithms allow the endoscope to be tracked in cases where optical tracking fails due to the cluttered environment in the operating room. To ensure a correct camera pose estimate and to correct for drift, recognizing previously visited locations (loop closures) is essential. We propose a method for location recognition in a minimally invasive scenario that requires only a stereo endoscope and an inertial measurement unit (IMU). We use a hierarchical bag-of-visual-words (BoW) algorithm that stores compact image representations and enables querying for matching images. A two-stage consistency check, using random sample consensus (RANSAC) and the data measured by the IMU, ensures high matching precision.
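The BoW querying step described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a precomputed visual vocabulary, and the function names `bow_histogram` and `query`, the cosine-similarity score, and the `min_score` threshold are all hypothetical choices. Candidates returned by the query would then go on to the geometric (RANSAC) and IMU consistency checks.

```python
import numpy as np

def bow_histogram(descriptors, vocabulary):
    # Assign each local feature descriptor to its nearest visual word
    # and accumulate a normalized word-frequency histogram, i.e. the
    # compact image representation stored in the database.
    dists = np.linalg.norm(
        descriptors[:, None, :] - vocabulary[None, :, :], axis=2)
    words = dists.argmin(axis=1)
    hist = np.bincount(words, minlength=len(vocabulary)).astype(float)
    return hist / hist.sum()

def query(db, hist, min_score=0.8):
    # Rank stored keyframe histograms by cosine similarity to the query
    # histogram; candidates scoring above min_score are loop-closure
    # hypotheses to be verified geometrically.
    scores = [
        (i, float(h @ hist / (np.linalg.norm(h) * np.linalg.norm(hist))))
        for i, h in enumerate(db)
    ]
    return [(i, s) for i, s in sorted(scores, key=lambda t: -t[1])
            if s >= min_score]

# Toy usage: a 4-word vocabulary and three "images" of descriptors.
vocab = np.eye(4)
h_query = bow_histogram(np.array([[1., 0, 0, 0], [1, 0, 0, 0], [0, 1, 0, 0]]), vocab)
h_similar = bow_histogram(np.array([[1., 0, 0, 0], [0, 1, 0, 0]]), vocab)
h_other = bow_histogram(np.array([[0., 0, 1, 0], [0, 0, 0, 1]]), vocab)
candidates = query([h_similar, h_other], h_query, min_score=0.5)
```

In a real system the vocabulary would be learned hierarchically (a vocabulary tree) so that word assignment is logarithmic rather than linear in the vocabulary size, which is what makes querying fast enough for online SLAM.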
