Abstract

This paper describes a method for tracking the camera motion of a real endoscope by combining epipolar geometry analysis with CT-derived virtual endoscopic images. A navigation system for a flexible endoscope guides medical doctors by providing navigation information during endoscopic examinations. We estimate the camera motion from endoscopic video based on epipolar geometry analysis and image registration between virtual endoscopic (VE) and real endoscopic (RE) images. The method consists of three parts: (a) direct estimation of camera motion using epipolar geometry analysis, (b) precise estimation using image registration, and (c) detection of frames containing bubbles to avoid mis-registration. First, we calculate optical flow patterns from two consecutive frames. The camera motion is computed by substituting the obtained flows into the epipolar equations. Then we find the observation parameters of a virtual endoscopy system that generate the VE view most similar to the current RE frame. These processes are executed for all frames of the RE videos except those in which bubbles appear. We applied the proposed method to RE videos of three patients for whom CT images were available. The experimental results show that the method can track camera motion continuously for over 500 frames in the best case.
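Step (a), recovering camera motion from optical flow via the epipolar equations, can be illustrated with a minimal linear eight-point estimate of the essential matrix. The sketch below is not the paper's implementation: it uses synthetic point correspondences (standing in for tracked optical-flow features) and a hypothetical camera motion, and works in normalized image coordinates so the essential matrix E satisfies x2ᵀ E x1 = 0 with E ∝ [t]× R.

```python
import numpy as np

def eight_point_essential(x1, x2):
    """Estimate the essential matrix E from >= 8 correspondences of
    normalized homogeneous image points satisfying x2^T E x1 = 0."""
    # Each correspondence contributes one row kron(x2, x1) to A, so that
    # A @ vec(E) = 0; the solution is the last right singular vector of A.
    A = np.stack([np.kron(p2, p1) for p1, p2 in zip(x1, x2)])
    _, _, Vt = np.linalg.svd(A)
    E = Vt[-1].reshape(3, 3)
    # Project onto the essential manifold: singular values (s, s, 0).
    U, S, Vt = np.linalg.svd(E)
    s = (S[0] + S[1]) / 2.0
    return U @ np.diag([s, s, 0.0]) @ Vt

def skew(t):
    """Cross-product matrix [t]_x such that skew(t) @ v == cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

# --- Synthetic two-view setup (illustrative motion, not endoscope data) ---
rng = np.random.default_rng(0)
P = rng.uniform([-1, -1, 4], [1, 1, 8], size=(20, 3))  # 3D scene points
theta = 0.1                                            # small yaw rotation
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0, 1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([0.2, 0.0, 0.05])                         # camera translation

# Project into both views (pinhole, normalized coordinates).
x1 = np.column_stack([P[:, :2] / P[:, 2:], np.ones(len(P))])
Q = P @ R.T + t                                        # points in view 2
x2 = np.column_stack([Q[:, :2] / Q[:, 2:], np.ones(len(Q))])

E = eight_point_essential(x1, x2)
# Noise-free data: the epipolar residuals should be numerically zero.
residual = np.max(np.abs(np.einsum('ni,ij,nj->n', x2, E, x1)))
```

In practice a library routine (e.g. OpenCV's `findEssentialMat` followed by `recoverPose`) would replace this linear solve, adding RANSAC against flow outliers and resolving the rotation/translation decomposition with a cheirality check.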
