Abstract

Minimally invasive partial nephrectomy (MIPN) has many advantages over open surgery. However, it is challenging for the surgeon to localise the hidden anatomical structures to be spared or resected during surgery. Augmented reality (AR) is a promising localisation assistance approach. Existing AR-MIPN methods augment the endoscopic view with 3D models from the preoperative CT scan. However, they do not track the kidney in real time, which considerably reduces usability, as AR is only temporarily available on isolated images. We propose an approach to achieve continuous live AR-MIPN. It uses classical camera calibration and manual initial registration. Its key novelty is a keypoint-based automatic kidney tracking module, with three main technical contributions. First, it performs stereo tracking-by-detection from stereo keyframes, exploiting left-right consistency to maximise robustness. Second, it only considers keypoints within the parenchyma, as segmented by a specifically trained neural network. Third, it improves keypoint detection and matching by a new process that we call stereo perspective correction (SPC), which uses the stereo depth map and surface flattening to generate an image warp that cancels the perspective effect. We carried out experiments on semi-synthetic and real surgical datasets to compare several tracking methods, showing that our method outperforms the alternatives.
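To illustrate the left-right consistency idea used in the stereo tracking-by-detection step, the sketch below matches keypoints between the two views of a stereo keyframe and keeps only mutual nearest-neighbour matches. This is not the authors' implementation: the choice of ORB features, the function name, and the optional parenchyma masks (standing in for the segmentation network's output) are assumptions for the example, and the SPC warp is assumed to have been applied upstream.

```python
# Minimal sketch of stereo keypoint matching with a left-right consistency check.
# Assumes OpenCV (cv2); images are grayscale uint8, masks are optional uint8 masks
# restricting detection to the kidney parenchyma.
import cv2


def consistent_stereo_matches(img_left, img_right, mask_left=None, mask_right=None):
    """Detect keypoints in both views and keep only mutually consistent matches."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp_l, des_l = orb.detectAndCompute(img_left, mask_left)
    kp_r, des_r = orb.detectAndCompute(img_right, mask_right)
    if des_l is None or des_r is None:
        return []

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    lr = matcher.match(des_l, des_r)  # left -> right matches
    rl = matcher.match(des_r, des_l)  # right -> left matches

    # Left-right consistency: keep a match only if the right->left match maps
    # back to the same left keypoint (mutual nearest neighbours).
    back = {m.queryIdx: m.trainIdx for m in rl}
    consistent = [m for m in lr if back.get(m.trainIdx) == m.queryIdx]
    return [(kp_l[m.queryIdx].pt, kp_r[m.trainIdx].pt) for m in consistent]
```

OpenCV's `BFMatcher(..., crossCheck=True)` performs the same mutual check internally; it is written out explicitly here only to make the consistency test visible.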
