Abstract

This paper presents a novel surgical navigation system based on a visual tracking algorithm, in which the 3D positions of the surgical instrument and the patient are determined from a 3D visual sensor and displayed using computer graphics. Unlike traditional systems based on ultrasound or motion capture, our system uses a Kernel-Reliability K-Means (KRKM) clustering algorithm as its target tracking algorithm. By applying the KRKM tracker to the RGB images obtained from a RealSense camera, the system locates the pixel coordinates of the colour markers on the surgical instruments and on the patient's body. From these pixel coordinates, the system then obtains the corresponding 3D coordinates from the depth images produced by the RealSense camera. We further propose a stepwise coordinate fusion algorithm that converts the coordinate systems of the camera, the surgical instrument, and the patient into a single unified frame centred on the patient's body, which simplifies the computations across all of the system's coordinate systems. Finally, based on the real-time tracking results of the surgical instruments in real 3D space, virtual 3D models are built with the OpenGL and OpenMesh libraries to display position and orientation information. Experimental results show that the proposed surgical navigation system, built on the visual object tracking algorithm, satisfies the accuracy, stability, and real-time requirements of surgical instrument tracking and patient positioning.
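As an illustration of the pipeline the abstract describes, the sketch below shows the two geometric steps in a minimal form: back-projecting a marker's pixel coordinates and depth value into camera-frame 3D coordinates via the standard pinhole model, and then mapping that point into a patient-centred frame with a rigid transform. The intrinsics, the transform, and the function names are hypothetical placeholders, not the paper's actual KRKM tracker or stepwise fusion algorithm.

```python
import numpy as np

def pixel_to_camera_3d(u, v, z, fx, fy, cx, cy):
    """Back-project pixel (u, v) with depth z (metres) into camera-frame 3D
    coordinates using the pinhole camera model. The intrinsics (fx, fy, cx, cy)
    are illustrative values, not calibrated RealSense parameters."""
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.array([x, y, z])

def to_patient_frame(p_cam, R, t):
    """Map a camera-frame point into a patient-centred frame with a rigid
    transform (rotation R, translation t); this stands in for the paper's
    stepwise coordinate fusion, whose details are not given in the abstract."""
    return R @ p_cam + t

# Hypothetical intrinsics for a 640x480 depth-aligned colour image:
fx = fy = 600.0        # focal lengths in pixels (assumed)
cx, cy = 320.0, 240.0  # principal point (assumed)

p = pixel_to_camera_3d(400, 300, 1.5, fx, fy, cx, cy)  # marker 1.5 m away
q = to_patient_frame(p, np.eye(3), np.zeros(3))        # identity transform
```

With the identity transform the patient-frame point `q` equals the camera-frame point `p`; in the real system `R` and `t` would come from the tracked markers on the patient's body.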
