Abstract

We present a novel augmented reality (AR) surgical navigation method with ultrasound (US)-assisted point cloud registration for percutaneous ablation of liver tumors, and carry out a preliminary study to verify its feasibility. Two three-dimensional (3D) point clouds of the liver surface are derived from preoperative images and intraoperative tracked US images, respectively. To compensate for soft-tissue deformation, registration between the preoperative point cloud and the intraoperative liver surface is performed using the non-rigid iterative closest point (ICP) algorithm. A 3D AR device based on integral videography technology is designed to accurately display naked-eye 3D images for surgical navigation. Based on this registration, naked-eye 3D images of the liver surface, planned path, entry points, and tumor can be overlaid in situ through the 3D AR device. Finally, the AR-guided targeting accuracy is evaluated through entry-point positioning. Experiments were conducted on both a liver phantom and in vitro pork liver, using several entry points on the liver surface to evaluate targeting accuracy. The preliminary validation on the liver phantom showed average entry-point errors (EPEs) of 2.34 ± 0.45 mm, 2.25 ± 0.72 mm, 2.71 ± 0.82 mm, and 2.50 ± 1.11 mm at US point cloud coverage rates of 100%, 75%, 50%, and 25%, respectively. For the deformed pork liver, the average EPEs were 4.49 ± 1.88 mm and 5.02 ± 2.03 mm at coverage rates of 100% and 75%, and the average covered-entry-point errors (CEPEs) were 4.96 ± 2.05 mm and 2.97 ± 1.37 mm at 50% and 25%, respectively. These results demonstrate that the proposed AR navigation method based on US-assisted point cloud registration achieves acceptable targeting accuracy on the liver surface even in the presence of liver deformation.
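To illustrate the registration step described above, the sketch below shows a minimal rigid ICP alignment between two liver-surface point clouds using NumPy and SciPy. It is only a simplified stand-in: the paper's method uses a non-rigid ICP variant to handle deformation, which this sketch does not model, and all function and variable names here (e.g. `align_point_clouds`, `preop_surface`, `us_surface`) are illustrative rather than taken from the authors' implementation.

```python
# Minimal sketch of rigid ICP between two liver-surface point clouds.
# NOTE: the paper employs a *non-rigid* ICP variant to compensate for soft-tissue
# deformation; this simplified rigid version only illustrates the iterative
# match-and-align idea. All names are hypothetical, not the authors' code.

import numpy as np
from scipy.spatial import cKDTree


def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch/SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # correct a possible reflection
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = dst_c - R @ src_c
    return R, t


def align_point_clouds(source, target, max_iter=50, tol=1e-6):
    """Iteratively match nearest neighbours and re-estimate the rigid transform."""
    tree = cKDTree(target)
    src = source.copy()
    prev_err = np.inf
    for _ in range(max_iter):
        dists, idx = tree.query(src)          # closest-point correspondences
        R, t = best_rigid_transform(src, target[idx])
        src = src @ R.T + t                   # apply the current estimate
        err = dists.mean()
        if abs(prev_err - err) < tol:         # stop when the error plateaus
            break
        prev_err = err
    return src, err


# Toy usage: a preoperative (image-derived) surface vs. a shifted stand-in for
# the intraoperative US-derived surface, using random data.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    preop_surface = rng.random((500, 3))                 # preoperative point cloud
    us_surface = preop_surface + np.array([0.05, 0, 0])  # translated intraoperative cloud
    aligned, residual = align_point_clouds(us_surface, preop_surface)
    print(f"mean residual after ICP: {residual:.4f}")
```

In the same spirit, the reported EPE/CEPE values correspond to mean ± standard deviation of Euclidean distances between planned and AR-guided entry points, computed per coverage rate.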
