The objective of this study was to evaluate the accuracy of an augmented reality (AR) navigational guidance system designed to improve visualization, guidance, and accuracy during percutaneous needle-based procedures such as biopsies and ablations. Using the HoloLens 2, the system registers and projects 3D CT-based models of segmented anatomy along with live ultrasound, fused with electromagnetically tracked instruments including ultrasound probes and needles, giving the operator comprehensive stereoscopic visualization for intraoperative planning and navigation. Tracked needles were guided to targets implanted in a cadaveric model using the system. Two accuracy metrics were measured: image fusion registration error (IFRE), the multimodality error measured as the post-registration distance between corresponding points in the stereoscopic CT and tracked ultrasound coordinate systems, and target registration error (TRE), the Euclidean distance between the needle tip and the target after needle placement. A t-distribution was used for statistical analysis. Three operators performed 36 total needle passes: 18 to measure IFRE and 18 to measure TRE on four targets. The average depth of each needle pass was 8.4 cm from skin to target center. Mean IFRE was 4.4 mm and mean TRE was 2.3 mm. The study demonstrated high registration and targeting accuracy for this AR navigational guidance system in percutaneous needle-based procedures, suggesting it could facilitate improved clinical performance in procedures such as ablations and biopsies.
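As defined above, both IFRE and TRE reduce to Euclidean distances between corresponding 3D points expressed in a common coordinate system after registration. The sketch below illustrates that computation; the coordinates are hypothetical values for illustration only and are not data from the study.

```python
import numpy as np

def euclidean_error(p, q) -> float:
    """Euclidean distance between two 3D points, in mm."""
    return float(np.linalg.norm(np.asarray(p, dtype=float) - np.asarray(q, dtype=float)))

# Hypothetical coordinates (mm), for illustration only.
ct_point = np.array([12.0, -4.5, 88.2])        # landmark in the stereoscopic CT frame
us_point = np.array([14.1, -3.2, 84.6])        # same landmark via tracked ultrasound, post-registration
ifre = euclidean_error(ct_point, us_point)     # image fusion registration error

needle_tip = np.array([40.3, 10.1, 55.7])      # electromagnetically tracked needle tip after placement
target_ctr = np.array([41.0, 11.5, 54.2])      # implanted target center
tre = euclidean_error(needle_tip, target_ctr)  # target registration error
```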
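The abstract states only that a t-distribution was used for statistical analysis; a common reading is a t-based confidence interval for the mean error over the 18 passes per metric. The sketch below assumes that interpretation, and the sample values are hypothetical, since the study's per-pass errors are not reported here.

```python
import numpy as np
from scipy import stats

def t_confidence_interval(errors, confidence=0.95):
    """Mean and two-sided t-distribution confidence interval for the mean error."""
    errors = np.asarray(errors, dtype=float)
    n = errors.size
    mean = errors.mean()
    sem = errors.std(ddof=1) / np.sqrt(n)                      # standard error of the mean
    half_width = stats.t.ppf((1 + confidence) / 2, df=n - 1) * sem
    return mean, (mean - half_width, mean + half_width)

# Hypothetical per-pass TRE values (mm), for illustration only.
tre_samples = [2.1, 1.8, 2.6, 2.4, 2.0, 2.5]
mean_tre, ci = t_confidence_interval(tre_samples)
```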