Abstract
Existing surgical navigation approaches for the rod bending procedure in spinal fusion rely on optical tracking systems that determine the locations of placed pedicle screws using a hand-held marker. We propose a novel, marker-less surgical navigation proof-of-concept for bending rod implants. Our method combines augmented reality with on-device machine learning to generate and display a virtual template of the optimal rod shape without touching the instrumented anatomy. Performance was evaluated on lumbosacral spine phantoms against a pointer-based navigation benchmark and ground truth data obtained from computed tomography. Our method achieved a mean error of 1.83 ± 1.10 mm compared to 1.87 ± 1.31 mm measured with the pointer-based approach, while requiring only 21.33 ± 8.80 s, as opposed to 36.65 ± 7.49 s with the pointer-based method. Our results suggest that the combination of augmented reality and machine learning has the potential to replace conventional pointer-based navigation in the future.