Abstract

Suboptimal interaction with patient data and the challenge of mastering 3D anatomy from ill-posed 2D interventional images are essential concerns in image-guided therapies. Augmented reality (AR) has been introduced into operating rooms over the last decade; however, in image-guided interventions it has often been considered merely a visualization device that improves traditional workflows. As a consequence, the technology has gained only a minimum of the maturity it requires to redefine procedures, user interfaces, and interactions. The main contribution of this paper is to reveal how exemplary workflows are redefined by taking full advantage of head-mounted displays that remain entirely co-registered with the imaging system at all times. The system's awareness of the geometric and physical characteristics of X-ray imaging allows the exploration of different human-machine interfaces. Our system achieved an error of 4.76 ± 2.91 mm for placing a K-wire in a fracture management procedure, and yielded errors of 1.57 ± 1.16° and 1.46 ± 1.00° in the abduction and anteversion angles, respectively, for total hip arthroplasty (THA). We compared these results with the outcomes of baseline standard operative and non-immersive AR procedures, which had yielded errors of [4.61 mm, 4.76°, 4.77°] and [5.13 mm, 1.78°, 1.43°], respectively, for wire placement, and for abduction and anteversion during THA. We hope that our holistic approach towards improving the interface of surgery not only augments the surgeon's capabilities but also augments the surgical team's experience in carrying out an effective intervention with reduced complications, and provides novel approaches to documenting procedures for training purposes.

Highlights

  • Interventional image guidance is widely adopted across multiple disciplines of minimally-invasive and percutaneous therapies [1]–[4]

  • As mentioned above, during the standard operating procedure (SOP), surgeons inserted the K-wires using an average of 40.86 fluoroscopic images and an average dose of 4.43 cGy·cm², compared to the significantly lower dose of 0.255 cGy·cm² emitted during the augmented reality (AR) procedures

  • It is important to note that our AR system performed similarly to the conventional X-ray method in terms of accuracy, while reducing time by a factor of 5, the number of fluoroscopic acquisitions by a factor of 20, and the radiation dose by a factor of 17
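The reported dose-reduction factor can be checked directly against the per-procedure doses quoted in the highlights (4.43 cGy·cm² for SOP versus 0.255 cGy·cm² under AR guidance). A minimal arithmetic sketch, using only those two reported figures:

```python
# Sanity check of the reported radiation-dose reduction factor.
# Values are taken from the highlights above; units are cGy·cm².
sop_dose = 4.43    # average dose emitted during the standard operating procedure
ar_dose = 0.255    # average dose emitted during the AR-guided procedure

dose_factor = sop_dose / ar_dose
print(f"Dose reduction factor: {dose_factor:.1f}x")  # ≈ 17.4x, consistent with the stated factor of 17
```

The ratio comes out to roughly 17.4, matching the "factor of 17" claim after rounding.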


Introduction

Interventional image guidance is widely adopted across multiple disciplines of minimally-invasive and percutaneous therapies [1]–[4]. Despite its importance in providing anatomy-level updates, visualization of images and interaction with the intra-operative data are inefficient, requiring extensive experience to properly associate the content of the image with the patient anatomy. These challenges become evident in interventions that require the surgeon to navigate wires and catheters through critical structures under excessive radiation, such as in fracture or endovascular repairs. Surgical navigation and robotic systems are developed to support surgery with localization and execution of well-defined tasks [5]–[8]. Though these systems increase accuracy, their complex setup and explicit tracking nature may overburden the surgical workflow and impede their acceptance in clinical routines [9]. Image-based navigation alleviates the requirements for external tracking, though it depends strongly on pre-operative data, which become outdated when the anatomy is altered during the surgery [10], [11].

