Abstract

Augmented reality (AR) continues to be heavily studied as a research topic for potential medical use. The ability to see a patient's anatomy below the surface of the body has long been regarded as the ideal surgical navigation tool. Rather than observing medical imaging, such as computed tomography (CT) or magnetic resonance (MR) images, on a monitor, hospital personnel would be able to see patient-specific pathologies through AR glasses. Neurosurgery has commonly been a field of choice for AR integration because of the many needs that it can potentially meet. Introducing AR into the neurosurgical operating room (OR), however, poses both benefits and concerns in terms of human-computer interaction (HCI). One of the core concepts of HCI is user-centered design. While the aim is to create an intuitive interface for the user group, introducing AR into the OR can increase cognitive overload and inattentional blindness if executed improperly, without considering the full use case and all stakeholders. A common application of neuro-navigation is spinal surgery, where such systems, while incredibly accurate, disrupt OR workflow. These devices drastically improve patient outcomes yet are seldom employed because of these disruptions. HCI concepts can better integrate AR into the OR to address the pitfalls observed in modern neuro-navigation, and they give designers, engineers, and surgeons the necessary tools to develop AR solutions. Our goal is to thoroughly analyze the OR workflow such that AR can be effectively incorporated.
