Abstract

Objectives
Owing to the close topographical relationship of functionally relevant anatomical structures, the limited space, and cosmetic considerations, orbital surgery will remain a challenging discipline. Novel technical capabilities are therefore necessary for further surgical progress. We tested the integration of augmented reality and optical navigation into a single workflow for interdisciplinary decision-making, feasibility assessment, and intraoperative guidance.

Methods
High-resolution contrast-enhanced MRI and CT scans were segmented automatically and with manual assistance to obtain a detailed three-dimensional (3D) model of the individual patho-anatomical relationships. Augmented reality was used for interdisciplinary preoperative planning and intuitive intraoperative navigation. A Mayfield clamp head holder in combination with optical surface-matching registration ensured navigation-assisted microsurgery.

Results
Combinations of different MRI sequences and CT scans were necessary for detailed 3D modeling. Modeling was time-consuming and feasible only in the hands of medically, surgically, and anatomically trained staff. Augmented reality enabled quick, intuitive interdisciplinary orientation. Intraoperative surface-matching registration enabled precise navigation within the orbital space.

Conclusions
Optical navigation and microscope integration achieved a straightforward microsurgical workflow and should be implemented routinely. Augmented reality proved a useful tool for preoperative interdisciplinary planning and intuitive intraoperative orientation. It also served as an excellent educational tool.
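The abstract does not name the software used for image fusion or segmentation. Purely as an illustration of the kind of automated CT/MRI co-registration and threshold-based bone segmentation step that typically precedes the manual refinement described in the Methods, a minimal Python sketch using SimpleITK might look as follows; the file names, Hounsfield thresholds, and registration settings are hypothetical and not taken from the study.

```python
import SimpleITK as sitk

# Load the contrast-enhanced MRI and the CT volume (hypothetical file names).
mri = sitk.ReadImage("orbit_mri_t1_ce.nii.gz", sitk.sitkFloat32)
ct = sitk.ReadImage("orbit_ct.nii.gz", sitk.sitkFloat32)

# Rigidly register the CT onto the MRI so bony and soft-tissue structures
# can later be segmented in a common coordinate frame.
registration = sitk.ImageRegistrationMethod()
registration.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
registration.SetOptimizerAsRegularStepGradientDescent(
    learningRate=2.0, minStep=1e-4, numberOfIterations=200)
registration.SetInitialTransform(
    sitk.CenteredTransformInitializer(
        mri, ct, sitk.Euler3DTransform(),
        sitk.CenteredTransformInitializerFilter.GEOMETRY))
registration.SetInterpolator(sitk.sitkLinear)
transform = registration.Execute(mri, ct)

# Resample the CT into MRI space (default value -1000 HU = air).
ct_on_mri = sitk.Resample(ct, mri, transform, sitk.sitkLinear, -1000.0)

# Coarse automated segmentation of orbital bone via a Hounsfield-unit
# threshold; soft-tissue structures (optic nerve, muscles, lesion) would
# still require manual-assisted refinement by trained staff.
bone_mask = sitk.BinaryThreshold(ct_on_mri, lowerThreshold=300.0,
                                 upperThreshold=3000.0,
                                 insideValue=1, outsideValue=0)
bone_mask = sitk.BinaryMorphologicalClosing(bone_mask, [2, 2, 2])

sitk.WriteImage(bone_mask, "orbit_bone_mask.nii.gz")
```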