Abstract

The mismatch between the optical path of a near-infrared camera and that of the user's eye limits the use of existing near-infrared fluorescence imaging systems for tumor margin assessment. By using an optical system that precisely aligns the near-infrared fluorescence image with the visible-light path, we developed an augmented reality (AR)-based fluorescence imaging system that presents users with a fluorescence image matched to the real field, without requiring any additional algorithms. The system was built from commercial smart glasses, dichroic beam splitters, mirrors, and custom near-infrared cameras, with custom-designed mounts for each component. After its performance was assessed in the laboratory, preclinical experiments on tumor detection and lung lobectomy in mice and rabbits were conducted using indocyanine green (ICG). The results showed that the proposed system provided a stable fluorescence image that matched the actual site. The preclinical experiments further confirmed that the system could be used to detect tumors with ICG and to evaluate lung lobectomies. The AR-based intraoperative smart goggle system detected fluorescence images for tumor margin assessment in animal models without disrupting the surgical workflow in the operating room. In addition, even when the worn system itself was distorted, the fluorescence image consistently matched the actual site.

Highlights

  • During cancer surgery, although preoperative imaging techniques such as computed tomography (CT) and positron emission tomography (PET) have a meaningful impact on preoperative planning, the surgeon’s eyes and hands remain the decisive factors [1,2,3]

  • We developed an augmented reality-based fluorescence imaging (ARFI) goggle system that synchronizes the user’s view with the NIR camera’s view, reducing related costs and improving the efficiency of the system

  • As the real-time projected image is overlaid on the actual field, users can obtain useful information, including the margin of the cancer; fluorescence images acquired from the NIR cameras were delivered to a computer through a USB 3.0 port and subjected to post-processing (a minimal sketch of such post-processing follows below)
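
The highlights state that NIR frames are transferred to a computer over USB 3.0 and post-processed before being projected on the see-through display, but the processing steps themselves are not specified here. The following is only a minimal sketch, assuming a Python/OpenCV pipeline, of the kind of normalization, background suppression, and pseudo-coloring such a step might involve; the function name and camera device index are illustrative assumptions, not the authors' implementation.

  import cv2
  import numpy as np

  def postprocess_nir_frame(nir_frame: np.ndarray, threshold: int = 40) -> np.ndarray:
      """Turn a single-channel NIR frame into a pseudo-colored overlay image (illustrative only)."""
      # Stretch raw camera intensities to the full 8-bit range.
      norm = cv2.normalize(nir_frame, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
      # Suppress low-intensity background so only the ICG fluorescence signal remains.
      _, signal = cv2.threshold(norm, threshold, 255, cv2.THRESH_TOZERO)
      # Pseudo-color the remaining signal so it stands out against the surgical field.
      overlay = cv2.applyColorMap(signal, cv2.COLORMAP_HOT)
      # Leave non-fluorescent pixels black so they appear transparent on a see-through display.
      overlay[signal == 0] = 0
      return overlay

  if __name__ == "__main__":
      # Hypothetical capture from the USB 3.0 NIR camera (device index 0 is an assumption).
      cap = cv2.VideoCapture(0)
      ok, frame = cap.read()
      if ok:
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          cv2.imshow("ICG overlay", postprocess_nir_frame(gray))
          cv2.waitKey(0)
      cap.release()
      cv2.destroyAllWindows()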



Introduction

Although preoperative imaging techniques such as computed tomography (CT) and positron emission tomography (PET) have a meaningful impact on preoperative planning, the surgeon’s eyes and hands remain the decisive factors [1,2,3]. In previous studies, our group has employed ICG for applications such as sentinel lymph node (SLN) detection [17,18,19], assessment of lung segments [14], gastric conduit perfusion [20], and lung cancer detection [10]. Because these systems display information on a remote monitor, surgeons are required to look at the monitor to identify the NIR fluorescence image [21,22,23]. Optical see-through HMDs are based on augmented reality (AR); in these devices, see-through displays project images directly into the user’s field of view (FOV) [27,28,29], creating an AR environment [26,30]. Using such a system, the user can view both the projected image and the object itself. To assess its clinical applicability, the developed system was tested in preclinical experiments using cancer models to guide tumor resection in mice and rabbits.

Hardware Design for the ARFI System
In Vivo Animal Studies
Mouse Subcutaneous Tumor Model
Rabbit Lung Cancer Model
System Evaluation
Tumor Detection Using ARFI System in Mouse Tumor Model
