Robotic telemedicine can provide timely treatment to critical patients in geographically remote locations. However, owing to the lack of depth information, instrument occlusion, and limited view directions, visual feedback from the patient to the clinician is typically unintuitive, which compromises surgical safety. Herein, an omnidirectional augmented reality (AR)-assisted robotic telepresence system for interventional medicine is developed. A monocular camera carried by a manipulator is used as the AR device, and virtual key anatomies and instruments are superimposed on the video images using multi-object hand–eye calibration, iterative-closest-point, and image superposition algorithms. The view direction of the camera can be changed via the manipulator, which allows the key anatomies and instruments to be viewed from different directions. Structure-from-motion and multi-view stereo algorithms are used to reconstruct the scene on the patient's side, thus providing a virtual reality (VR) interactive interface for the safe teleoperation of the manipulator. Two different phantom experiments are conducted to validate the effectiveness of the proposed method. Finally, an ex vivo experiment on a porcine lumen is performed, in which the operator can view the interventional flexible robot. The proposed method integrates omnidirectional AR and VR with robotic telepresence, providing intuitive visual feedback to clinicians.
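The abstract outlines an overlay pipeline of registration followed by projection into the camera view. The sketch below, which is not the authors' code, illustrates that style of pipeline under stated assumptions: a preoperative anatomy point cloud is registered to an SfM/MVS scene reconstruction with iterative closest point (Open3D), and the registered points are projected into a monocular frame with OpenCV to form an AR overlay. The file names, the placeholder camera/scene transform (which would normally come from the paper's multi-object hand–eye calibration), and the pinhole intrinsics are illustrative assumptions, not values from the paper.

```python
"""Minimal sketch (assumptions noted inline): ICP registration + AR projection."""
import numpy as np
import open3d as o3d
import cv2

# --- Rigid registration via iterative closest point (ICP) ---
anatomy = o3d.io.read_point_cloud("preop_anatomy.ply")   # virtual key anatomy (assumed file)
scene = o3d.io.read_point_cloud("sfm_mvs_scene.ply")     # SfM/MVS reconstruction (assumed file)

init = np.eye(4)  # a coarse alignment would normally seed ICP; identity is a placeholder
icp = o3d.pipelines.registration.registration_icp(
    anatomy, scene, 0.01, init,
    o3d.pipelines.registration.TransformationEstimationPointToPoint())
T_scene_anatomy = icp.transformation  # transform: anatomy frame -> scene frame

# --- AR overlay: project the registered anatomy into the camera image ---
# T_cam_scene would come from hand-eye calibration of the camera-on-manipulator
# setup; here it is a placeholder identity (assumption).
T_cam_scene = np.eye(4)
T_cam_anatomy = T_cam_scene @ T_scene_anatomy

K = np.array([[800.0, 0.0, 320.0],    # assumed pinhole intrinsics
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
rvec, _ = cv2.Rodrigues(np.ascontiguousarray(T_cam_anatomy[:3, :3]))
tvec = T_cam_anatomy[:3, 3].reshape(3, 1)

pts3d = np.asarray(anatomy.points, dtype=np.float64)
pts2d, _ = cv2.projectPoints(pts3d, rvec, tvec, K, None)

frame = cv2.imread("camera_frame.png")                    # live video frame (assumed file)
for u, v in pts2d.reshape(-1, 2).astype(int):
    if 0 <= u < frame.shape[1] and 0 <= v < frame.shape[0]:
        cv2.circle(frame, (u, v), 1, (0, 255, 0), -1)     # superimpose anatomy as green points
cv2.imwrite("ar_overlay.png", frame)
```

In practice the overlay would be rendered as a shaded mesh rather than points, and the projection would be recomputed each time the manipulator changes the camera's view direction; the point-splat rendering here is only for brevity.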