Robotic ophthalmic endoscope holders allow surgeons to perform two-handed operations in eye surgery. To prevent the needle-like endoscope from injuring the retina while moving, surgeons need real-time visual feedback on the spatial relationship between the endoscope and the fundus. This study develops a real-time fundus reconstruction method. First, using deep learning, the method estimates the distance between the endoscope and the fundus region corresponding to each pixel of the RGB endoscopic image. Then, by combining the estimated distances with the kinematics of the robotic holder, it generates a point cloud representing the currently observed fundus area, from which the size and position of the eyeball are estimated. The method runs in real time at 10 Hz and is robust to eyeball movement. The fundus reconstruction error is about 0.5 mm, and the eyeball estimation error is about 1 mm. This fundus reconstruction method can map the position of the endoscope inside the eyeball when a robotic endoscope holder is used in eye surgery. The overall accuracy meets ophthalmologists' requirements.
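The abstract does not specify how eyeball size and position are recovered from the fundus point cloud; one common approach is an algebraic least-squares sphere fit. The sketch below illustrates that idea on synthetic data (the `fit_sphere` helper, the 12 mm radius, and the noise level are all illustrative assumptions, not the paper's method):

```python
import numpy as np

def fit_sphere(points):
    # Algebraic least-squares sphere fit: ||p - c||^2 = r^2 expands to
    # 2 p.c + (r^2 - |c|^2) = |p|^2, which is linear in c and r^2 - |c|^2.
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    center = x[:3]
    radius = float(np.sqrt(x[3] + center @ center))
    return center, radius

# Synthetic partial "fundus" patch: points on a 12 mm sphere with 0.1 mm noise,
# covering only part of the sphere, as an endoscope would see.
rng = np.random.default_rng(0)
theta = rng.uniform(0, 0.6 * np.pi, 500)
phi = rng.uniform(0, 2 * np.pi, 500)
true_center = np.array([1.0, 2.0, 3.0])
pts = true_center + 12.0 * np.stack(
    [np.sin(theta) * np.cos(phi),
     np.sin(theta) * np.sin(phi),
     np.cos(theta)], axis=1)
pts += rng.normal(0, 0.1, pts.shape)

center, radius = fit_sphere(pts)
```

Because the fit is linear, it is fast enough for a 10 Hz loop; a robust variant (e.g. RANSAC over the same model) would likely be needed on real, outlier-prone depth estimates.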