Abstract
Significance: Subretinal injection is an effective way of delivering transplant genes and cells to treat many degenerative retinal diseases. However, the technique requires the high dexterity and microscale precision of experienced surgeons, who must overcome physiological hand tremor and limited visualization of the subretinal space.

Aim: To automatically guide the axial motion of microsurgical tools (i.e., a subretinal injector) with microscale precision in real time using a fiber-optic common-path swept-source optical coherence tomography distal sensor.

Approach: We propose, implement, and study real-time retinal boundary tracking of A-scan optical coherence tomography (OCT) images using a convolutional neural network (CNN) for automatic depth targeting of a selected retinal boundary for accurate subretinal injection guidance. A simplified 1D U-Net is used for retinal layer segmentation on A-scan OCT images. A Kalman filter, combining the retinal boundary position measured by the CNN and the velocity measured by cross-correlation between consecutive A-scan images, is applied to optimally estimate the retinal boundary position. Unwanted axial motion of the surgical tool is compensated by a piezoelectric linear motor based on the retinal boundary tracking.

Results: CNN-based segmentation of A-scan OCT images achieves a mean unsigned error (MUE) of () using an ex vivo bovine retina model. GPU parallel computing allows real-time inference () and thus real-time retinal boundary tracking. Involuntary tremors, which include low-frequency drift of hundreds of micrometers and physiological tremor of tens of micrometers, are compensated effectively. The standard deviations of the photoreceptor (PR) and choroid (CH) boundary positions get as low as () when depth targeting is activated.

Conclusions: A CNN-based common-path OCT distal sensor successfully tracks retinal boundaries, especially the PR/CH boundary for subretinal injection, and automatically guides the tooltip's axial position in real time. The microscale depth-targeting accuracy of our system shows its promise for clinical application.
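The Kalman filter described in the Approach fuses two per-A-scan measurements: the boundary position from the CNN and the boundary velocity from cross-correlation. A minimal sketch of such a constant-velocity Kalman filter is shown below; the function name, time step, and noise parameters are illustrative assumptions, not values from the paper.

```python
import numpy as np

def kalman_track(pos_meas, vel_meas, dt=1e-3,
                 q=1e-2, r_pos=4.0, r_vel=1.0):
    """Fuse noisy position and velocity measurements of a retinal
    boundary with a constant-velocity Kalman filter.

    pos_meas, vel_meas : 1-D arrays of per-A-scan measurements
    Returns the filtered boundary position estimates.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])        # constant-velocity model
    H = np.eye(2)                                 # both states are measured
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])           # process noise
    R = np.diag([r_pos, r_vel])                   # measurement noise
    x = np.array([pos_meas[0], vel_meas[0]])      # initial state
    P = np.eye(2)                                 # initial covariance
    out = np.empty(len(pos_meas))
    for k, z in enumerate(zip(pos_meas, vel_meas)):
        x = F @ x                                 # predict state
        P = F @ P @ F.T + Q                       # predict covariance
        y = np.asarray(z) - H @ x                 # innovation
        S = H @ P @ H.T + R                       # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
        x = x + K @ y                             # update state
        P = (np.eye(2) - K @ H) @ P               # update covariance
        out[k] = x[0]
    return out
```

On noisy measurements of a stationary boundary, the filtered position estimate has a much smaller standard deviation than the raw CNN measurements, which is the property the depth-targeting loop relies on.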
Highlights
Subretinal injection is becoming increasingly prevalent in both scientific research and clinical communities as an efficient way of treating retinal diseases
Lee and Kang: convolutional neural network (CNN)-based common-path optical coherence tomography (CP-OCT) sensor integrated with a subretinal injector for retinal boundary
Surface tracking-based guidance could induce inaccurate depth targeting for subretinal injection because of retinal thickness variations and irregular morphological features caused by retinal diseases
Summary
Subretinal injection is becoming increasingly prevalent in both the scientific research and clinical communities as an efficient way of treating retinal diseases. Microscope-integrated OCT systems have been applied to surgical tool localization and robotic system guidance by intraoperatively providing volumetric images of tissues and surgical tools.[5,6,7,8,9] Fiber-optic common-path OCT (CP-OCT) distal-sensor-integrated handheld surgical devices have been developed to implement simple, compact, and cost-effective microsurgical systems.[10,11,12,13] In those systems, a single-fiber distal sensor attached to a surgical tooltip (i.e., a needle or microforceps) guided the handheld surgical device by real-time A-scan-based surface tracking. Convolutional neural network (CNN)-based retinal layer segmentation has been proposed and has shown promising results.[21,22,23,24] Although the proposed CNN-based methods were developed for B-scan or C-scan OCT image segmentation, they can be applied to A-scan images and operate in real time by simplifying the networks and using GPU parallel computing.
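The A-scan-based tracking described above depends on estimating the axial shift between consecutive depth profiles. A minimal sketch of one common way to do this, via FFT-based circular cross-correlation, is given below; it assumes integer-pixel shifts, and the function name is illustrative rather than taken from the paper.

```python
import numpy as np

def axial_shift(a_prev, a_curr):
    """Estimate the axial (depth) shift, in pixels, between two
    consecutive A-scans by locating the peak of their circular
    cross-correlation, computed in the frequency domain."""
    n = len(a_prev)
    a = a_prev - a_prev.mean()                    # remove DC component
    b = a_curr - a_curr.mean()
    # Circular cross-correlation: ifft(fft(b) * conj(fft(a)))
    xc = np.fft.ifft(np.fft.fft(b) * np.conj(np.fft.fft(a))).real
    lag = int(np.argmax(xc))                      # peak location = shift
    if lag > n // 2:                              # wrap large lags to
        lag -= n                                  # negative shifts
    return lag
```

Dividing the estimated pixel shift by the A-scan interval yields the velocity measurement that a tracking filter can fuse with the CNN's position output.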