Abstract
The purpose of this study was to develop a deep learning-based, fully automated reconstruction and quantification algorithm that delineates the neurites and somas of retinal ganglion cells (RGCs). We trained a deep learning-based multi-task image segmentation model, RGC-Net, that automatically segments the neurites and somas in RGC images. A total of 166 RGC scans with manual annotations from human experts were used to develop this model, of which 132 scans were used for training and the remaining 34 were reserved as testing data. Post-processing techniques removed speckles and dead cells from the soma segmentation results to further improve the robustness of the model. Quantification analyses were also conducted to compare five different metrics obtained by our automated algorithm with those derived from the manual annotations. Quantitatively, our segmentation model achieved average foreground accuracy, background accuracy, overall accuracy, and Dice similarity coefficient of 0.692, 0.999, 0.997, and 0.691 for the neurite segmentation task, and 0.865, 0.999, 0.997, and 0.850 for the soma segmentation task, respectively. The experimental results demonstrate that RGC-Net can accurately and reliably reconstruct neurites and somas in RGC images. We also show that our algorithm is comparable to manually curated human annotations in the quantification analyses. Our deep learning model provides a new tool that can trace and analyze RGC neurites and somas more efficiently and far faster than manual analysis.
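As a point of reference for the reported metrics, the sketch below shows how foreground accuracy, background accuracy, overall accuracy, and the Dice similarity coefficient are conventionally computed from binary segmentation masks. This is an illustrative example using NumPy, not the authors' evaluation code; the function name and epsilon guard are assumptions for the sketch.

```python
# Minimal sketch of the conventional definitions of the four reported metrics,
# computed from a binary prediction mask and a binary ground-truth mask.
import numpy as np

def segmentation_metrics(pred: np.ndarray, gt: np.ndarray) -> dict:
    """pred, gt: arrays of the same shape; nonzero values mark foreground pixels."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)

    tp = np.logical_and(pred, gt).sum()     # foreground pixels correctly labeled
    tn = np.logical_and(~pred, ~gt).sum()   # background pixels correctly labeled
    fp = np.logical_and(pred, ~gt).sum()    # background labeled as foreground
    fn = np.logical_and(~pred, gt).sum()    # foreground labeled as background

    eps = 1e-8  # guard against division by zero for empty masks (assumption)
    return {
        "foreground_accuracy": tp / (tp + fn + eps),            # sensitivity on neurite/soma pixels
        "background_accuracy": tn / (tn + fp + eps),            # specificity on background pixels
        "overall_accuracy": (tp + tn) / (tp + tn + fp + fn + eps),
        "dice": 2 * tp / (2 * tp + fp + fn + eps),              # Dice similarity coefficient
    }
```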