Abstract

Purpose: To develop a neural network (NN)-based approach, with limited training resources, that identifies and counts retinal pigment epithelium (RPE) cells in confocal microscopy images obtained from cell culture or mouse RPE/choroid flat-mounts.

Methods: The training and testing datasets contained two image types: wild-type mouse RPE/choroid flat-mounts and ARPE-19 cells, stained with rhodamine-phalloidin and imaged with confocal microscopy. After image preprocessing for denoising and contrast adjustment, scale-invariant feature transform (SIFT) descriptors were used for feature extraction. Training labels were derived from cells annotated in the original training images and converted to Gaussian density maps. NNs were trained on the resulting input features so that the obtained models accurately predicted the corresponding Gaussian density maps and thus identified and counted the cells in any such image.

Results: The training and testing datasets contained 229 images of ARPE-19 cells and 85 images of RPE/choroid flat-mounts. Within the two datasets, 30% and 10% of the images, respectively, were selected for validation. We achieved 96.48% ± 6.56% and 96.88% ± 3.68% accuracy (95% CI) on the ARPE-19 and RPE/choroid flat-mount images, respectively.

Conclusions: We developed an NN-based approach that can accurately estimate the number of RPE cells contained in confocal images. Our method achieved high accuracy with limited training images, proved effective on images with unclear and curvy cell boundaries, and outperformed existing relevant methods by decreasing prediction error and variance.

Translational Relevance: This approach allows efficient and effective characterization of RPE pathology and, furthermore, the assessment of novel therapeutics.
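The core of the labeling scheme described in Methods is that each annotated cell center becomes a unit impulse blurred by a Gaussian kernel, so the density map integrates to the cell count. Below is a minimal sketch of that conversion, assuming point annotations of cell centers; the kernel width (sigma) is a hypothetical choice, since the abstract does not give the paper's exact parameters.

import numpy as np
from scipy.ndimage import gaussian_filter

def density_map(points, shape, sigma=4.0):
    """Convert point annotations to a Gaussian density map.

    points : iterable of (row, col) annotated cell centers
    shape  : (height, width) of the source image
    sigma  : Gaussian std. dev. in pixels (assumed value)
    """
    impulses = np.zeros(shape, dtype=np.float64)
    for r, c in points:
        impulses[int(r), int(c)] += 1.0  # one unit of mass per cell
    # gaussian_filter (default reflective boundary) preserves total mass,
    # so the map still sums to the number of annotated cells.
    return gaussian_filter(impulses, sigma=sigma)

# Counting then reduces to integrating the (ground-truth or NN-predicted)
# density map:
# count = density_map(annotations, image.shape).sum()

This is why a density-map regression target tolerates unclear and curvy cell boundaries: the network never has to delineate individual cells, only reproduce a smooth map whose integral is the count.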

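The preprocessing and feature-extraction steps named in Methods (denoising, contrast adjustment, SIFT descriptors) could look roughly as follows in OpenCV. The specific operators chosen here (non-local means denoising, CLAHE for contrast) and their parameters, as well as the file name, are assumptions for illustration; the abstract names the steps but not their implementation.

import cv2

# Hypothetical input image; confocal channels are read as grayscale here.
img = cv2.imread("flatmount.tif", cv2.IMREAD_GRAYSCALE)

# Denoising (assumed operator: non-local means).
den = cv2.fastNlMeansDenoising(img, h=10)

# Contrast adjustment (assumed operator: CLAHE).
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
adj = clahe.apply(den)

# SIFT keypoints and their 128-D descriptors, used as NN input features.
sift = cv2.SIFT_create()
keypoints, descriptors = sift.detectAndCompute(adj, None)
print(len(keypoints), None if descriptors is None else descriptors.shape)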