Abstract

Zebrafish (Danio rerio) eyes are widely used to model human ophthalmic diseases, including glaucoma and myopia. These pathologies cause morphological variations in the anterior chamber elements, which can be quantitatively measured using morphometric parameters such as corneal curvature, central corneal thickness, and anterior chamber angle. In the present work, an automated method is presented for iris and corneal segmentation, as well as the determination of the above-mentioned morphometry, from optical coherence tomography (OCT) scans of zebrafish. The proposed method consists of four stages: preprocessing, segmentation, postprocessing, and extraction of morphometric parameters. The first stage combines wavelet and Fourier transforms with gamma correction for artifact removal and reduction. The segmentation step is achieved using the U-Net convolutional neural network. The postprocessing stage applies multilevel thresholding and morphological operations. Finally, three algorithms are proposed for automated morphometric extraction in the last step. The morphometry obtained using our automated framework is compared against manual measurements to assess the effectiveness of the method. The obtained results show that our scheme allows reliable determination of the morphometric parameters, thereby enabling efficient assessment in large-scale studies of zebrafish anterior chamber morphology using OCT scans.
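Two of the stages summarized above, gamma correction in preprocessing and thresholding plus morphological cleanup in postprocessing, can be illustrated with a minimal sketch. This is not the authors' implementation; the function names, the gamma value, and the use of a simple binary threshold with a morphological opening are illustrative assumptions standing in for the paper's multilevel thresholding step.

```python
import numpy as np
from scipy import ndimage


def gamma_correct(img, gamma=0.7):
    """Rescale intensities to [0, 1] and apply gamma correction.

    gamma < 1 brightens dark OCT regions; the value 0.7 is an
    illustrative choice, not a parameter taken from the paper.
    """
    img = img.astype(float)
    lo, hi = img.min(), img.max()
    norm = (img - lo) / (hi - lo + 1e-12)  # avoid division by zero
    return norm ** gamma


def postprocess_mask(prob_map, threshold=0.5, opening_radius=1):
    """Binarize a segmentation probability map, then apply a
    morphological opening to remove speckle-sized false positives
    (a simplified stand-in for multilevel thresholding + morphology).
    """
    mask = prob_map > threshold
    size = 2 * opening_radius + 1
    structure = np.ones((size, size), dtype=bool)
    return ndimage.binary_opening(mask, structure=structure)


# Toy usage: a synthetic "probability map" with one true region
# and one isolated speckle pixel that the opening should remove.
prob = np.zeros((10, 10))
prob[2:8, 2:8] = 0.9   # plausible corneal/iris region
prob[0, 0] = 0.9       # isolated speckle artifact
clean = postprocess_mask(prob)
```

After the opening, the isolated pixel at (0, 0) is gone while the contiguous region survives, which is the qualitative behavior the postprocessing stage relies on before morphometric parameters are measured.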

