Abstract

Background and objective: Retinal image quality assessment is an essential task in the diagnosis of retinal diseases. Deep models for grading the quality of retinal images have recently emerged. However, current models either directly transfer classification networks originally designed for natural images to retinal image quality classification, or introduce extra image quality priors via multiple CNN branches or independent CNNs. The purpose of this work is to address retinal image quality assessment with a simple deep model.

Methods: We propose a dark and bright channel prior guided deep network for retinal image quality assessment, named GuidedNet. It introduces dark and bright channel priors into a deep network without increasing the number of parameters and allows end-to-end training. Specifically, the dark and bright channel priors are embedded into the first layer of the network to improve the discriminative ability of deep features. Moreover, we re-annotate a new retinal image quality dataset, called RIQA-RFMiD, for further validation.

Results: The proposed method is evaluated on the public retinal image quality dataset Eye-Quality and on our re-annotated dataset RIQA-RFMiD. We obtain average F-scores of 88.03% on Eye-Quality and 66.13% on RIQA-RFMiD.

Conclusions: We investigate the utility of dark and bright channel priors for retinal image quality assessment and propose GuidedNet, which embeds these priors into a CNN with little additional model burden. To validate GuidedNet, we also build the re-annotated dataset RIQA-RFMiD. GuidedNet achieves state-of-the-art performance on the public Eye-Quality dataset and on RIQA-RFMiD.
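
The abstract does not specify how the priors are computed or fused, so the following is only a minimal sketch of the standard dark/bright channel prior computation (per-pixel channel-wise min/max followed by a local min/max filter) and one plausible way to feed the resulting maps alongside the RGB image to the first layer of a CNN. The patch size, the concatenation strategy, and the function names are assumptions for illustration, not the authors' released implementation.

```python
# Hypothetical sketch: dark and bright channel priors for retinal images.
import torch
import torch.nn.functional as F

def dark_bright_channels(x: torch.Tensor, patch: int = 15) -> torch.Tensor:
    """x: (N, 3, H, W) RGB tensor in [0, 1].
    Returns (N, 2, H, W): channel 0 = dark channel, channel 1 = bright channel."""
    pad = patch // 2
    # Bright channel: per-pixel max over RGB, then a local max (max pooling).
    bright = F.max_pool2d(x.max(dim=1, keepdim=True).values,
                          kernel_size=patch, stride=1, padding=pad)
    # Dark channel: per-pixel min over RGB, then a local min (negated max pooling).
    dark = -F.max_pool2d(-x.min(dim=1, keepdim=True).values,
                         kernel_size=patch, stride=1, padding=pad)
    return torch.cat([dark, bright], dim=1)

# Example: stack the prior maps with the RGB input so a backbone whose first
# convolution accepts 5 input channels sees the priors from the very start.
images = torch.rand(4, 3, 224, 224)                 # dummy batch of retinal images
priors = dark_bright_channels(images)               # (4, 2, 224, 224)
guided_input = torch.cat([images, priors], dim=1)   # (4, 5, 224, 224)
```

Because the prior maps are computed with parameter-free min/max filtering, this kind of guidance adds no learnable parameters beyond the (negligible) widening of the first convolution, which is consistent with the paper's claim of a lightweight, end-to-end trainable design.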
