Abstract

Recent technological advances have produced various automated diagnostic tools that help prevent retinal diseases. Automatic segmentation of retinal blood vessels (RBV) can aid in detecting such diseases and reduce clinicians' workload. Numerous techniques have been developed for automatic RBV segmentation, but they often fail to deliver high accuracy because of high computational complexity and low efficiency. This work therefore introduces an effective hybrid deep learning technique for RBV segmentation and vessel classification. First, the retinal images are pre-processed to enhance image quality through two steps: image cropping and colour channel conversion. From the pre-processed images, the most important regions are segmented with a new Enhanced Fuzzy C-Means (EFCM) clustering scheme, in which the retinal images are clustered according to blood-vessel thickness, which helps minimize computational complexity. After segmentation, a hybrid deep learning model combining DenseNet and ShuffleNet performs feature extraction and classification. Experiments are implemented in Python on the DRIVE (Digital Retinal Images for Vessel Extraction), STARE (STructured Analysis of the Retina), and HRF (High-Resolution Fundus) databases. The proposed model achieves an accuracy of 99% on DRIVE, 98% on STARE, and 98% on HRF. The result analysis shows that the proposed hybrid deep learning technique is more efficient than state-of-the-art techniques.
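The sketch below is not the authors' implementation; it only illustrates, under stated assumptions, the kind of pipeline the abstract describes: green-channel pre-processing (the specific colour-channel conversion is assumed), standard fuzzy c-means clustering in place of the proposed EFCM, and a hybrid feature extractor built from off-the-shelf DenseNet-121 and ShuffleNetV2 backbones in torchvision. All function names, hyperparameters, and the two-class head are illustrative placeholders.

```python
# Illustrative sketch only, not the paper's released code.
import cv2
import numpy as np
import torch
import torch.nn as nn
from torchvision import models


def preprocess(image_bgr, crop=16):
    """Crop the image border and keep the green channel (a common choice in
    retinal vessel work; the paper's exact conversion is not specified)."""
    h, w = image_bgr.shape[:2]
    cropped = image_bgr[crop:h - crop, crop:w - crop]
    green = cropped[:, :, 1]               # BGR -> green channel
    return cv2.equalizeHist(green)         # simple contrast enhancement


def fuzzy_cmeans(pixels, n_clusters=3, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Plain fuzzy c-means on pixel intensities; the paper's EFCM adds
    enhancements (clustering by vessel thickness) not reproduced here."""
    rng = np.random.default_rng(seed)
    x = pixels.reshape(-1, 1).astype(np.float64)          # (N, 1) samples
    u = rng.random((x.shape[0], n_clusters))
    u /= u.sum(axis=1, keepdims=True)                     # fuzzy memberships
    for _ in range(max_iter):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)[:, None]    # (C, 1) centroids
        dist = np.abs(x - centers.T) + 1e-12              # (N, C) distances
        new_u = 1.0 / (dist ** (2.0 / (m - 1.0)))
        new_u /= new_u.sum(axis=1, keepdims=True)         # membership update
        if np.linalg.norm(new_u - u) < tol:
            u = new_u
            break
        u = new_u
    return u.argmax(axis=1)                               # hard cluster labels


class HybridFeatureExtractor(nn.Module):
    """Concatenates global DenseNet-121 and ShuffleNetV2 features before a
    small classification head (the number of classes is a placeholder)."""
    def __init__(self, num_classes=2):
        super().__init__()
        self.densenet = models.densenet121(weights=None).features
        shuffle = models.shufflenet_v2_x1_0(weights=None)
        self.shufflenet = nn.Sequential(*list(shuffle.children())[:-1])
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Linear(1024 + 1024, num_classes)

    def forward(self, x):
        f1 = self.pool(self.densenet(x)).flatten(1)      # (B, 1024)
        f2 = self.pool(self.shufflenet(x)).flatten(1)    # (B, 1024)
        return self.classifier(torch.cat([f1, f2], dim=1))
```

In practice, the cluster map produced by the clustering step would be used to select vessel regions that are then fed to the hybrid network; the abstract does not specify that mapping, so it is left out of this sketch.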
