Today, deep learning algorithms play a crucial role in the early-stage diagnosis of several fundus diseases, such as glaucoma, hypertension, and diabetic retinopathy, and the area is attracting extensive research attention. During the acquisition of fundus images, artificial spots (caused, for example, by the device itself or by dust particles in the surroundings) can be introduced into the captured images. In this paper, artificial spots in fundus images that arise from the non-standardized conditions of scanning devices are detected with a newly proposed modified U-Net (mU-Net) semantic segmentation model. First, preprocessing methods such as Gaussian blur, thresholding, and the Hough transform are used to create artificial spots; these preprocessed images are then used to train the proposed model. To make the model more effective, the following modifications are applied to the simple U-Net: regularization techniques (early stopping, a larger weight decay, and the Adam optimizer), a decaying learning-rate scheduler, the categorical cross-entropy loss function, and a substantially increased number of filters. In addition to these modifications of the base U-Net, a feature injecting module (FIM) is added between the contraction and expansion sections of the simple U-Net. The FIM injects features of the input image during up-sampling; its addition improves the detection of artificial spots and enhances the performance of the model. The mU-Net is compared with other models, namely the simple U-Net, V-Net, UNet++, ResUnet-a, WideU-Net, and Swin-Unet. A Friedman test conducted on the IOU, DICE, MAE, PSNR, and SSIM scores shows that mU-Net balances these evaluation metrics well; reporting the nonparametric Friedman test also supports reproducibility by establishing the statistical significance of the comparison.
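The spot-creation step described above can be illustrated with a minimal sketch. The helper below overlays a soft circular blob, with a Gaussian intensity falloff, onto a grayscale image to mimic a blurred dust/device artifact; the function name, radius, and falloff parameters are hypothetical and are not taken from the paper, which uses Gaussian blur, thresholding, and the Hough transform in its actual pipeline.

```python
import math

def add_artificial_spot(image, cx, cy, radius, intensity=1.0):
    """Overlay a soft circular 'artificial spot' on a grayscale image.

    `image` is a list of rows with pixel values in [0, 1].  A Gaussian
    falloff (sigma = radius / 2, an illustrative choice) mimics the
    blurred appearance of a dust or device artifact."""
    h, w = len(image), len(image[0])
    sigma = radius / 2.0
    for y in range(max(0, cy - radius), min(h, cy + radius + 1)):
        for x in range(max(0, cx - radius), min(w, cx + radius + 1)):
            d2 = (x - cx) ** 2 + (y - cy) ** 2
            if d2 <= radius ** 2:
                # Brightest at the centre, fading toward the rim.
                image[y][x] = min(1.0,
                                  image[y][x]
                                  + intensity * math.exp(-d2 / (2 * sigma ** 2)))
    return image
```

Training pairs could then consist of the spotted image as input and a mask of the stamped pixels as the segmentation target.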
The IOU, DICE, MAE, PSNR, and SSIM scores of the proposed model indicate superior performance compared with the other models.
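The Friedman test used for the comparison above ranks the competing models on each paired measurement and checks whether the mean ranks differ significantly. A stdlib-only sketch of the chi-square statistic follows; the example data are illustrative and are not the paper's scores.

```python
def friedman_statistic(scores):
    """Friedman chi-square statistic for k models over n paired
    measurements.  `scores[i][j]` is model j's score on measurement i;
    ties receive average ranks."""
    n, k = len(scores), len(scores[0])
    rank_sums = [0.0] * k
    for row in scores:
        order = sorted(range(k), key=lambda j: row[j])
        ranks = [0.0] * k
        i = 0
        while i < k:
            j = i
            # Group equal values so they share an average rank.
            while j + 1 < k and row[order[j + 1]] == row[order[i]]:
                j += 1
            avg = (i + j) / 2.0 + 1.0  # 1-based average rank
            for t in range(i, j + 1):
                ranks[order[t]] = avg
            i = j + 1
        for m in range(k):
            rank_sums[m] += ranks[m]
    # Standard Friedman formula.
    return (12.0 / (n * k * (k + 1))) * sum(r * r for r in rank_sums) \
        - 3.0 * n * (k + 1)
```

In practice one would compute this per metric (IOU, DICE, MAE, PSNR, SSIM) across the seven models and compare the statistic against the chi-square distribution with k-1 degrees of freedom; `scipy.stats.friedmanchisquare` offers an equivalent ready-made implementation.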