Abstract
A deep convolutional neural network image segmentation model based on a cost-effective active learning mechanism, named PolySeg Plus, is proposed. It is intended to address polyp segmentation under a shortage of labeled data and the high false-positive rate of polyp detection. In addition to active learning, which assisted in labeling more image samples, a comprehensive polyp dataset was assembled from five benchmark datasets to increase the number of images. To enhance the captured image features, the locally shared feature method is used; it exploits neighboring features jointly to improve the quality of image features and overcome the drawbacks of the Conditional Random Fields method. Medical image segmentation was performed using the ResUNet++, ResUNet, UNet++, and UNet models. Gaussian noise was removed from the images using a Gaussian filter, and the images were then augmented before being fed into the models. Grid search was used to select the hyperparameters that maximize model performance. The results demonstrate a significant improvement over state-of-the-art methods in polyp segmentation on the CVC-ClinicDB, CVC-ColonDB, ETIS-Larib Polyp DB, KVASIR-SEG, and Kvasir-Sessile datasets, with Dice coefficients of 0.9558, 0.8947, 0.7547, 0.9476, and 0.6023, respectively. The proposed method not only improved the Dice coefficients on the individual datasets but also produced better results on the comprehensive dataset, which will contribute to the development of computer-aided diagnosis systems.
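The Dice coefficients reported above measure the overlap between a predicted segmentation mask and the ground-truth mask: Dice = 2|A ∩ B| / (|A| + |B|). A minimal sketch of the computation for binary masks (the example masks and the epsilon smoothing term are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> float:
    """Dice = 2|A ∩ B| / (|A| + |B|) for two binary masks.

    eps avoids division by zero when both masks are empty (an assumed
    convention, common in segmentation code).
    """
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return float((2.0 * intersection + eps) / (pred.sum() + target.sum() + eps))

# Illustrative 4x4 masks: pred has 4 foreground pixels, target has 3,
# and they overlap in 3 pixels, so Dice = 2*3 / (4+3) = 6/7 ≈ 0.857.
pred = np.array([[1, 1, 0, 0],
                 [1, 1, 0, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
target = np.array([[1, 1, 0, 0],
                   [1, 0, 0, 0],
                   [0, 0, 0, 0],
                   [0, 0, 0, 0]])
print(round(dice_coefficient(pred, target), 4))  # → 0.8571
```

A Dice coefficient of 1.0 indicates perfect overlap, so the reported 0.9558 on CVC-ClinicDB corresponds to near-complete agreement between predicted and ground-truth polyp masks.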
Published in: International Journal of Computational Intelligence Systems