Abstract

Colorectal cancer, also known as bowel cancer, is one of the most common forms of cancer, and it can often be completely cured when diagnosed early. The most effective and objective method of screening and diagnosis is colonoscopy. Polyp segmentation plays a crucial role in the diagnosis and treatment of diseases of the digestive system, providing doctors with detailed boundary information during clinical analysis. To this end, we propose a novel lightweight feature-refining and context-guided network (FRCNet) for real-time polyp segmentation. In this method, we first employ an enhanced context-calibrated module to extract the most discriminative features, developing long-range spatial dependence through a context-calibrated operation. This operation helps alleviate interference from background noise and effectively distinguishes target polyps from the background. Furthermore, we design a progressive context-aware fusion module to dynamically capture multi-scale polyps by collecting multi-range context information. Finally, a multi-scale pyramid aggregation module learns more representative features, which are fused to refine the segmentation results. Extensive experiments on the Kvasir, ClinicDB, ColonDB, ETIS, and Endoscene datasets demonstrate the effectiveness of the proposed model. Specifically, FRCNet achieves an mIoU of 84.9% and an mDice score of 91.5% on the Kvasir dataset with a model size of only 0.78 M parameters, outperforming state-of-the-art methods. Models and code are available online.1
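The context-calibrated operation is described only at a high level here. As a rough illustration (an assumption for exposition, not the paper's actual FRCNet implementation), one common form of such calibration pools the feature map to summarize long-range spatial context, then reweights each position with a sigmoid gate derived from that context, which suppresses background responses:

```python
import numpy as np

def context_calibrate(feature, pool=4):
    """Hedged sketch of a context-calibration gate (illustrative only,
    not the paper's exact operation): average-pool to capture long-range
    context, then gate the original feature map with a sigmoid of that
    context so background regions are attenuated."""
    h, w = feature.shape  # assumes h and w are divisible by `pool`
    ph, pw = h // pool, w // pool
    # Coarse context via non-overlapping average pooling.
    coarse = feature.reshape(ph, pool, pw, pool).mean(axis=(1, 3))
    # Upsample back to full resolution by nearest-neighbour repetition.
    context = np.repeat(np.repeat(coarse, pool, axis=0), pool, axis=1)
    # Sigmoid gate: calibrate each position by its surrounding context.
    gate = 1.0 / (1.0 + np.exp(-context))
    return feature * gate

fmap = np.random.rand(32, 32)  # a single-channel feature map
out = context_calibrate(fmap)
print(out.shape)  # (32, 32)
```

Because the gate lies in (0, 1), the calibrated map never amplifies a response; it only rescales each position according to how strongly its neighbourhood activates, which is one simple way to realize the long-range dependence the abstract describes.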
