Abstract
Accurate skin lesion segmentation in dermoscopic images is crucial for the early diagnosis of skin cancer. Nevertheless, complex interfering factors, such as natural and artificial artifacts (e.g., hair and air bubbles), irregular appearance (e.g., varied shapes and low contrast), and significant differences in color shade, make skin lesion segmentation challenging. In this study, we propose a superpixel-guided generative adversarial network (GAN) with dual-stream patch-based discriminators for skin lesion segmentation. Specifically, a new multi-scale context extraction module (MCEM) is designed in the mask generator to enrich contextual information and capture boundary features of the lesion region, thereby accurately locating the lesion boundary. Meanwhile, a superpixel-guided branch is added to the discriminator module, supplying more compact and semantic information that enhances its discriminative ability. With the dual-branch patch-based discriminators, fine-grained discrimination of local segmentation details is further strengthened, which in turn drives the generator to produce more accurate segmentation masks. Comprehensive experiments show that the proposed model achieves strong segmentation performance on the ISIC2016, ISIC2018, and HAM10000 skin lesion challenge datasets and outperforms several competitive deep convolutional neural networks.
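To make the dual-stream discriminator idea concrete, the following PyTorch sketch shows one plausible realization: a pixel-level PatchGAN-style branch scoring image-mask pairs directly, and a second branch scoring the same pair after superpixel-guided pooling. This is a minimal illustration under stated assumptions, not the authors' implementation; the superpixel labels are assumed to be precomputed (e.g., with SLIC), and all module names (superpixel_pool, PatchBranch, DualStreamDiscriminator), channel widths, and the superpixel count are hypothetical.

```python
import torch
import torch.nn as nn

def superpixel_pool(x, labels, n_segments):
    # x: (B, C, H, W) features; labels: (B, H, W) long tensor of superpixel ids.
    # Averages features within each superpixel and scatters the mean back,
    # yielding a compact, region-consistent representation.
    B, C, H, W = x.shape
    flat = x.reshape(B, C, H * W)                       # (B, C, N)
    idx = labels.reshape(B, 1, H * W).expand(B, C, -1)  # (B, C, N)
    sums = torch.zeros(B, C, n_segments, device=x.device).scatter_add_(2, idx, flat)
    counts = torch.zeros(B, 1, n_segments, device=x.device).scatter_add_(
        2, labels.reshape(B, 1, H * W),
        torch.ones(B, 1, H * W, device=x.device))
    means = sums / counts.clamp(min=1)
    return means.gather(2, idx).reshape(B, C, H, W)

class PatchBranch(nn.Module):
    # PatchGAN-style branch: outputs a grid of real/fake logits, one per
    # local receptive field, rather than a single image-level score.
    def __init__(self, in_ch):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 64, 4, 2, 1), nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(64, 128, 4, 2, 1), nn.InstanceNorm2d(128),
            nn.LeakyReLU(0.2, inplace=True),
            nn.Conv2d(128, 1, 4, 1, 1))  # per-patch logits
    def forward(self, x):
        return self.net(x)

class DualStreamDiscriminator(nn.Module):
    # Branch A sees the raw image+mask pair; branch B sees the same pair
    # after superpixel pooling, supplying region-level evidence.
    def __init__(self, in_ch=4, n_segments=200):  # 3 image + 1 mask channels
        super().__init__()
        self.n_segments = n_segments
        self.pixel_branch = PatchBranch(in_ch)
        self.superpixel_branch = PatchBranch(in_ch)
    def forward(self, image, mask, sp_labels):
        pair = torch.cat([image, mask], dim=1)
        return (self.pixel_branch(pair),
                self.superpixel_branch(
                    superpixel_pool(pair, sp_labels, self.n_segments)))
```

In this sketch, the superpixel branch cannot exploit pixel-level noise: every pixel inside a superpixel carries the same pooled value, so its patch scores reflect region-level agreement between the image and the mask, which is one way the "more compact and semantic information" described in the abstract could sharpen the adversarial signal for local segmentation details.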