Abstract

Background and problems: Automated skin lesion segmentation is a crucial step in the computer-aided diagnosis (CAD) pipeline for skin diseases. Recently, fully convolutional networks (FCNs) have achieved outstanding performance on this task. However, the task remains challenging because of three problems: (1) difficult cases in dermoscopy images, including low-contrast lesions and bubble or hair occlusions; (2) the overfitting of FCN-based methods caused by imbalanced training between difficult and easy samples; (3) the over-segmentation produced by FCN-based methods.

Method: This work proposes a new skin lesion segmentation framework. Specifically, feature representations are learned from dermoscopy images by an Adaptive Feature Learning Network (AFLN). An ensemble learning method is introduced to build a fusion model, enabling the AFLN to capture multi-scale information. We propose Difficulty-Guided Curriculum Learning (DGCL) with a step-wise training strategy to handle the overfitting caused by imbalanced training. Finally, a Selecting-The-Biggest-Connected-Region (STBCR) step is proposed to alleviate the over-segmentation of the fusion model.

Experimental results: Performance is compared, using the same metrics (DICE, JAC, and ACC), with other state-of-the-art works on the publicly available ISIC 2016, ISIC 2017, and ISIC 2018 databases; the results (0.931, 0.875, and 0.966), (0.881, 0.807, and 0.948), and (0.920, 0.856, and 0.966) illustrate its advantages.

Conclusion: The excellent and robust performance on three public databases shows that our method has the potential to be applied to CAD of skin diseases.
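The abstract only names the STBCR post-processing step; its exact implementation is not given. As a minimal sketch, assuming a binary segmentation mask and 4-connectivity (both assumptions, not details from the paper), selecting the biggest connected region could look like the following, using a plain BFS connected-component search:

```python
from collections import deque

def keep_biggest_connected_region(mask):
    """Keep only the largest 4-connected region of 1s in a binary mask.

    A hypothetical sketch of STBCR-style post-processing: label every
    connected foreground region, then zero out all but the largest one.
    """
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    best = []  # pixel coordinates of the largest region found so far
    for i in range(h):
        for j in range(w):
            if mask[i][j] and not seen[i][j]:
                # BFS to collect one connected foreground region
                region, queue = [], deque([(i, j)])
                seen[i][j] = True
                while queue:
                    y, x = queue.popleft()
                    region.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if len(region) > len(best):
                    best = region
    # Rebuild the mask with only the largest region kept
    out = [[0] * w for _ in range(h)]
    for y, x in best:
        out[y][x] = 1
    return out
```

In practice a library routine such as `scipy.ndimage.label` would do the labeling step, but the pure-Python version above makes the idea explicit: spurious small blobs from over-segmentation are discarded while the dominant lesion region is preserved.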

