Abstract

Accurate skin lesion segmentation plays a fundamental role in computer-aided melanoma analysis. Recently, several FCN-based methods have been proposed and have achieved promising results in lesion segmentation tasks. However, because skin lesions exhibit variable shapes, different scales, noise interference, and ambiguous boundaries, these methods still fall short in lesion localization and boundary delineation. To overcome these challenges, in this paper we propose a novel Neighborhood Context Refinement Network (NCRNet) that adopts a coarse-to-fine strategy to achieve accurate skin lesion segmentation. The proposed NCRNet contains a shared encoder and two distinct but closely related decoders for locating skin lesions and refining their boundaries. Specifically, we first design the Parallel Attention Decoder (PAD), which effectively extracts and fuses local detail information and global semantic information at multiple levels to locate skin lesions of different sizes and shapes. Then, starting from the initial lesion location, we further design the Neighborhood Context Refinement Decoder (NCRD), which leverages fine-grained multi-stage neighborhood context cues to continuously refine the lesion boundaries. Furthermore, the neighborhood-based deep supervision used in the NCRD encourages the shared encoder to attend to the lesion boundary areas and promotes convergence of the segmentation network. The public skin lesion segmentation dataset ISIC2017 is adopted to validate the effectiveness of the proposed NCRNet. Comprehensive experiments show that NCRNet outperforms nine competitive methods, achieving state-of-the-art performance with 78.62%, 86.55%, and 94.01% in Jaccard index, Dice coefficient, and accuracy, respectively.

