Abstract

Melanoma is a malignant tumor that can be cured with a high success rate when detected and treated early. Deep learning is currently one of the research hotspots in automatic diagnosis, and medical image segmentation methods fall into two primary categories: Convolutional Neural Network (CNN) methods such as U-Net, and Transformer-based methods. Despite their remarkable segmentation performance, both approaches have inherent shortcomings that cannot be overlooked. The convolution operation in CNN architectures cannot adequately capture global dependencies, which significantly degrades segmentation performance. The Multi-head Self-Attention mechanism in Transformers can efficiently extract global features, but its large computational complexity and lack of local inductive bias cannot be ignored. These factors substantially reduce the precision of clinical diagnosis based on automated segmentation. To address these issues, this paper proposes a U-shaped network, PHCU-Net, comprising a global feature block, a local feature block, and a dual-branch hierarchical attention mechanism designed specifically for skin lesion segmentation. In addition, we argue that the traditional skip connection can be improved to capture stronger context by incorporating convolutional attention, so that the decoder can fully exploit both global and local feature information. These innovations enable our model to produce superior skin lesion segmentation results compared with classical algorithms and other deep learning-based models. Extensive experiments on three open-source datasets (ISIC2017, ISIC2018, and PH2) demonstrate the effectiveness of the proposed PHCU-Net for melanoma lesion segmentation.
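The computational-complexity concern raised above comes from the attention score matrix: for a sequence of n patch embeddings, self-attention forms an n-by-n matrix, so cost grows quadratically with the number of patches, whereas a convolution touches only a fixed local neighborhood. A minimal NumPy sketch of single-head scaled dot-product self-attention (an illustration of the generic mechanism, not the authors' PHCU-Net code; the dimensions and weights are made up for demonstration) makes the quadratic term visible:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x, w_q, w_k, w_v):
    # x: (n, d) matrix of n patch embeddings of dimension d.
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    # The score matrix is (n, n): this is the quadratic-in-n term
    # that makes Transformers expensive on high-resolution images.
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores) @ v

# Toy example: 16 patches with embedding dimension 8 (illustrative sizes).
rng = np.random.default_rng(0)
n, d = 16, 8
x = rng.standard_normal((n, d))
w_q, w_k, w_v = (rng.standard_normal((d, d)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (16, 8); the internal score matrix is (16, 16)
```

Doubling the image resolution quadruples n and hence multiplies the score-matrix cost by sixteen, which is why hybrid designs such as the one proposed here pair attention with cheap local convolution.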

