Abstract
The present research examines the enhancement of skin lesion segmentation with U-Net++. Accurate classification of dermoscopy images is heavily contingent on the precise segmentation of skin lesions or nodules. However, this task is considerably challenging due to elusive edges, irregular perimeters, and variations both within and across lesion classes. Although numerous algorithms exist for segmenting skin lesions from dermoscopic images, they often fall short of industry benchmarks in precision. To address this, our research introduces a novel U-Net++ architecture with tailored adjustments to feature map dimensions, aiming to significantly improve automated segmentation precision for dermoscopic images. Our evaluation comprised a comprehensive assessment of the model's performance, exploring parameters such as epochs, batch size, and optimizer selection. Additionally, we conducted extensive testing with augmentation techniques to increase the image volume of the HAM10000 dataset. A key innovation in our research is the integration of a hair removal process into the U-Net++ pipeline, which significantly enhances image quality and subsequently improves segmentation accuracy. The proposed method demonstrates substantial advancements, achieving a Mean Intersection over Union (IoU) of 84.1%, a Mean Dice Coefficient of 91.02%, and a Segmentation Test Accuracy of 95.10%. The proposed U-Net++ algorithm outperforms U-Net, Modified U-Net, K-Nearest Neighbors (KNN), and Support Vector Machine (SVM) in segmentation quality, demonstrating its potential to improve dermoscopy image analysis. Our proposed algorithm shows marked improvement in both observational outcomes and classifier performance.
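The IoU and Dice metrics reported above have standard definitions for binary segmentation masks: IoU is the ratio of the intersection to the union of predicted and ground-truth lesion pixels, and Dice is twice the intersection divided by the sum of the two mask areas. The following is a minimal NumPy sketch of these metrics (the function names and the epsilon smoothing term are illustrative choices, not part of the paper's implementation):

```python
import numpy as np

def mean_iou(pred, target, eps=1e-7):
    """Intersection over Union for two binary masks.

    pred, target: boolean (or 0/1) arrays of the same shape.
    eps avoids division by zero when both masks are empty.
    """
    intersection = np.logical_and(pred, target).sum()
    union = np.logical_or(pred, target).sum()
    return (intersection + eps) / (union + eps)

def dice_coefficient(pred, target, eps=1e-7):
    """Dice coefficient: 2|A ∩ B| / (|A| + |B|)."""
    intersection = np.logical_and(pred, target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

# Toy example: a 2x2 prediction overlapping the ground truth in one pixel.
pred = np.array([[1, 1], [0, 0]], dtype=bool)
target = np.array([[1, 0], [0, 0]], dtype=bool)
print(mean_iou(pred, target))          # ≈ 0.5  (1 shared pixel / 2 total)
print(dice_coefficient(pred, target))  # ≈ 0.667 (2·1 / (2 + 1))
```

In practice these scores are averaged over the test set to produce the mean values quoted in the abstract.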