Abstract

Early identification of melanocytic skin lesions increases the survival rate for skin cancer patients. Automated extraction of melanocytic skin lesions from dermoscopic images using computer vision is challenging because lesions vary in color, size, and shape, and the contrast near lesion boundaries can differ from image to image. Lesion extraction from dermoscopic images is therefore a fundamental step in automated melanoma identification. In this article, a watershed transform combined with the fast fuzzy c-means (FCM) clustering algorithm is proposed for extracting melanocytic skin lesions from dermoscopic images. The proposed method first removes artifacts from the dermoscopic images and enhances the texture regions. The image is then filtered with a Gaussian filter and a local variance filter to enhance the lesion boundary regions. Next, a watershed transform based on multiscale morphological local variance reconstruction (MMLVR) is applied to obtain superpixels with accurate boundary regions. Finally, fast FCM clustering is applied to the superpixels to obtain the final lesion extraction result. The proposed method is evaluated on three publicly available skin lesion image datasets, i.e., ISIC 2016, ISIC 2017, and ISIC 2018. Experimental evaluation shows that the proposed method achieves good results.
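
A minimal sketch of the pipeline described above, built from standard SciPy/scikit-image components, is given below. It is not the authors' implementation: the MMLVR-based watershed is approximated by a plain marker-controlled watershed on a local-variance map, the fast FCM step is replaced by a textbook FCM run on per-superpixel mean intensities, and all window sizes, sigmas, and thresholds are illustrative assumptions.

```python
# Sketch of the lesion-extraction pipeline: Gaussian smoothing, local-variance
# boundary emphasis, watershed superpixels, and FCM clustering of superpixels.
import numpy as np
from scipy import ndimage as ndi
from skimage import filters, segmentation

def local_variance(img, size=5):
    """Local variance E[x^2] - E[x]^2 over a (size x size) window."""
    mean = ndi.uniform_filter(img, size)
    mean_sq = ndi.uniform_filter(img * img, size)
    return np.clip(mean_sq - mean * mean, 0, None)

def fcm(features, n_clusters=2, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy c-means on an (N, d) feature matrix; returns memberships and centers."""
    rng = np.random.default_rng(seed)
    u = rng.random((len(features), n_clusters))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        w = u ** m
        centers = (w.T @ features) / w.sum(axis=0)[:, None]
        dist = np.linalg.norm(features[:, None, :] - centers[None], axis=2) + 1e-12
        u = 1.0 / (dist ** (2.0 / (m - 1.0)))
        u /= u.sum(axis=1, keepdims=True)
    return u, centers

def extract_lesion(gray):
    """gray: 2-D float image in [0, 1] with artifacts already removed."""
    smoothed = filters.gaussian(gray, sigma=2)                 # Gaussian filter
    edge_map = local_variance(smoothed, size=5)                # boundary emphasis
    markers = ndi.label(edge_map < np.percentile(edge_map, 40))[0]
    superpixels = segmentation.watershed(edge_map, markers)    # superpixel regions
    labels = np.unique(superpixels)
    feats = np.array([[gray[superpixels == l].mean()] for l in labels])
    u, centers = fcm(feats, n_clusters=2)
    lesion_cluster = np.argmin(centers[:, 0])                  # assume lesions are darker
    lesion_labels = labels[u.argmax(axis=1) == lesion_cluster]
    return np.isin(superpixels, lesion_labels)                 # boolean lesion mask
```

Only the mean grayscale intensity of each superpixel is used as a feature here; the paper works on color dermoscopic images after artifact removal and texture enhancement, so this should be read as an outline of the processing order rather than the method itself.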

Highlights

  • Skin cancer is quite prevalent throughout the world

  • The dominant orientation-based texture histogram equalization (DOTHE) algorithm comprises six steps (a hedged code sketch appears after this list): (i) the image to be enhanced is divided into blocks; (ii) a variance threshold is applied to each block to separate smooth blocks from rough blocks; (iii) the rough blocks are further divided into dominant and non-dominant orientation blocks based on the singular value decomposition (SVD) of each block's gradient vectors; (iv) the intensity histogram is computed from the dominant orientation blocks; (v) using the cumulative distribution function (CDF), the texture histogram is mapped to a new dynamic range; (vi) the texture-enhanced image is obtained by applying the mapped histogram

  • The structuring element (SE) used in Section 4.3.3 was of the disk type and had a size of 3
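
Below is a hedged sketch of the DOTHE steps listed above, assuming a grayscale uint8 input. The block size, the variance threshold, and the singular-value ratio used to decide a dominant orientation are illustrative choices rather than values from the paper, and step (v) is simplified to an equalization driven directly by the CDF of the texture histogram.

```python
# Illustrative DOTHE-style texture histogram equalization.
import numpy as np

def dothe(img, block=16, var_thresh=50.0, orient_thresh=2.0):
    """img: 2-D uint8 grayscale image; returns a texture-enhanced image."""
    h, w = img.shape
    gy, gx = np.gradient(img.astype(float))                    # gradient vectors
    texture_hist = np.zeros(256)
    for r in range(0, h - block + 1, block):                   # (i) split into blocks
        for c in range(0, w - block + 1, block):
            patch = img[r:r + block, c:c + block]
            if patch.var() < var_thresh:                       # (ii) smooth block: skip
                continue
            g = np.stack([gx[r:r + block, c:c + block].ravel(),
                          gy[r:r + block, c:c + block].ravel()], axis=1)
            s = np.linalg.svd(g, compute_uv=False)             # (iii) SVD of gradients
            if s[0] < orient_thresh * (s[1] + 1e-6):
                continue                                       # non-dominant orientation
            texture_hist += np.bincount(patch.ravel(), minlength=256)  # (iv) histogram
    if texture_hist.sum() == 0:
        return img
    cdf = np.cumsum(texture_hist) / texture_hist.sum()         # (v) CDF-based mapping
    mapping = np.round(255 * cdf).astype(np.uint8)
    return mapping[img]                                        # (vi) apply mapping
```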

Introduction

Skin cancer is quite prevalent throughout the world and affects both males and females of all ages. The system works on dermoscopic images, and the diagnosis process comprises several stages, such as preprocessing, lesion extraction, and lesion classification, in which melanoma is detected by ignoring the artifacts present in the affected region and segregating the skin lesion accurately from healthy skin. The literature reports several challenges in effective lesion extraction; to address them, the proposed method uses a local variance measure instead of gradient-based boundary detection (a brief illustration of this choice follows). The proposed method extracts lesions by removing undesired artifacts and enhancing the lesion regions relative to the healthy skin regions. Being unsupervised, the proposed approach extracts the skin lesion effectively: the local variance measure yields accurate boundary regions and helps separate lesions from healthy skin.
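
The short snippet below (a sketch, not the authors' code) simply produces a conventional gradient-based edge map and a local-variance map for the same image so the two can be compared visually; `local_variance` refers to the helper defined in the sketch after the Abstract, and the file name is hypothetical.

```python
# Compare gradient-based boundaries with local-variance boundaries.
from skimage import filters, io, util

img = util.img_as_float(io.imread("dermoscopy_sample.png", as_gray=True))  # hypothetical file
gradient_map = filters.sobel(img)           # conventional gradient-based edge map
variance_map = local_variance(img, size=7)  # local-variance map (window size assumed)
```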

Related Work
Texture Enhancement
Gaussian Filter
Local Variance for Boundary Region Extraction
Results and Discussion
Method
Conclusions