Abstract

The segmentation of psoriasis skin lesions from RGB color images is a challenging task in computer vision, owing to poor illumination conditions, the irregular shapes and sizes of psoriasis lesions, the fuzzy boundaries between lesions and the surrounding skin, and artifacts such as skin hair and camera reflections. Manual segmentation of lesions is time-consuming and laborious for the dermatologist, and various automatic lesion segmentation approaches have therefore been presented in the recent past. However, these existing state-of-the-art approaches have several limitations: they depend heavily on feature engineering, perform poorly in terms of accuracy, and fail to consider the challenging cases described above. In view of this, we present an automated psoriasis lesion segmentation method based on a modified U-Net architecture, referred to as PsLSNet. The architecture consists of a 29-layer deep fully convolutional network that extracts spatial information automatically. The U-Net architecture comprises two paths, a contracting path and an expansive path, connected in a U-shape. The proposed convolutional neural network also accelerates training by reducing covariate shift through batch normalization, and it can segment lesions even in challenging cases, such as under poor acquisition conditions and in the presence of artifacts. In our experiments, we use 5241 images of psoriasis lesions collected from 1026 psoriasis patients by a dermatologist. The experimental results show strong performance: a Dice coefficient of 93.03% and an accuracy of 94.80%, with 89.60% sensitivity and 97.60% specificity, values significantly higher than those of existing approaches.
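The metrics reported above follow the standard pixel-wise definitions over the segmentation confusion matrix. As a minimal sketch (the function names and the flat-list mask representation are illustrative, not from the paper), the Dice coefficient, accuracy, sensitivity, and specificity can be computed from a predicted binary mask and a ground-truth mask as follows:

```python
def confusion_counts(pred, truth):
    """Count true/false positives and negatives for two flat binary masks."""
    tp = sum(1 for p, t in zip(pred, truth) if p == 1 and t == 1)
    tn = sum(1 for p, t in zip(pred, truth) if p == 0 and t == 0)
    fp = sum(1 for p, t in zip(pred, truth) if p == 1 and t == 0)
    fn = sum(1 for p, t in zip(pred, truth) if p == 0 and t == 1)
    return tp, tn, fp, fn

def segmentation_metrics(pred, truth):
    """Return (dice, accuracy, sensitivity, specificity) as fractions in [0, 1]."""
    tp, tn, fp, fn = confusion_counts(pred, truth)
    dice = 2 * tp / (2 * tp + fp + fn)          # overlap between the two masks
    accuracy = (tp + tn) / (tp + tn + fp + fn)  # fraction of pixels labeled correctly
    sensitivity = tp / (tp + fn)                # fraction of lesion pixels recovered
    specificity = tn / (tn + fp)                # fraction of skin pixels recovered
    return dice, accuracy, sensitivity, specificity

# Example on a toy 6-pixel mask pair:
dice, acc, sens, spec = segmentation_metrics([1, 1, 0, 0, 1, 0],
                                             [1, 0, 0, 0, 1, 1])
```

In practice the masks would be the flattened network output (thresholded at 0.5) and the dermatologist's annotation; the same counts are simply accumulated over all test images.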

