Abstract
Skin diseases are a significant global public health concern, affecting 21-85% of the world's population, particularly in low- and middle-income countries. Accurate and timely diagnosis is crucial for effective treatment and improved patient outcomes. This study introduces a novel multi-model deep-learning architecture designed for high-precision skin disease diagnosis. The system first employs an Xception model to classify skin lesions into five categories: Atopic Dermatitis, Acne and Rosacea, Skin Cancer, Bullous, and Others. Trained on 25,010 images, this model achieved 95% accuracy and an AUROC of 99.4%. To further enhance accuracy, transfer learning was applied to build specialized models for each category, yielding strong performance across 40 skin conditions. Specifically, the Acne and Rosacea model achieved an accuracy of 90.0%, with a precision of 90.7%, recall of 90.1%, F1-score of 90.2%, and an AUROC of 99.0%. The Skin Cancer model demonstrated 94.0% accuracy, 94.8% precision, 94.2% recall, 94.1% F1-score, and a 99.5% AUROC. The Atopic Dermatitis model reported 91.8% accuracy, 92.2% precision, 91.8% recall, 91.9% F1-score, and a 98.8% AUROC. Finally, the Bullous model showed 90.0% accuracy, 90.6% precision, 90.0% recall, 90.0% F1-score, and a 98.9% AUROC. This approach surpasses previous studies, offering a more comprehensive diagnostic tool for skin diseases. To facilitate reproducibility, the training and testing code for the models used in this study is available in the GitHub repository (https://github.com/SaraEl-Metwally/A-Multi-Model-Deep-Learning-for-Diagnosing-Skin-Diseases).
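The sketch below illustrates the two-stage design described in the abstract: a five-category Xception classifier followed by per-category specialist models initialized from it via transfer learning. It is a minimal, hypothetical outline assuming a Keras/TensorFlow implementation; the class names, sub-class counts, image size, dropout rate, and optimizer settings are illustrative assumptions, not the authors' exact configuration (see the linked repository for the actual code).

```python
# Minimal sketch (not the authors' exact code): a five-category Xception base
# classifier plus per-category specialist models built by transfer learning.
# Sub-class counts and hyperparameters below are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import Xception

IMG_SIZE = (299, 299)   # Xception's native input resolution
TOP_LEVEL_CLASSES = 5   # Atopic Dermatitis, Acne/Rosacea, Skin Cancer, Bullous, Others

def build_classifier(num_classes: int) -> tf.keras.Model:
    """Xception backbone (ImageNet weights) with a new softmax head."""
    backbone = Xception(include_top=False, weights="imagenet",
                        input_shape=IMG_SIZE + (3,), pooling="avg")
    x = layers.Dropout(0.3)(backbone.output)
    outputs = layers.Dense(num_classes, activation="softmax")(x)
    model = models.Model(backbone.input, outputs)
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
                  loss="categorical_crossentropy",
                  metrics=["accuracy", tf.keras.metrics.AUC(name="auroc")])
    return model

# Stage 1: five-category model trained on the full dataset.
base_model = build_classifier(TOP_LEVEL_CLASSES)

# Stage 2: one specialist model per top-level category, initialized with the
# backbone weights learned in stage 1 (sub-class counts are hypothetical).
specialist_subclasses = {"acne_rosacea": 10, "skin_cancer": 12,
                         "atopic_dermatitis": 9, "bullous": 9}
specialists = {}
for name, n_sub in specialist_subclasses.items():
    m = build_classifier(n_sub)
    # Copy all weights except the final Dense head, which differs in size.
    for src, dst in zip(base_model.layers[:-1], m.layers[:-1]):
        dst.set_weights(src.get_weights())
    specialists[name] = m
```

In this two-stage scheme, each specialist would then be fine-tuned only on images from its own top-level category, so the final prediction is obtained by routing an input through the five-way model and then through the matching specialist.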