Skin cancer, particularly melanoma, poses a significant global health challenge due to its prevalence and mortality rate. Early detection is critical to improving outcomes, as advanced cases become increasingly difficult to treat. The advent of Artificial Intelligence (AI) and Explainable AI (XAI) techniques has transformed dermatological diagnostics by offering accurate and interpretable solutions. This systematic review investigates the integration of XAI in skin lesion classification, analyzing 22 studies published between 2019 and 2023. The studies encompass diverse approaches, including deep learning models such as CNNs, ResNet, DenseNet, and MobileNet, as well as explainability techniques such as Grad-CAM, SHAP, and saliency maps. Results highlight significant advances in accuracy and interpretability, with some models achieving over 99% accuracy on datasets such as ISIC 2018 and HAM10000. However, challenges persist, including dataset imbalances, limited diversity in patient metadata, and poor generalizability across different skin types and imaging conditions. By visualizing model decision pathways, XAI methods enhance transparency, fostering trust among clinicians and enabling smoother AI integration into clinical practice. This review underscores the potential of combining state-of-the-art AI models with explainable frameworks to address the complexities of skin lesion diagnostics. It advocates for future research to prioritize diverse, metadata-rich datasets, innovative optimization techniques, and robust architectures in order to develop reliable, interpretable diagnostic tools. By bridging the gap between advanced AI and user understanding, this work contributes to the creation of clinically applicable, trustworthy AI-driven healthcare solutions.
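To make the explainability technique most often cited above concrete, the following is a minimal NumPy sketch of the core Grad-CAM computation: a convolutional layer's feature maps are weighted by the spatially averaged gradients of the target class score and passed through a ReLU to retain positive evidence. The synthetic arrays stand in for activations and gradients that, in a real classifier, would be captured via framework hooks; the shapes and names here are illustrative assumptions, not taken from any of the reviewed studies.

```python
import numpy as np

def grad_cam(feature_maps: np.ndarray, gradients: np.ndarray) -> np.ndarray:
    """Compute a Grad-CAM heatmap.

    feature_maps: conv-layer activations, shape (K, H, W).
    gradients: d(class score)/d(activations), same shape (K, H, W).
    Returns an (H, W) heatmap scaled to [0, 1].
    """
    # alpha_k: global-average-pool each channel's gradients (importance weights)
    alphas = gradients.mean(axis=(1, 2))  # shape (K,)
    # Weighted sum of feature maps, then ReLU to keep only positive evidence
    cam = np.maximum((alphas[:, None, None] * feature_maps).sum(axis=0), 0.0)
    if cam.max() > 0:
        cam /= cam.max()  # normalize for visualization as an overlay
    return cam

# Toy example with synthetic data: K=4 channels over a 7x7 spatial grid
rng = np.random.default_rng(0)
fmaps = rng.random((4, 7, 7))
grads = rng.standard_normal((4, 7, 7))
heatmap = grad_cam(fmaps, grads)
print(heatmap.shape)  # (7, 7)
```

In clinical use, the resulting heatmap is upsampled to the input image size and overlaid on the dermoscopic image, letting a clinician check whether the model attended to the lesion itself rather than artifacts such as rulers or hair.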