Abstract

Dermatologists typically require extensive experience to classify skin cancer accurately. In recent years, advances in computer vision and machine learning have provided new methods for assisted diagnosis. Existing skin cancer image classification methods have certain limitations, such as poor interpretability, the need for domain knowledge in feature extraction, and the neglect of lesion-area information in skin images. This paper proposes a new genetic programming (GP) approach to automatically learn global and/or local features from skin images for classification. To achieve this, a new function set and a new terminal set have been developed. The proposed GP method can automatically and flexibly extract effective local/global features from different types of input images, thus providing a comprehensive description of skin images. A new region detection function has been developed to select lesion areas from skin images for feature extraction. The performance of this approach is evaluated on three skin cancer image classification tasks and compared with three GP methods and six non-GP methods. The experimental results show that the new approach achieves significantly better or comparable performance in most cases. Further analysis validates the effectiveness of the parameter settings, visualizes the region detection functions used in an individual evolved by the proposed approach, and demonstrates its good convergence behavior.
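To make the pipeline described above concrete, the following is a minimal illustrative sketch in Python of how an evolved GP individual might combine a region detection function with local and global feature extraction. All function names, descriptors, and coordinates here are hypothetical assumptions for illustration; they are not the paper's actual function set or terminal set.

```python
# Illustrative sketch only: hypothetical stand-ins for the kind of tree the
# paper describes (region detection -> local/global feature extraction).
# This is NOT the authors' actual function/terminal set.
import numpy as np

def detect_region(image, x, y, w, h):
    """Hypothetical region detection function: crop a candidate lesion area."""
    return image[y:y + h, x:x + w]

def global_features(image):
    """Simple global descriptors over the whole image (assumed, for illustration)."""
    return np.array([image.mean(), image.std()])

def local_features(region):
    """Simple local descriptors over a detected lesion region (assumed)."""
    hist, _ = np.histogram(region, bins=8, range=(0, 255), density=True)
    return hist

def evaluate_tree(image):
    """One evolved individual might concatenate global and local features."""
    lesion = detect_region(image, x=40, y=40, w=64, h=64)  # evolved coordinates
    return np.concatenate([global_features(image), local_features(lesion)])

# Usage: the resulting feature vector feeds any downstream classifier.
img = np.random.randint(0, 256, size=(128, 128)).astype(np.uint8)
features = evaluate_tree(img)
print(features.shape)  # (10,) = 2 global + 8 local descriptors
```

In the actual GP system, the tree structure, the choice of feature-extraction functions, and the region coordinates would be evolved rather than fixed as they are in this sketch.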
