Abstract

Image classification is a very popular machine learning domain in which deep convolutional neural networks have come to dominate. These networks achieve remarkable prediction accuracy, but they are considered black box models, since they offer no insight into their inner working mechanism and cannot explain the reasoning behind their predictions. In a variety of real-world tasks, such as medical applications, interpretability and explainability play a significant role. When making decisions on critical issues such as cancer prediction, a black box model that achieves high prediction accuracy but provides no explanation of any sort for its predictions cannot be considered sufficient or ethically acceptable. Reasoning and explanation are essential in order to trust these models and support such critical predictions. Nevertheless, defining and validating the quality of a prediction model’s explanation is, in general, extremely subjective and unclear. In this work, an accurate and interpretable machine learning framework for image classification problems is proposed, able to produce high-quality explanations. To this end, a feature extraction framework and an explanation extraction framework are developed, along with three basic, general conditions that validate the quality of any model’s prediction explanation in any application domain. The feature extraction framework extracts and creates transparent and meaningful high-level features for images, while the explanation extraction framework is responsible for creating good explanations that rely on these extracted features and the prediction model’s inner function, with respect to the proposed conditions. As a case study, brain tumor magnetic resonance images were utilized for predicting glioma cancer.
Our results demonstrate the efficiency of the proposed model, since it achieves sufficient prediction accuracy while also being interpretable and explainable in simple human terms.
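The core idea of the framework described above can be sketched in a few lines: a white-box linear (here, logistic) model over human-interpretable high-level image features, whose prediction is explained by each feature's additive contribution to the decision. The feature names, weights, and threshold below are illustrative assumptions for the sketch, not the paper's actual features or trained coefficients.

```python
import numpy as np

def extract_features(img):
    """Map a grayscale image (2D array in [0, 1]) to high-level features."""
    return np.array([
        img.mean(),          # overall brightness
        img.std(),           # contrast
        (img > 0.7).mean(),  # fraction of bright (hyperintense) pixels
    ])

FEATURE_NAMES = ["brightness", "contrast", "bright_fraction"]
WEIGHTS = np.array([0.8, 1.5, 3.0])  # assumed coefficients of the linear model
BIAS = -1.2

def predict_with_explanation(img):
    """Return P(positive class) and each feature's additive contribution."""
    x = extract_features(img)
    contributions = WEIGHTS * x           # transparent, per-feature shares
    logit = contributions.sum() + BIAS
    prob = 1.0 / (1.0 + np.exp(-logit))   # logistic link
    explanation = dict(zip(FEATURE_NAMES, contributions))
    return prob, explanation

rng = np.random.default_rng(0)
img = rng.random((64, 64))                # stand-in for an MRI slice
prob, explanation = predict_with_explanation(img)
print(f"P(tumor) = {prob:.3f}")
for name, c in sorted(explanation.items(), key=lambda kv: -abs(kv[1])):
    print(f"  {name}: {c:+.3f}")
```

Because the model is linear, the explanation is exact: the contributions sum (with the bias) to the logit of the predicted probability, so no post-hoc approximation of the model is needed.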

Highlights

  • Image classification is a very popular machine learning domain in which Convolutional Neural Networks (CNNs) [1] have been successfully applied to a wide range of image classification problems. These networks are able to filter out noise and extract useful information from the initial images’ pixel representation and use it as input for the final prediction model

  • The proposed framework consists of a feature extraction framework, a white box linear prediction model, and an explanation extraction framework, each of which will be described in its own subsection

  • We present our experimental results for the proposed explainable prediction framework for image classification tasks, applying it to glioma prediction from MRI as a case study application scenario

Introduction

J. Imaging 2020, 6, 37

Image classification is a very popular machine learning domain in which Convolutional Neural Networks (CNNs) [1] have been successfully applied to a wide range of image classification problems. These networks are able to filter out noise and extract useful information from the initial images’ pixel representation and use it as input for the final prediction model. Although CNN-based models are able to achieve remarkable prediction performance, in general they need a very large number of input instances. A great limitation and drawback of such a model is that it is almost totally unable to interpret and explain its predictions, since its inner workings and prediction function are not transparent due to its highly complex mechanism [2]
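The convolutional feature-extraction idea mentioned above can be illustrated with a minimal sketch: slide a small filter over the image so that informative structure (here, a vertical edge) is emphasized while flat regions produce no response. The kernel values and toy image are illustrative assumptions, not part of the paper.

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2D cross-correlation of a grayscale image with a kernel."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = (img[i:i + kh, j:j + kw] * kernel).sum()
    return out

# A vertical-edge detector: responds where intensity changes left-to-right.
edge_kernel = np.array([[1.0, 0.0, -1.0]] * 3)

img = np.zeros((8, 8))
img[:, 4:] = 1.0                 # bright right half -> one vertical edge
feat = conv2d(img, edge_kernel)
print(feat[0])                   # nonzero only at the edge location
```

A trained CNN learns many such kernels (and stacks them in layers), which is exactly what makes the resulting features powerful but opaque: unlike the hand-named features of a white-box model, the learned filters have no ready human interpretation.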

