Abstract

Interpreting the decision-making of black-box machine learning models has become urgent due to their lack of transparency. One effective approach is to transform them into interpretable surrogate models such as decision trees and rule lists. Compared with other methods for opening black boxes, rule extraction is universal and can in theory be applied to any black-box model. In practice, however, it is ill-suited to deep learning models such as convolutional neural networks (CNNs): the extracted rules or decision trees are too large to interpret, and the rules are not expressed at the semantic level. These two drawbacks limit the usability of rule extraction for deep learning models. In this paper, we adopt a new strategy to address the problem. We first decompose a CNN into a feature extractor and a classifier, and extract the decision tree only from the classifier. We then leverage a large set of segmented, labeled images to learn the concept that each feature represents. This method can extract human-readable decision trees from CNNs. Finally, we build CNN2DT, a visual analysis system that enables users to explore the surrogate decision trees. Use cases show that CNN2DT provides both global and local interpretations of the CNN decision process. In addition, users can easily find the reasons for the misclassification of single images and compare the discriminating capacity of different models. A user study demonstrates the effectiveness of CNN2DT on AlexNet and VGG16 for image classification.
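The core idea of the decompose-then-distill step can be sketched as follows. This is a minimal illustration, not the paper's implementation: the random `features` array stands in for activations produced by a CNN's convolutional feature extractor, and a fixed linear map stands in for the CNN's classifier head. A shallow decision tree is then fit to mimic only the classifier, which is what keeps the surrogate small.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Stand-in feature activations for 200 images, 16 features each
# (in the paper these would come from the CNN's penultimate layer).
features = rng.normal(size=(200, 16))

# Stand-in CNN classifier: a fixed linear layer whose argmax gives
# the network's predicted class among 3 classes.
weights = rng.normal(size=(16, 3))
cnn_labels = (features @ weights).argmax(axis=1)

# Surrogate decision tree distills the classifier (not the whole CNN),
# so its splits are over features rather than raw pixels.
tree = DecisionTreeClassifier(max_depth=4, random_state=0)
tree.fit(features, cnn_labels)

# Fidelity: how often the surrogate agrees with the CNN classifier.
fidelity = (tree.predict(features) == cnn_labels).mean()
```

Bounding `max_depth` keeps the surrogate tree human-readable, and fidelity measures how faithfully it reproduces the classifier's decisions; attaching learned concept names to the features is the separate step the paper performs with segmented, labeled images.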
