Abstract

Feature importance and feature interaction are central issues in explainable artificial intelligence and interpretable machine learning. Several methods, such as the H-statistic and partial dependence, have been proposed to measure them; however, the practical implications of the resulting importance and interaction values are difficult to understand. In this paper, a new method for measuring feature importance and interaction is proposed. For classification models, we observe the correctly predicted cases of a predictive model and group them according to the characteristics of the cases; feature importance and interaction are then derived from this group information. For regression models, cases are grouped according to the change in the size of the prediction error. The proposed method rests on the same rationale for both feature importance and interaction, and it supports the decomposition of feature importance into feature power and feature interactions. To accompany the method, three visualization tools, including a feature interaction graph, are implemented. Through the proposed work, the working mechanism of a predictive model can be better understood.
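To make the core idea concrete, the following is a minimal illustrative sketch, not the paper's own algorithm: take the correctly predicted cases of a classifier, perturb one feature (here by a hypothetical permutation, which the paper does not specify), and see which of those cases lose their correct prediction. Grouping cases by which perturbation breaks them gives a rough importance signal of the kind the abstract alludes to. The toy model and the XOR target below are assumptions made purely for illustration.

```python
# Illustrative sketch only: grouping correctly predicted cases by which
# feature perturbation breaks their prediction. The perturbation
# (column permutation) and the toy model are assumptions, not the
# paper's method.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: the label depends on x0 XOR x1; x2 is pure noise.
X = rng.integers(0, 2, size=(400, 3))
y = X[:, 0] ^ X[:, 1]

def model(X):
    # A stand-in for a trained classifier; here it matches the target exactly.
    return X[:, 0] ^ X[:, 1]

correct = model(X) == y  # the correctly predicted cases the sketch groups

def broken_fraction(col):
    """Fraction of correct cases whose prediction flips when the given
    feature column is permuted (a hypothetical perturbation)."""
    Xp = X.copy()
    Xp[:, col] = rng.permutation(Xp[:, col])
    return (model(Xp) != y)[correct].mean()

for c in range(3):
    print(f"feature {c}: fraction of correct cases broken = {broken_fraction(c):.2f}")
```

In this toy setting the noise feature breaks no correct predictions, while the two features carrying the XOR signal each break roughly half of them; the cases broken by a given feature form the kind of group from which an importance score could be read off.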
