Belief rules, which extend classical fuzzy IF-THEN rules with belief consequent parts, have been widely used for classifier design owing to their ability to build linguistic models that are interpretable to users and to handle various types of uncertainty. However, in the rule learning process, a large number of features generally leads to a belief rule base of large size, which degrades both classification accuracy and model interpretability. Motivated by this challenge, this paper introduces the decision tree building technique, which performs feature selection and model construction jointly, to learn a compact and accurate belief rule base. To this end, a new fuzzy belief decision tree (FBDT) with fuzzy feature partitions and belief leaf nodes is designed: a fuzzy information gain ratio is first defined as the feature selection criterion for fuzzy node splitting, and belief distributions are then introduced at the leaf nodes to characterize class uncertainty. Based on the initial rules extracted from the constructed FBDT, a joint optimization objective considering both classification accuracy and model interpretability is designed to further reduce rule redundancy. Experimental results on real datasets show that the proposed FBDT-based classification method yields a much smaller rule base and better interpretability than other rule-based methods while achieving competitive accuracy.
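Although the abstract does not give the exact definitions, the two core ingredients it names, a fuzzy information gain ratio for node splitting and belief distributions at leaf nodes, can be illustrated with a small sketch. The Python code below assumes a fuzzy-ID3-style construction in which crisp example counts are replaced by sums of membership degrees and a product t-norm combines node and partition memberships; the function names `fuzzy_gain_ratio` and `belief_leaf` and all implementation details are illustrative assumptions, not the paper's actual method.

```python
# Minimal sketch of a fuzzy information gain ratio and a belief leaf node,
# under the assumption that crisp counts are replaced by membership sums
# (fuzzy-ID3 style) and that a product t-norm combines memberships.
import numpy as np


def fuzzy_entropy(memberships: np.ndarray, labels: np.ndarray) -> float:
    """Entropy of a fuzzy set of examples, with class proportions
    computed from membership mass instead of crisp counts."""
    total = memberships.sum()
    if total == 0.0:
        return 0.0
    ent = 0.0
    for c in np.unique(labels):
        p = memberships[labels == c].sum() / total
        if p > 0.0:
            ent -= p * np.log2(p)
    return ent


def fuzzy_gain_ratio(node_memb: np.ndarray,
                     partition_memb: np.ndarray,
                     labels: np.ndarray) -> float:
    """Fuzzy information gain ratio of one candidate fuzzy partition.

    node_memb:      membership of each example in the current node, shape (n,)
    partition_memb: membership of each example in each fuzzy term, shape (k, n)
    """
    # Membership in each child = node membership combined with the term
    # membership via a product t-norm (an assumption of this sketch).
    child_memb = partition_memb * node_memb
    weights = child_memb.sum(axis=1) / max(node_memb.sum(), 1e-12)

    gain = fuzzy_entropy(node_memb, labels) - sum(
        w * fuzzy_entropy(m, labels) for w, m in zip(weights, child_memb))
    split_info = -sum(w * np.log2(w) for w in weights if w > 0.0)
    return gain / split_info if split_info > 0.0 else 0.0


def belief_leaf(node_memb: np.ndarray, labels: np.ndarray, classes) -> dict:
    """Belief distribution over classes stored at a leaf node:
    each class receives the normalized membership mass of its examples."""
    total = max(node_memb.sum(), 1e-12)
    return {c: node_memb[labels == c].sum() / total for c in classes}
```

In this reading, tree growth would evaluate `fuzzy_gain_ratio` for every candidate fuzzy partition and split on the best one, and each resulting leaf would store the `belief_leaf` distribution as the belief consequent of the extracted rule; the subsequent joint optimization step described in the abstract is not shown here.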