Abstract

Prediction systems are an important component of intelligent decision making. In engineering practice, complex system structures and external environments introduce many uncertain factors into a model, which degrade its modeling accuracy. The belief rule base (BRB) can model nonlinear systems and express various kinds of uncertain information, including fuzziness, ignorance, and randomness. However, the BRB system has two main problems. First, modeling methods based solely on expert knowledge make it difficult to guarantee the model's accuracy. Second, current research does not consider interpretability during optimization, which destroys the interpretability of the BRB. To balance the accuracy and interpretability of the model, a self-growth belief rule base with interpretability constraints (SBRB-I) is proposed. The reasoning process of the SBRB-I model is based on the evidence reasoning (ER) approach. Moreover, a self-growth learning strategy ensures effective cooperation between the data-driven model and the expert system. A case study shows that both the accuracy and the interpretability of the model can be guaranteed. The SBRB-I model has good application prospects in prediction systems.

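The abstract does not include the model's equations, so as a rough illustration of how ER-based BRB inference typically proceeds, the sketch below activates a small set of belief rules and combines them with the standard analytical ER algorithm. It is a minimal sketch of the general technique, not the paper's implementation: the function names, the two-rule example, and the grade utilities are hypothetical.

```python
import numpy as np

def activation_weights(matching_degrees, rule_weights, attribute_weights):
    """Normalized activation weight of each belief rule.

    matching_degrees  : (K, M) matching degrees of the input to the M antecedent
                        attributes of each of the K rules
    rule_weights      : (K,)   relative rule weights theta_k in [0, 1]
    attribute_weights : (M,)   relative attribute weights delta_i in [0, 1]
    """
    delta = attribute_weights / attribute_weights.max()
    raw = rule_weights * np.prod(matching_degrees ** delta, axis=1)
    return raw / raw.sum()

def er_combine(w, beliefs):
    """Analytical ER combination of the activated belief rules.

    w       : (K,)   activation weights, summing to 1
    beliefs : (K, N) belief degrees beta_{j,k} over the N consequent grades
    Returns the combined belief distribution over the N grades.
    """
    K, N = beliefs.shape
    total = beliefs.sum(axis=1)                              # sum_j beta_{j,k}
    term = w[:, None] * beliefs + (1.0 - w * total)[:, None]
    prod_term = term.prod(axis=0)                            # per consequent grade
    prod_residual = np.prod(1.0 - w * total)
    prod_unassigned = np.prod(1.0 - w)
    mu = 1.0 / (prod_term.sum() - (N - 1) * prod_residual)
    return mu * (prod_term - prod_residual) / (1.0 - mu * prod_unassigned)

# Hypothetical two-rule, three-grade example; all numbers are illustrative only.
alpha = np.array([[0.8, 0.2],      # rule 1: matching degrees for 2 attributes
                  [0.3, 0.7]])     # rule 2
theta = np.array([1.0, 0.9])       # rule weights
delta = np.array([1.0, 1.0])       # attribute weights
beta = np.array([[0.7, 0.2, 0.1],  # rule 1: beliefs over the consequent grades
                 [0.1, 0.3, 0.6]]) # rule 2

w = activation_weights(alpha, theta, delta)
combined = er_combine(w, beta)
utilities = np.array([0.0, 0.5, 1.0])      # assumed utilities of the grades
print("combined beliefs:", combined)
print("predicted output:", combined @ utilities)
```

When every rule assigns a complete belief distribution, the combined beliefs sum to one, and the predicted output is the expected utility over the consequent grades; any residual belief left unassigned by incomplete rules remains as ignorance rather than being forced onto a grade.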