Interpretable system behavior prediction is crucial for improving the credibility of prediction results and ensuring the reliability of complex systems. Although the belief rule base (BRB) is an interpretable modeling approach, its application to interpretable behavior prediction still faces two challenges: (1) the stochastic nature of the optimization process tends to undermine the interpretability of the model, which in turn reduces the reliability of the prediction results; (2) because expert knowledge is of limited reliability, different rules have different reliabilities, so the parameters of different rules should be treated differently during optimization. Therefore, this paper proposes a new behavior prediction method based on an interpretable BRB with rule reliability measurement (BRB-IR). First, interpretability criteria for the behavior prediction method are proposed to regulate the interpretability of the whole modeling process. Second, the reliability of expert knowledge is quantified, and on this basis, rule reliability before and after optimization is quantified to guide an interpretable optimization process and to analyze the effectiveness of the optimization. Third, based on the quantified rule reliability, three interpretability constraint strategies are proposed to enhance the interpretability and reliability of the model. The validity of BRB-IR is verified through experiments on a diesel engine and a lithium-ion battery.
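As a minimal, illustrative sketch (not the paper's exact BRB-IR formulation), the snippet below shows one way per-rule reliability might enter a basic BRB inference step: each rule's activation weight is discounted by an assumed reliability score before the activated belief distributions are fused with the analytical evidential reasoning (ER) algorithm. The function names, the reliability-discount scheme, and the toy numbers are all assumptions made for illustration.

```python
import numpy as np

# Minimal sketch of BRB inference with per-rule reliability (illustrative only;
# the reliability-discount scheme and all names here are assumptions, not the
# exact BRB-IR method).

def activation_weights(matching, rule_weights, attr_weights, rule_reliability):
    """matching: (K, T) matching degrees of T antecedent attributes for K rules."""
    agg = np.prod(matching ** attr_weights, axis=1)      # weighted matching per rule
    raw = rule_weights * rule_reliability * agg          # assumed reliability discount
    return raw / raw.sum()                               # normalized activation weights

def er_combine(weights, beliefs):
    """Analytical evidential reasoning (ER) fusion of K activated rules.

    weights: (K,) activation weights summing to 1.
    beliefs: (K, N) belief degrees of each rule over N consequent grades.
    Returns the combined belief distribution over the N grades.
    """
    K, N = beliefs.shape
    assigned = beliefs.sum(axis=1, keepdims=True)        # total belief assigned by each rule
    unassigned = 1.0 - weights[:, None] * assigned       # mass not assigned to any grade
    term = weights[:, None] * beliefs + unassigned       # (K, N)
    prod_term = np.prod(term, axis=0)
    prod_unassigned = np.prod(unassigned)
    prod_inactive = np.prod(1.0 - weights)
    mu = 1.0 / (prod_term.sum() - (N - 1) * prod_unassigned)
    combined = mu * (prod_term - prod_unassigned)
    return combined / (1.0 - mu * prod_inactive)

# Toy example: two rules, three consequent grades.
matching = np.array([[0.8, 0.6], [0.3, 0.9]])            # antecedent matching degrees
rule_weights = np.array([1.0, 0.7])
attr_weights = np.array([1.0, 1.0])
rule_reliability = np.array([0.95, 0.6])                 # assumed rule reliabilities
beliefs = np.array([[0.7, 0.2, 0.1], [0.1, 0.3, 0.6]])
w = activation_weights(matching, rule_weights, attr_weights, rule_reliability)
print(er_combine(w, beliefs))                            # combined belief distribution
```

In this sketch a less reliable rule contributes less to the fused belief distribution; the paper's own constraint strategies operate on the quantified rule reliability during optimization rather than on inference alone.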