High-accuracy parameter learning in Bayesian networks (BNs) is a key challenge in real-time decision support applications, particularly when the available data are limited. Prior or expert knowledge has been introduced to mitigate the drawbacks of insufficient data; however, such knowledge is subjective. In this study, we explored the use of monotonicity constraints to control the causal relationships between nodes and their parent nodes in BNs and proposed a new learning algorithm, global domain monotonicity based on maximum information entropy (GDM-MIE), designed for parameter learning in uncertain discrete BNs with nonlinear equality constraints when only finite data are available. In the proposed algorithm, a class of monotonicity is encoded as a constraint on information entropy, and the parameter learning problem is transformed into constraints among the node parameters of a known network structure. Furthermore, we treated the parameters as uncertain entropy information, discussed the monotonicity among parameters over the global spatial domain, and proved the accuracy of the model's logical relationships, the system's reliability, and the time complexity. Finally, the proposed method was validated on standard BNs, and its performance was analyzed by comparison with existing learning algorithms. The results showed that the proposed method is more accurate and yields a lower Kullback–Leibler divergence. To further validate the rationality of the proposed method, the Alarm and Asia networks were employed as case studies. GDM-MIE achieved its intended goal: the estimated parameters closely approximate the original true parameters even with a small sample size, indicating that the proposed algorithm can serve as an efficient and feasible method for learning Bayesian network parameters.
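The general idea, estimating conditional probability table (CPT) entries under a maximum-entropy objective and monotonicity constraints, can be illustrated with a minimal sketch. The code below is an assumption-laden illustration, not the paper's GDM-MIE algorithm: it assumes a binary child node with a single binary parent, hypothetical small-sample counts, an entropy-regularized likelihood objective, and one monotonicity constraint P(X=1 | parent=1) >= P(X=1 | parent=0), solved with SciPy's SLSQP optimizer.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical small-sample counts (not from the paper):
# counts[parent_state] = [count(X=0), count(X=1)]
counts = np.array([[4.0, 1.0],   # parent = 0
                   [2.0, 3.0]])  # parent = 1

def neg_objective(theta):
    """theta = [P(X=1|pa=0), P(X=1|pa=1)].
    Negative of (log-likelihood + entropy term); the entropy term stands in
    for the information-entropy component mentioned in the abstract."""
    theta = np.clip(theta, 1e-9, 1 - 1e-9)
    p = np.column_stack([1 - theta, theta])   # full CPT rows
    loglik = np.sum(counts * np.log(p))       # multinomial log-likelihood
    entropy = -np.sum(p * np.log(p))          # information entropy of the CPT
    return -(loglik + entropy)

# Monotonicity constraint: P(X=1 | parent=1) >= P(X=1 | parent=0)
constraints = [{"type": "ineq", "fun": lambda t: t[1] - t[0]}]
bounds = [(1e-6, 1 - 1e-6)] * 2

result = minimize(neg_objective, x0=[0.5, 0.5], bounds=bounds,
                  constraints=constraints, method="SLSQP")
print("Estimated P(X=1 | parent=0), P(X=1 | parent=1):", result.x)
```

With small counts, the entropy term pulls the estimates toward uniform probabilities while the inequality constraint keeps them consistent with the assumed monotonic causal relationship; this mirrors, in simplified form, how constraint-based parameter learning compensates for limited data.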