Abstract

In the construction of a Bayesian network, it is always assumed that child variables of the same parent are conditionally independent given that parent. In practice, this assumption may not hold and can give rise to incorrect inferences. Where such a dependency is found between variables, we propose that creating a hidden node, which in effect models the dependency, can solve the problem. To determine the conditional probability matrices for the hidden node, we use a gradient descent method. The objective function to be minimised is the squared error between the measured and computed values of the instantiated nodes. Both forward and backward propagation are used to compute the node probabilities. The error gradients can be treated as updating messages and can be propagated in any direction throughout any singly connected network. For parents with more than two children, we use the simplest node-by-node creation approach. We tested our approach on two different networks in an endoscope guidance system and, in both cases, demonstrated improved results.
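The idea can be sketched on a toy network. The sketch below is an illustration under assumptions, not the paper's implementation: a hidden node H is inserted between a root A and its two children B and C (which were found to be dependent), and the unknown conditional probability matrix P(H|A) is fitted by gradient descent on the squared error between measured and computed marginals of the observed nodes. All node names, CPT values, and the use of numerical gradients (in place of the paper's message-passing error gradients) are assumptions made for illustration.

```python
import numpy as np

# Toy singly connected network: A -> H -> {B, C}.
# H is a hypothetical hidden node inserted because B and C were found
# to be dependent; its CPT P(H|A) is the quantity to be learned.
# All numbers below are illustrative, not taken from the paper.

prior_a = np.array([0.6, 0.4])            # P(A)
cpt_b   = np.array([[0.9, 0.2],           # P(B|H), columns indexed by H
                    [0.1, 0.8]])
cpt_c   = np.array([[0.7, 0.3],           # P(C|H)
                    [0.3, 0.7]])
measured = {"B": np.array([0.76, 0.24]),  # measured marginals of the
            "C": np.array([0.62, 0.38])}  # instantiated (observed) nodes

def softmax(logits):
    """Column-wise softmax: keeps each column a valid distribution."""
    e = np.exp(logits - logits.max(axis=0))
    return e / e.sum(axis=0)

def forward(cpt_h):
    """Forward propagation of marginals through H to B and C."""
    p_h = cpt_h @ prior_a
    return {"B": cpt_b @ p_h, "C": cpt_c @ p_h}

def squared_error(logits):
    """Objective: squared error at the instantiated nodes."""
    computed = forward(softmax(logits))
    return sum(np.sum((measured[n] - computed[n]) ** 2) for n in measured)

# Gradient descent on unconstrained logits for P(H|A); central-difference
# numerical gradients stand in for the propagated error gradients
# described in the abstract.
logits = np.zeros((2, 2))
lr, eps = 1.0, 1e-5
for _ in range(500):
    grad = np.zeros_like(logits)
    for idx in np.ndindex(*logits.shape):
        bump = np.zeros_like(logits)
        bump[idx] = eps
        grad[idx] = (squared_error(logits + bump)
                     - squared_error(logits - bump)) / (2 * eps)
    logits -= lr * grad

print("learned P(H|A):\n", softmax(logits))
print("final squared error:", squared_error(logits))
```

The softmax parameterisation is one simple way to keep each column of the learned matrix a valid probability distribution during unconstrained descent; the paper's own scheme for maintaining valid CPT entries may differ.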
