Abstract
Bayesian networks in their Factor Graph Reduced Normal Form are a powerful paradigm for implementing inference graphs. Unfortunately, the computational and memory costs of these networks may be considerable even for relatively small networks, and this is one of the main reasons why these structures have often been underused in practice. In this work, through a detailed algorithmic and structural analysis, various solutions for cost reduction are proposed. Moreover, an online version of the classic batch learning algorithm is analysed, showing very similar results in an unsupervised context but with much better performance, which may be essential if multi-level structures are to be built. The proposed solutions, together with the online learning algorithm, are included in a C++ library that is quite efficient, especially compared to the direct use of the well-known sum-product and Maximum Likelihood algorithms. The results obtained are discussed with particular reference to a Latent Variable Model structure.
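To make the online-learning idea concrete, the following is a minimal sketch (not the library's actual API; all names here are hypothetical): instead of re-estimating a conditional probability table from a whole batch, soft counts are accumulated one observation at a time with an exponential forgetting factor, and rows are renormalized on demand.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical sketch of an online maximum-likelihood update for a
// conditional probability table P(y|x). Old evidence is discounted by a
// forgetting factor gamma, so the table tracks the data incrementally
// rather than requiring a full batch pass.
struct OnlineCPT {
    std::vector<std::vector<double>> counts;  // soft counts[x][y]
    double gamma;                             // forgetting factor in (0,1]

    OnlineCPT(std::size_t nx, std::size_t ny, double g)
        : counts(nx, std::vector<double>(ny, 1.0)), gamma(g) {}  // uniform prior

    // Accumulate one (soft) observation with weight w on the pair (x, y).
    void update(std::size_t x, std::size_t y, double w = 1.0) {
        for (auto& row : counts)
            for (double& c : row) c *= gamma;  // discount old evidence
        counts[x][y] += w;
    }

    // Current estimate of P(y|x), obtained by row normalization.
    double prob(std::size_t x, std::size_t y) const {
        double sum = 0.0;
        for (double c : counts[x]) sum += c;
        return counts[x][y] / sum;
    }
};
```

With `gamma = 1` this reduces to plain count accumulation; values below 1 let the estimate follow non-stationary data, which is the trade-off an online learner makes against the batch algorithm.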
Introduction
The Factor Graph (FG) representation, and in particular the so-called Normal Form (FGn) (Forney 2001; Loeliger 2004), is a very appealing formulation for visualizing and manipulating Bayesian graphs: the joint probability is represented by assigning variables to arcs and functions to nodes. In the Factor Graph in Reduced Normal Form (FGrn), through the use of replicator units (or equal constraints), the graph is reduced to an architecture in which each variable is connected to at most two factors (Palmieri 2016), with belief messages that flow bidirectionally through the network.
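The role of a replicator unit can be sketched in a few lines of C++ (a hypothetical illustration, not the library's actual interface): in the sum-product framework, the outgoing belief message on one edge of an equal-constraint node is the normalized element-wise product of the incoming messages on all the other edges.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// A discrete belief message: one non-negative value per state of the variable.
using Message = std::vector<double>;

// Hypothetical replicator (equal-constraint) node: the message sent out on
// edge `out_edge` is the element-wise product of the incoming messages on
// every other edge, normalized to sum to one.
Message replicator_out(const std::vector<Message>& incoming, std::size_t out_edge) {
    std::size_t dim = incoming.front().size();
    Message out(dim, 1.0);
    for (std::size_t e = 0; e < incoming.size(); ++e) {
        if (e == out_edge) continue;      // exclude the target edge itself
        for (std::size_t i = 0; i < dim; ++i)
            out[i] *= incoming[e][i];     // element-wise product of beliefs
    }
    double sum = 0.0;
    for (double v : out) sum += v;
    for (double& v : out) v /= sum;       // normalize to a distribution
    return out;
}
```

Because every variable in an FGrn touches at most two factors, these replicators are where evidence from different branches of the graph is fused.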
Summary
In the Factor Graph in Reduced Normal Form (FGrn), through the use of replicator units (or equal constraints), the graph is reduced to an architecture in which each variable is connected to at most two factors (Palmieri 2016), with belief messages that flow bidirectionally through the network. The various problems related to the propagation and learning of probabilities within the FGrn paradigm are addressed by focusing on the implementation of the Latent Variable Model (LVM), in which the marginal distribution pV(v) is proportional to the posterior given the observations anywhere else in the network. LVMs can be used in a large number of applications and can be seen as a basic building block for more complex architectures.
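Propagation through a single factor can be illustrated as follows (a minimal sketch under the assumption of a discrete factor stored as a row-major conditional probability matrix P[x][y]; the names are hypothetical, not the library's API): the standard sum-product rule sends a forward message toward Y and a backward message toward X by multiplying the incoming message with the matrix or its transpose.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

using Message = std::vector<double>;
using Matrix  = std::vector<Message>;  // P[x][y] = P(y|x), row-major

// Forward sum-product message toward Y: m_Y(y) = sum_x P(y|x) * m_X(x)
Message forward(const Matrix& P, const Message& mX) {
    Message mY(P.front().size(), 0.0);
    for (std::size_t x = 0; x < P.size(); ++x)
        for (std::size_t y = 0; y < mY.size(); ++y)
            mY[y] += P[x][y] * mX[x];
    return mY;
}

// Backward sum-product message toward X: m_X(x) = sum_y P(y|x) * m_Y(y)
Message backward(const Matrix& P, const Message& mY) {
    Message mX(P.size(), 0.0);
    for (std::size_t x = 0; x < P.size(); ++x)
        for (std::size_t y = 0; y < mY.size(); ++y)
            mX[x] += P[x][y] * mY[y];
    return mX;
}
```

Since each FGrn variable is connected to at most two factors, inference in an LVM reduces to chains of these forward/backward products, fused at replicator nodes.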