Abstract

Bayesian networks are graphical tools used to represent a high-dimensional probability distribution. They are used frequently in machine learning and many applications such as medical science. This paper studies whether the concept classes induced by a Bayesian network can be embedded into a low-dimensional inner product space. We focus on two-label classification tasks over the Boolean domain. For full Bayesian networks and almost full Bayesian networks with n variables, we show that VC dimension and the minimum dimension of the inner product space induced by them are 2n-1. Also, for each Bayesian network N we show that VCdim(N)=Edim(N)=2n-1+2i if the network N′ constructed from N by removing Xn satisfies either (i) N′ is a full Bayesian network with n-1 variables, i is the number of parents of Xn, and i<n-1 or (ii) N′ is an almost full Bayesian network, the set of all parents of XnPAn={X1,X2,Xn3,…,Xni} and 2⩽i<n-1. Our results in the paper are useful in evaluating the VC dimension and the minimum dimension of the inner product space of concept classes induced by other Bayesian networks.

Full Text
Published version (Free)

Talk to us

Join us for a 30 min session where you can share your feedback and ask us any queries you have

Schedule a call