Abstract

Markov networks and Bayesian networks are two popular models for classification. The Vapnik–Chervonenkis dimension and the Euclidean dimension are two measures of the complexity of a class of functions, and both can be used to quantify the classification capability of a classifier. The Vapnik–Chervonenkis dimension of the class of functions associated with a classifier can be used to construct an estimate of its generalization error. In this paper, we study the Vapnik–Chervonenkis dimension and the Euclidean dimension of concept classes induced by discrete Markov networks and Bayesian networks. We show that these two dimensions of the concept class induced by a discrete Markov network are identical, and that their common value equals the dimension of the toric ideal corresponding to the Markov network, provided that the toric ideal is nontrivial. As a consequence, this dimension can be computed directly with a computer algebra system. Furthermore, for a general Bayesian network, we show that the dimension of the corresponding toric ideal provides an upper bound on the Euclidean dimension. In addition, we illustrate how the Vapnik–Chervonenkis dimension can be used to estimate the generalization error in binary classification.
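
As an illustration of the computational claim above, the following is a minimal sketch (not taken from the paper) of how the dimension of the toric ideal of a small discrete Markov network can be computed. It assumes the standard fact that, for a toric ideal given by a monomial parametrization with exponent (design) matrix A, the Krull dimension of the quotient ring equals rank(A); the example model, a binary chain X1 - X2 - X3 with cliques {X1, X2} and {X2, X3}, is chosen purely for illustration and does not appear in the abstract.

```python
# Hedged sketch: dimension of the toric ideal of a binary chain Markov network
# X1 - X2 - X3, computed as the rank of its design matrix A.
# Rows of A are indexed by (clique, clique configuration) parameters,
# columns by joint states; A[r, x] = 1 iff x agrees with that configuration.
from itertools import product

import numpy as np

states = list(product([0, 1], repeat=3))   # joint states (x1, x2, x3)
cliques = [(0, 1), (1, 2)]                 # variable index sets of the two cliques

rows = []
for clique in cliques:
    for config in product([0, 1], repeat=len(clique)):
        rows.append([1 if tuple(x[i] for i in clique) == config else 0
                     for x in states])
A = np.array(rows)

print(A.shape)                   # (8, 8): 8 clique parameters, 8 joint states
print(np.linalg.matrix_rank(A))  # 6: dimension of the associated toric ideal/variety
```

In this sketch the rank is 6, which matches the 5 free parameters of the chain model on the probability simplex plus one for the overall scaling of the unnormalized parametrization; how this number relates to the Vapnik–Chervonenkis and Euclidean dimensions of the induced concept class is the subject of the paper itself.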
