Abstract

We consider a model of learning in which the successive observations follow a certain Markov chain. The observations are labeled according to membership in some unknown target set. For a Markov chain with finitely many states, we show that if the target set belongs to a family of sets with finite Vapnik-Chervonenkis (1995) dimension, then probably approximately correct (PAC) learning of this set is possible with polynomially large samples. Specifically, for observations following a random walk with state space $\mathcal{X}$ and uniform stationary distribution, the sample size required is no more than $\Omega\left(\frac{t_0}{1-\lambda_2}\log\left(\frac{t_0|\mathcal{X}|}{\delta}\right)\right)$, where $\delta$ is the confidence level, $\lambda_2$ is the second largest eigenvalue of the transition matrix, and $t_0$ is the sample size sufficient for learning from independent and identically distributed (i.i.d.) observations. We then obtain similar results for Markov chains with countably many states using a Lyapunov function technique and results on mixing properties of infinite-state Markov chains.
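As a rough illustration (not taken from the paper itself), the Python sketch below evaluates the abstract's sample-size bound for a lazy random walk on a cycle, a chain with uniform stationary distribution. The constant hidden in $\Omega(\cdot)$ is not specified in the abstract and is taken as 1 here; the function names and the parameter choices (`t0 = 1000`, `delta = 0.05`) are illustrative assumptions.

```python
import numpy as np

def second_largest_eigenvalue(P: np.ndarray) -> float:
    """Second largest eigenvalue of a transition matrix P.

    Assumes a reversible chain with symmetric P (e.g., the lazy random
    walk below), so the spectrum is real and eigvalsh applies.
    """
    eigs = np.sort(np.linalg.eigvalsh(P))  # ascending order
    return float(eigs[-2])

def markov_sample_bound(t0: float, lambda2: float,
                        n_states: int, delta: float) -> float:
    """The abstract's bound t0/(1-lambda2) * log(t0*|X|/delta),
    with the unspecified Omega(.) constant taken as 1 (assumption)."""
    return t0 / (1.0 - lambda2) * np.log(t0 * n_states / delta)

if __name__ == "__main__":
    # Lazy random walk on a cycle of n states: stay with prob 1/2,
    # step to each neighbor with prob 1/4. Aperiodic, symmetric,
    # uniform stationary distribution.
    n = 8
    P = np.zeros((n, n))
    for i in range(n):
        P[i, i] = 0.5
        P[i, (i - 1) % n] = 0.25
        P[i, (i + 1) % n] = 0.25

    lam2 = second_largest_eigenvalue(P)   # ~0.854 for n = 8
    t0 = 1000        # hypothetical i.i.d. sample size
    delta = 0.05     # confidence parameter
    print(f"lambda_2 = {lam2:.4f}")
    print(f"Markov sample bound ~ {markov_sample_bound(t0, lam2, n, delta):,.0f}")
```

As the spectral gap $1-\lambda_2$ shrinks (slower mixing), the bound inflates the i.i.d. sample size $t_0$ accordingly, which matches the abstract's dependence on the second eigenvalue.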
