Abstract

This is a survey of results on universal algorithms for classification and prediction of stationary processes. The classification problems include discovering the order of a k-step Markov chain, determining memory words in finitarily Markovian processes and estimating the entropy of an unknown process. The prediction problems cover both discrete and real valued processes in a variety of situations. Both the forward and the backward prediction problems are discussed with the emphasis being on pointwise results. This survey is just a teaser. The purpose is merely to call attention to results on classification and prediction. We will refer the interested reader to the sources. Throughout the paper we will give illuminating examples.
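To make the entropy estimation problem concrete, the following is a minimal sketch, assuming a finite-valued sample, of the standard plug-in estimator that divides the empirical entropy of overlapping k-blocks by the block length; it is offered only as an illustration, not as one of the schemes from the survey, and the function name estimate_entropy_rate and the choice k = 10 are hypothetical.

```python
from collections import Counter
from math import log2

def estimate_entropy_rate(xs, k):
    """Plug-in estimate of the entropy rate: empirical entropy of the
    overlapping k-blocks of xs, divided by the block length k."""
    blocks = [tuple(xs[i:i + k]) for i in range(len(xs) - k + 1)]
    counts = Counter(blocks)
    n = len(blocks)
    h_block = -sum((c / n) * log2(c / n) for c in counts.values())
    return h_block / k

# Example: an i.i.d. fair-coin process has entropy rate 1 bit per symbol.
import random
random.seed(0)
sample = [random.randint(0, 1) for _ in range(100_000)]
print(estimate_entropy_rate(sample, k=10))  # close to 1.0
```

For a stationary process the block entropies H(X1,...,Xk)/k decrease to the entropy rate, so in practice the block length has to grow slowly with the sample size to keep the empirical block frequencies reliable.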

Highlights

  • Forty-five years ago David Bailey wrote a PhD thesis under the direction of Donald Ornstein [4] entitled “Sequential schemes for classifying and predicting ergodic processes”

  • He showed that for each k there was no sequence of functions gn which when applied to X0, X1, ..., Xn would with probability one eventually equal YES/NO according to the alternative “the process IS/IS NOT a k-step mixing Markov chain”

  • In contrast to Bailey’s negative result for two-valued decision schemes, we show that there is a sequence of functions gn which when applied to the outputs X0, X1, ..., Xn of any ergodic process will converge with probability one to the order k if the process is k-step Markov and to infinity otherwise (see the sketch below)
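The highlights describe the order estimator only abstractly. As a point of comparison, here is a hedged sketch of one standard consistent approach, a BIC-style penalized likelihood in the spirit of Csiszár and Shields, which is not the gn of the survey: with a fixed cap on candidate orders it recovers the true order of a Markov chain but does not reproduce the divergence to infinity for non-Markovian processes. The function name estimate_markov_order and the cap max_order are illustrative assumptions.

```python
from collections import Counter
from math import log

def estimate_markov_order(xs, max_order=5):
    """BIC-style Markov order estimator (a standard penalized-likelihood
    scheme, not the survey's sequential g_n).

    For each candidate order m, the maximized log-likelihood is the sum
    over observed contexts c and symbols a of N(c, a) * log(N(c, a) / N(c)).
    The penalty charges 0.5 * (number of free parameters) * log(n)."""
    alphabet = sorted(set(xs))
    n = len(xs)
    best_order, best_score = 0, float("-inf")
    for m in range(max_order + 1):
        ctx_counts = Counter(tuple(xs[i:i + m]) for i in range(n - m))
        pair_counts = Counter((tuple(xs[i:i + m]), xs[i + m]) for i in range(n - m))
        loglik = sum(c * log(c / ctx_counts[ctx]) for (ctx, _), c in pair_counts.items())
        n_params = (len(alphabet) ** m) * (len(alphabet) - 1)
        score = loglik - 0.5 * n_params * log(n)
        if score > best_score:
            best_order, best_score = m, score
    return best_order

# Example: a binary first-order Markov chain should yield order 1.
import random
random.seed(1)
x, xs = 0, []
for _ in range(50_000):
    # stay in the current state with probability 0.9, flip with probability 0.1
    x = x if random.random() < 0.9 else 1 - x
    xs.append(x)
print(estimate_markov_order(xs))  # expected: 1
```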


Summary

Introduction

Forty-five years ago David Bailey wrote a PhD thesis under the direction of Donald Ornstein [4] entitled “Sequential schemes for classifying and predicting ergodic processes”. Several authors have extended his prediction schemes to bounded real-valued processes by using quantization to reduce to the finite-valued case; see, for example, Algoet [1, 3], Morvai [53], and Morvai, Yakowitz and Gyorfi [56]. Another approach to sequential prediction used a weighted average of expert schemes, and with these schemes the results were extended to the general unbounded case by Nobel [80] and Gyorfi and Ottucsak [28] (see the survey of Feder and Merhav [50]). A more general result, giving an estimate for the conditional mean along a sequence of stopping times, will be described for stationary Gaussian (not necessarily ergodic) processes; it covers a much wider class of processes than that treated in Schafer [100]. Throughout the survey we will give specific examples to illustrate the ideas.
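As an illustration of the weighted-average-of-experts approach mentioned above, the sketch below combines arbitrary expert predictions by exponential weighting of their cumulative squared losses. It is a generic textbook forecaster, not the specific schemes of Nobel [80] or Gyorfi and Ottucsak [28]; the function name weighted_average_forecaster, the learning rate eta and the toy constant experts are assumptions made here for illustration.

```python
import math

def weighted_average_forecaster(expert_preds, outcomes, eta=0.5):
    """Exponentially weighted average of experts under squared loss.

    expert_preds: list of sequences; expert_preds[j][t] is expert j's
    prediction of outcomes[t], made before that outcome is revealed.
    Returns the forecaster's predictions; each is a weighted average of
    the experts, with weights exp(-eta * cumulative squared loss)."""
    n_experts = len(expert_preds)
    cum_loss = [0.0] * n_experts
    preds = []
    for t, y in enumerate(outcomes):
        weights = [math.exp(-eta * loss) for loss in cum_loss]
        total = sum(weights)
        p = sum(w * expert_preds[j][t] for j, w in enumerate(weights)) / total
        preds.append(p)
        for j in range(n_experts):
            cum_loss[j] += (expert_preds[j][t] - y) ** 2
    return preds

# Toy usage: two constant experts (0.2 and 0.8) tracking a 0/1 sequence.
outcomes = [1, 1, 0, 1, 1, 1, 0, 1]
experts = [[0.2] * len(outcomes), [0.8] * len(outcomes)]
print(weighted_average_forecaster(experts, outcomes))
```

In schemes of this kind the experts are typically themselves data-driven predictors built at different context lengths or quantization resolutions, and the forecaster's prediction tracks the best of them as the data accumulate.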

Discovering features of a process by sequential sampling
Estimating the order of a Markov chain
Classification for special processes
On classifying general processes
Finite observability and entropy
Estimation for finitarily Markovian processes
Estimation of the memory length for finitarily Markovian processes
On estimating the residual waiting time
Pointwise sequential estimation of the conditional expectation in Cesàro mean
Pointwise consistent intermittent estimation schemes