Abstract

The problem of sequentially scanning and predicting data arranged in a multidimensional array is considered. We introduce the notion of a scandictor, which is any scheme for the sequential scanning and prediction of such multidimensional data. The scandictability of any finite (probabilistic) data array is defined as the best achievable expected performance on that array. The scandictability of any (spatially) stationary random field on ℤ^m is defined as the limit of its scandictability on finite boxes (subsets of ℤ^m) as their edges become large. The limit is shown to exist for any stationary field and to be essentially independent of the ratios between the box dimensions. Fundamental limitations on scandiction performance, in both the probabilistic and the deterministic settings, are characterized for the family of difference loss functions. We find that any stochastic process or random field that can be generated autoregressively with a maximum-entropy innovation process is optimally scandicted the way it was generated. These results are specialized to cases of particular interest. The scandictability of any stationary Gaussian field under the squared-error loss function is given a single-letter expression in terms of its spectral measure and is shown to be attained by the raster scan. For a family of binary Markov random fields (MRFs), the scandictability under the Hamming distortion measure is fully characterized.

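To illustrate the kind of single-letter expression meant for the Gaussian case, consider the one-dimensional specialization (m = 1), where the raster scan reduces to the natural left-to-right order and the scandictability under squared-error loss reduces to the minimal one-step mean-squared prediction error. The following is a minimal sketch, assuming a stationary Gaussian process with an absolutely continuous spectral measure of density f, normalized so that Var(X_0) = (1/2π) ∫ f(λ) dλ over [−π, π], and writing Ū(X) for the scandictability (notation assumed here); under these assumptions the expression is the classical Kolmogorov–Szegő formula:

\[
  \bar{U}(X) \;=\; \exp\!\left\{ \frac{1}{2\pi} \int_{-\pi}^{\pi} \log f(\lambda)\, \mathrm{d}\lambda \right\}.
\]

The m-dimensional expression referred to in the abstract is a single-letter formula of this type, with the frequency integral taken over [−π, π]^m; this general form is an assumption on our part, and the full text should be consulted for the precise statement and its attainment by the raster scan.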