Abstract

Algorithmic statistics is a part of algorithmic information theory (Kolmogorov complexity theory) that studies the following task: given a finite object x (say, a binary string), find an ‘explanation’ for it, i.e., a simple finite set that contains x and in which x is a ‘typical element’. Both notions (‘simple’ and ‘typical’) are defined in terms of Kolmogorov complexity. It is known that this is not always possible: there exist “non-stochastic” objects that have no good explanation. In this paper we study the properties of maximally non-stochastic objects, which we call “antistochastic”. We demonstrate that antistochastic strings have the following property (Theorem 6): if an antistochastic string x has complexity k, then any k bits of information about x are enough to reconstruct x (with logarithmic advice). In particular, if we erase all but k bits of such an antistochastic string, the erased bits can be restored from the remaining ones (with logarithmic advice). As a corollary we obtain the existence of good list-decodable codes for erasures (and for other ways of deleting part of the information). Antistochastic strings can also serve as a source of counterexamples in algorithmic information theory: we show that the symmetry of information property fails for total conditional complexity on antistochastic strings. An extended abstract of this paper was presented at the 10th International Computer Science Symposium in Russia (Milovanov, 2015).
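
For readers unfamiliar with the terminology, the following is a sketch of the standard definitions used in algorithmic statistics, which the notions of ‘simple’ and ‘typical’ above follow up to the usual logarithmic precision; the exact complexity variant and constants used in the paper are not reproduced here.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Sketch of the standard definitions from algorithmic statistics, given here
% for context only (the paper may differ in constants or complexity variant).
% $C(x)$ denotes Kolmogorov complexity, $C(x \mid A)$ conditional complexity.

Typicality of a string $x$ in a finite set $A \ni x$ is measured by the
\emph{randomness deficiency}
\[
  d(x \mid A) = \log_2 |A| - C(x \mid A),
\]
which is small when $x$ is a typical element of $A$.

A string $x$ is called \emph{$(\alpha,\beta)$-stochastic} if there is a
finite set $A \ni x$ with
\[
  C(A) \le \alpha \qquad\text{and}\qquad d(x \mid A) \le \beta .
\]
% Informally, antistochastic strings are those that are non-stochastic in
% the strongest possible sense: no set better than the trivial models
% (the set of all strings of the same length, or the singleton $\{x\}$)
% serves as a good explanation for them.
\end{document}
```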
