Abstract
This thesis combines computability theory with various notions of fractal dimension, mainly Hausdorff dimension. An algorithmic approach to Hausdorff measures makes it possible to define the Hausdorff dimension of individual points, rather than sets, in a metric space. This idea was first realized by Lutz (2000). Working in the Cantor space of all infinite binary sequences, we study the theory of Hausdorff and other dimensions for individual sequences. After giving an overview of the classical theory of fractal dimension in Cantor space, we systematically develop the theory of effective Hausdorff dimension and its variants. Our presentation is inspired by the approach to algorithmic information theory developed by Kolmogorov and his students. We give a new and much easier proof of a central result of the effective theory: effective Hausdorff dimension coincides with the lower asymptotic algorithmic entropy, defined in terms of Kolmogorov complexity. In addition, we prove a general theorem on the behavior of effective dimension under r-expansive mappings, which can be seen as generalizations of Hölder mappings in Cantor space. Furthermore, we study the connections between other notions of effective fractal dimension and algorithmic entropy. We also show that the set of sequences of effective Hausdorff dimension s has Hausdorff dimension s and infinite s-dimensional Hausdorff measure (for every 0 < s < 1). Next, we study the Hausdorff dimension (effective and classical) of objects arising in computability theory. We prove that the upper cone of any sequence under a standard reducibility has Hausdorff dimension 1, thereby exposing a Lebesgue nullset of maximal Hausdorff dimension. Furthermore, using the behavior of effective dimension under r-expansive transformations, we show that the effective Hausdorff dimension of the lower cone and of the degree of a sequence coincide.
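The central characterization mentioned above can be stated concisely in standard notation (a sketch, where K denotes prefix-free Kolmogorov complexity and X↾n the length-n prefix of the sequence X):

```latex
% Effective Hausdorff dimension of a sequence X equals its lower
% asymptotic algorithmic entropy:
%   K            = prefix-free Kolmogorov complexity
%   X \restriction n = the length-n prefix of X
\[
  \dim_{\mathrm{H}}(X) \;=\; \liminf_{n \to \infty} \frac{K(X \restriction n)}{n}.
\]
```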
For many-one reducibility, we prove the existence of lower cones of non-integral dimension. After giving some 'natural' examples of sequences of effective dimension 0, we prove that every effectively closed set of positive Hausdorff dimension admits a computable surjective mapping onto Cantor space. We go on to study more closely the complex interrelation between algorithmic entropy, randomness, effective Hausdorff dimension, and reducibility. For this purpose we generalize effective Hausdorff dimension by introducing the notion of strong effective Hausdorff measure 0. We show that a sequence that does not have strong effective Hausdorff measure 0 does not necessarily allow one to compute a Martin-Löf random sequence, that is, a sequence of highest possible algorithmic entropy. Moreover, we show that generalizing the notion of effective randomness to noncomputable measures yields a very coarse concept of randomness, in the sense that every noncomputable sequence is random with respect to some measure. Next, we introduce Schnorr dimension, a notion of dimension which is algorithmically more restrictive than effective dimension. We prove a machine characterization of Schnorr dimension and show that, on the computably enumerable sets, Schnorr Hausdorff dimension and Schnorr packing dimension do not coincide, in contrast to the case of effective dimension. We also study subrecursive notions of effective Hausdorff dimension. Using resource-bounded martingales, we transfer the use of r-expansiveness to the resource-bounded case, which enables us to show that the Small-Span Theorem does not hold for dimension in exponential time E. Finally, we investigate the effective Hausdorff dimension of sequences against which no computable nonmonotonic betting strategy can succeed. Computable nonmonotonic betting games generalize computable martingales, and it is a major open question whether the randomness notion they induce is equivalent to Martin-Löf randomness.
We show that sequences which are random with respect to computable nonmonotonic betting games have effective Hausdorff dimension 1, which implies that, from the viewpoint of algorithmic entropy, they are rather close to Martin-Löf randomness.