Abstract

The definition of descriptional complexity, or algorithmic information, in the sense of Kolmogorov or Chaitin rests on two important properties of computable functions: the existence of universal machines and invariance under the choice of machine. Recently, a notion of descriptional complexity for finite-state computable functions was introduced by Calude et al. For the latter theory one cannot rely on the existence of universal machines; instead, the conclusions are based on an invariance theorem for finite transducers.

This raises the question of which assumptions in algorithmic information theory are actually needed. We answer this question in a general setting, called an encoded function space. Without any assumptions about the encodings of functions and arguments, and without any assumptions about computability or computing models, we introduce a notion of complexity. On this basis alone, a general invariance theorem is proved, and sufficient conditions are stated under which complexity is computable. Next, universal functions are introduced, defined by means of pairing functions. It is shown that properties of the pairing functions, that is, of the joint encodings of functions and their inputs, determine the relation between the complexities measured with respect to different universal functions. In particular, without any further assumptions, for length-bounded or length-preserving pairing functions one can prove that complexity is independent of the choice of the universal function up to an additive constant. Some of the fundamental results of algorithmic information theory are obtained as corollaries.

Keywords: Turing machine, computable function, additive constant, universal function, pairing function
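As a minimal sketch of the classical statements the abstract generalizes, the following display recalls the standard Kolmogorov-style definitions; the symbols C_f, F_e, U, and the pairing notation are illustrative and not taken from the paper, which works in abstract encoded function spaces rather than with Turing machines.

```latex
% Complexity of x with respect to a partial function f:
% the length of a shortest description p that f maps to x.
\[
  C_f(x) = \min \{\, |p| \;:\; f(p) = x \,\}
\]
% A universal function U combines an enumeration (F_e) of functions
% with a pairing function <e,p> that jointly encodes a function
% index together with its input:
\[
  U(\langle e, p \rangle) = F_e(p)
\]
% Classical invariance theorem: for every f there is a constant c_f,
% independent of x, such that
\[
  C_U(x) \le C_f(x) + c_f \qquad \text{for all } x .
\]
% A plausible reading of "length-bounded" in this context:
% |<e,p>| <= |p| + c_e for a constant c_e depending only on e,
% which lets the invariance constant c_f be read off directly.
```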
