Abstract

Two well‐established, robust laws in the behavioural sciences are Zipf's law in linguistics and Bradford's law in informetrics; both are similar power‐law functions. In earlier work (1992) the authors developed an information‐theoretic model for Zipf's law based on classical Shannon theory; a newer version, based on algorithmic information theory, was proposed in 1996. These two models are now extended to Bradford's law. The meaning of a discourse, ignored in Shannon's theory, is given special significance in the algorithmic information theory of language discourses, and Zipf's law is shown to be a consequence of an optimum meaning‐preserving code for the discourse. Such a code exhibits the characteristics of complex adaptive systems: a mixture of elements of order and randomness. A complexity function is defined for a discourse; it is maximal for a state intermediate between order and disorder, and it is shown to be nearly maximal for natural discourses. A general discussion of power‐law distributions reveals the uniqueness of Zipf's law as the only one associated with an optimal meaning‐preserving code. Power laws are a natural consequence of scale invariance, for which an elementary mathematical treatment is presented. Functions more complicated than a simple power law also show scale invariance, and one such function may in fact conform to some observations. Some general comments on the nature of scientific laws conclude with the suggestion that Zipf's law may indeed qualify for the title ‘mathematical law’.
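The link between scale invariance and power laws mentioned above can be illustrated numerically. The sketch below is an assumption-laden toy example, not the paper's derivation: for a power law f(x) = C·x^(−a), rescaling the argument by any factor λ multiplies f by the constant λ^(−a), independently of x, so no value of x is singled out as a characteristic scale.

```python
# Toy illustration (not from the paper): a Zipf-type power law
# f(x) = C * x**(-a) is scale invariant, because
# f(lam * x) / f(x) = lam**(-a) for every x.

def f(x, C=1.0, a=1.0):
    """Hypothetical rank-frequency power law with exponent a."""
    return C * x ** (-a)

lam = 3.0  # arbitrary rescaling factor
ratios = [f(lam * x) / f(x) for x in (1.0, 10.0, 100.0)]
# Each ratio equals lam**(-a) = 1/3, regardless of x.
print(ratios)
```

Running the sketch prints the same ratio three times, which is the numerical signature of scale invariance: the functional form looks identical at every scale, differing only by a constant factor.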
