Abstract

The application of information theory to biology can be broadly split into three areas: (i) at the level of the genome, considering the storage of information using the genetic code; (ii) at the level of the individual animal, where communication passes information from one animal to another (usually, but not always, for mutual benefit); (iii) at the level of the population, whose diversity can be measured using population entropy. This paper is concerned with the second area. We consider the evolution of an individual's ability to obtain and process information using the ideas of evolutionary game theory. An important part of game theory is the definition of the information available to the participants. Such games tend to treat information as a static quantity whilst behaviour is strategic. We consider game-theoretic modelling where the use of information is itself strategic and can thus evolve. A simple model is developed which shows how the information-acquiring ability of animals can evolve through time. The model predicts that there is likely to be an optimal level of information for any particular contest, rather than more information being inherently better. The total information required for optimal performance corresponds to approximately the same entropy, regardless of the value of the individual pieces of information concerned.
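As a toy illustration of the trade-off described above (this is not the paper's model; the benefit and cost functions below are assumptions chosen purely for illustration), the following Python sketch shows how a fitness function with diminishing returns to information and a per-bit acquisition cost can have an interior optimum, i.e. an optimal level of information rather than "more is always better".

```python
import numpy as np

# Toy sketch, not the model from the paper: fitness of acquiring H bits of
# information when the benefit saturates but the acquisition cost keeps growing.
# benefit() and cost() are illustrative assumptions only.

def benefit(H, b=1.0, scale=1.0):
    """Diminishing returns: each extra bit of information helps less."""
    return b * (1.0 - np.exp(-H / scale))

def cost(H, c=0.3):
    """Linear acquisition cost per bit of information."""
    return c * H

H_values = np.linspace(0.0, 5.0, 501)        # candidate information levels (bits)
fitness = benefit(H_values) - cost(H_values)  # net pay-off at each level

best = H_values[np.argmax(fitness)]
print(f"Optimal information level in this toy model: {best:.2f} bits")
```

In this sketch the optimum sits where the marginal benefit of a further bit equals its marginal cost; the paper's game-theoretic model makes the analogous prediction for contests between animals.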

Highlights

  • The classical entropy of an observation with discrete probability distribution (p_i) is defined as H = −∑_i p_i log(p_i). This useful concept is of value in describing the information available in living systems in a variety of contexts, and is a bedrock of classical information theory

  • It is assumed that communication is of benefit to both communicating parties. In particular, Nowak and collaborators model the way that particular signals evolve to gain specific meanings. It is shown in Plotkin and Nowak (2000) that if there is a chance of mistaking signals for others, evolution leads to a given error limit, and that this limit has a natural interpretation in classical information theory

  • A simple model has been developed which shows how the information-acquiring ability of animals can evolve through time

Summary

Introduction

The classical entropy of an observation with discrete probability distribution (p_i) is defined as H = −∑_i p_i log(p_i). For some other examples of the use of these ideas, see Borodovsky and Peresetsky (1994), Hannenhalli and Russell (2000) and Kawashima et al. (1994). Another type of application is at the level of the population. The evolution of language has been considered in terms of information theory in a series of papers by Nowak and collaborators (Nowak and Krakauer, 1999; Nowak et al., 1999; Plotkin and Nowak, 2000). In particular, they model the way that particular signals evolve to gain specific meanings. It is shown in Plotkin and Nowak (2000) that if there is a chance of mistaking signals for others, evolution leads to a given error limit, and that this limit has a natural interpretation in classical information theory.
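Since this entropy is the central quantity, a minimal computation may help; the sketch below (the example distribution is an assumption, included only to show the calculation) evaluates H = −∑_i p_i log(p_i) for a discrete distribution.

```python
import math

def entropy(p, base=2):
    """Classical (Shannon) entropy H = -sum_i p_i * log(p_i) of a discrete distribution.

    Terms with p_i == 0 contribute nothing, following the convention 0 * log(0) = 0.
    """
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# Example: an observation taking four values with these (assumed) probabilities.
p = [0.5, 0.25, 0.125, 0.125]
print(entropy(p))   # 1.75 bits
```

With base 2 the result is measured in bits; natural logarithms give nats, matching whichever convention is used for the logarithm in the definition.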

Information and Game Theory
Discussion