Abstract

How much information should senders try to provide and receivers try to extract from signals? Using a simple but generalizable model of communication, it will be shown that the net payoff to either party for communicating, called the value of information, is not a linear, or even monotonic, function of the amount of information. Because receivers invariably have a default (no-signal) strategy that is better than chance, a signal must exceed a minimal threshold of accurate information before receivers should attend to it. Because costs of producing and analyzing increasingly accurate signals tend to rise faster than benefits above the minimal threshold, there is also an upper limit on how much information is worth the effort. These conclusions have important implications for how new signals are likely to evolve, and how much error we should expect to see in real animal communication systems.
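The threshold and upper-limit claims can be illustrated with a toy calculation. The sketch below is not the paper's model: the payoff function, the superlinear cost form, and every parameter (`p_default`, `benefit`, `cost_coef`) are assumptions chosen only to show why the value of information can be negative at low signal accuracy and decline again at very high accuracy.

```python
# A minimal decision-theoretic sketch (not the paper's model): a receiver
# chooses between a default (no-signal) strategy that is correct with
# probability p_default > chance, and attending to a signal whose accuracy
# is q. Producing and decoding more accurate signals is assumed to carry
# a cost that rises faster than linearly with q.

def value_of_information(q, p_default=0.6, benefit=1.0, cost_coef=0.4):
    """Net payoff of attending to a signal of accuracy q, relative to
    ignoring it and relying on the default strategy."""
    payoff_with_signal = benefit * q - cost_coef * q**6  # assumed superlinear cost
    payoff_default = benefit * p_default                 # default strategy, no signal cost
    return payoff_with_signal - payoff_default

for q in (0.50, 0.60, 0.70, 0.80, 0.90, 1.00):
    print(f"signal accuracy {q:.2f}: value of information {value_of_information(q):+.3f}")
```

Under these arbitrary parameters the net value is negative until accuracy clears roughly 0.63 (the minimal threshold set by the better-than-chance default strategy), peaks near 0.85, and falls back toward zero as the cost of near-perfect accuracy outpaces its benefit, mirroring the two conclusions stated in the abstract.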
