Abstract

In this paper, new techniques that use conditional entropy to estimate the combinatorics of symbols are applied to animal communication studies in order to estimate repertoire size. Using conditional entropy estimates at multiple orders, the paper estimates total repertoire sizes for the communication of bottlenose dolphins, humpback whales, and several species of birds, for N-grams of length one to three. In addition to discussing the implications of this method for studies of animal communication complexity, the reliability of these estimates is compared with that of other methods through simulation. While entropy undercounts the total repertoire size because of rare N-grams, it gives a more accurate picture of the most frequently used repertoire than a raw repertoire count alone.

Highlights

  • The complexity of animal communication is a topic frequently discussed, but difficult to resolve. While it is beyond dispute that many species communicate, even the basic purpose of these communications, whether to convey information or merely to influence the behavior of others to increase the sender's own fitness, is hotly debated [1,2,3,4,5].

  • By measuring animal communication in terms of entropy in bits, these studies have examined the structure of animal communication at various lengths (N-grams) in order to determine the structure of the communications, whether the tools of information theory can lend themselves to a better understanding of animal behavior, and, possibly, what types of information can be communicated.

  • The information graphs are still shown as an illustration of the results of the studies on each animal's communication, and should be used with caution when establishing the complexity of sequences.

Summary

Introduction

The complexity of animal communication is a topic frequently discussed, but difficult to resolve. The complexity of animal language has been studied using many methods, including various techniques to estimate repertoire size, such as curve-fitting [8,9] and capture-recapture [9,10,11,12]. Other methods use information theory, either through measurements of conditional entropy [13,14] or through other measures, such as entropy rate and Lempel-Ziv complexity [15]. By measuring animal communication in terms of entropy in bits, these studies have examined the structure of animal communication at various lengths (N-grams) in order to determine the structure of the communications, whether the tools of information theory can lend themselves to a better understanding of animal behavior, and, possibly, what types of information can be communicated.
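As a minimal illustration of the kind of measurement these studies perform (a sketch, not the paper's own code), the order-k conditional entropy of a symbol sequence can be estimated from N-gram counts: count each (context, next-symbol) pair, then average the per-context surprisal in bits. The toy sequence below is hypothetical; real studies would use coded animal vocalization sequences.

```python
from collections import Counter
from math import log2

def conditional_entropy(seq, order):
    """Estimate H(X_n | X_{n-order}, ..., X_{n-1}) in bits from a symbol
    sequence. order=0 reduces to the plain Shannon entropy of single symbols."""
    # Counts of (context + symbol) N-grams and of the contexts alone.
    joint = Counter(tuple(seq[i:i + order + 1]) for i in range(len(seq) - order))
    context = Counter(tuple(seq[i:i + order]) for i in range(len(seq) - order))
    n = sum(joint.values())
    h = 0.0
    for gram, c in joint.items():
        p_joint = c / n                          # P(context, symbol)
        p_cond = c / context[gram[:order]]       # P(symbol | context)
        h -= p_joint * log2(p_cond)
    return h

# Hypothetical coded vocalization sequence for illustration.
seq = list("ABABCABABCABAB")
for k in range(3):
    print(f"order {k}: {conditional_entropy(seq, k):.3f} bits")
```

A drop in conditional entropy as the order increases indicates sequential structure: knowing more of the preceding context makes the next symbol more predictable. Note that such plug-in estimates are biased downward for short sequences, which is exactly the bias issue the paper's sections on entropy bias address.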

Information Theory and Animal Communication
Information Graphs and Order Complexity
Bias Measures in Entropy Estimates
Combinatorics of Information Theory and Repertoire Size
Combinatorics and Entropy Bias Estimates
Animal Communication
Bottlenose Dolphins
Humpback Whales
Wood Thrushes and Robins
European Skylarks
European Starlings
Animal Communication Entropy Data and Repertoire Estimates
Other Repertoire Counting Methods and Simulation
Findings
Conclusions