Abstract

Many theories of nonhuman animal communication posit a first-order Markov model, in which the next signal depends only on the current one. Such a model precludes any hierarchical structure in the communication signal. Information theory and signal processing provide quantitative techniques for estimating the underlying complexity of an arbitrary signal or symbol sequence. Applying these techniques to humpback whale songs demonstrates that any first-order Markov model fails to attain the underlying complexity bound of these songs. Humpback songs are symbolized into alphabet sequences using spectrograms and self-organizing neural nets [Walker, unpublished]. The entropy of the song sequence is measured both with a first-order parametric Markov model and with a nonparametric sliding-window method [Kontoyiannis et al., IEEE Trans. Info. Theory 44, 1319–1327 (1998)]. Preliminary analyses suggest that the entropy of the first-order Markov model is significantly higher than that of the nonparametric estimate, implying that no first-order Markov source can reasonably model humpback songs. Furthermore, the symbolized song statistics are found to be locally but not globally stationary, implying that these songs possess a hierarchical structure. [Work supported by NSF Ocean Sciences.]
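The two entropy estimates contrasted above can be sketched in Python. This is a minimal illustration, not the authors' implementation: `markov_entropy_rate` computes the first-order (conditional) entropy rate in bits per symbol from empirical bigram counts, and `lz_entropy_estimate` is a match-length estimator in the spirit of Kontoyiannis et al. (1998), here in an increasing-window variant (the paper's sliding-window form differs in windowing details). The symbol sequence is assumed to be a string over a small alphabet, standing in for the symbolized song units; both function names are hypothetical.

```python
import math
from collections import Counter, defaultdict

def markov_entropy_rate(seq):
    """Entropy rate (bits/symbol) of a first-order Markov model
    fitted to seq by empirical bigram counts:
    H = -sum_a P(a) sum_b P(b|a) log2 P(b|a)."""
    trans = defaultdict(Counter)
    for a, b in zip(seq, seq[1:]):
        trans[a][b] += 1
    n_pairs = len(seq) - 1
    h = 0.0
    for a, nexts in trans.items():
        total = sum(nexts.values())
        p_a = total / n_pairs          # empirical P(a) as a "from" state
        for count in nexts.values():
            p_ba = count / total       # empirical P(b | a)
            h -= p_a * p_ba * math.log2(p_ba)
    return h

def lz_entropy_estimate(seq):
    """Nonparametric match-length entropy estimate (bits/symbol):
    Lambda_i is the length of the shortest substring starting at
    position i that does not appear in seq[:i]; the estimate is
    n * log2(n) / sum(Lambda_i)."""
    n = len(seq)
    total_lambda = 0
    for i in range(n):
        prefix = seq[:i]
        k = 1
        # Grow the match until the substring is novel (or hits the end).
        while i + k <= n and seq[i:i + k] in prefix:
            k += 1
        total_lambda += k
    return n * math.log2(n) / total_lambda
```

For a perfectly periodic sequence such as `"ab" * 50`, the Markov entropy rate is zero (every transition is deterministic) while the match-length estimate is small but positive, reflecting finite-length effects; the abstract's argument runs the other way, with the Markov estimate exceeding the nonparametric one on real song sequences.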
