Abstract

Humpback whale song research has focused on analyzing the full song structure, rarely describing individual song units. Even less progress has been made in automatically distinguishing and classifying these individual units. Two different techniques were employed to study humpback call units: visual/aural and automated/statistical. Humpback whale songs were recorded in the Hawaiian Islands, both remotely with an autonomous acoustic recorder and by a snorkeler with a portable digital tape recorder. Song units collected by the autonomous acoustic recorders were aurally separated into 23 distinct units in a companion study. Song units collected by a snorkeler using the portable recorder off Maui were analyzed with a specialized Matlab script that defined 48 frequency and temporal parameters for each unit. From these 48 parameters, the units were separated into distinct categories using a multivariate categorical analysis. The distinct units identified by the two techniques were then compared to gauge whether automated methods could be used in future humpback whale studies. After this comparison, a principal component analysis (PCA) determined which of the 48 parameters were important in statistically distinguishing between the distinct units, furthering our understanding of the role of frequency and temporal features in categorizing song structure.
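The PCA step described above can be illustrated with a minimal sketch. The example below is not the study's Matlab script: it uses two invented parameters (peak frequency and duration) instead of the actual 48, with made-up values, and computes principal components via the closed-form eigendecomposition of a 2x2 covariance matrix. The leading eigenvector's loadings indicate which parameter contributes most to separating the units along the dominant axis of variation.

```python
import math

# Hypothetical measurements for illustration only: each row is one song
# unit, columns are two invented parameters (peak frequency in Hz,
# duration in s). Values are not from the study.
units = [
    (310.0, 0.80), (295.0, 0.90), (305.0, 0.85),    # low-pitched, long units
    (1200.0, 0.20), (1150.0, 0.25), (1250.0, 0.18), # high-pitched, short units
]

def pca_2d(rows):
    """PCA on two features via the closed-form eigendecomposition
    of the 2x2 sample covariance matrix [[a, b], [b, c]]."""
    n = len(rows)
    mx = sum(r[0] for r in rows) / n
    my = sum(r[1] for r in rows) / n
    # Sample covariance entries (divisor n - 1)
    a = sum((r[0] - mx) ** 2 for r in rows) / (n - 1)
    c = sum((r[1] - my) ** 2 for r in rows) / (n - 1)
    b = sum((r[0] - mx) * (r[1] - my) for r in rows) / (n - 1)
    # Eigenvalues of a symmetric 2x2 matrix
    half_trace = (a + c) / 2
    d = math.sqrt(((a - c) / 2) ** 2 + b * b)
    l1, l2 = half_trace + d, half_trace - d
    # Unit-length eigenvector for the leading eigenvalue
    vx, vy = (b, l1 - a) if abs(b) > 1e-12 else (1.0, 0.0)
    norm = math.hypot(vx, vy)
    return l1, l2, (vx / norm, vy / norm)

l1, l2, pc1 = pca_2d(units)
explained = l1 / (l1 + l2)
print(f"PC1 explains {explained:.1%} of variance; loadings = {pc1}")
```

Note that with raw units the frequency parameter (variance in Hz squared) dwarfs duration (variance in seconds squared), so in any real application the 48 parameters would be standardized to zero mean and unit variance before the PCA.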

