Facial expression studies in animal communication are essential. However, manual inspection methods are only practical for small datasets. Deep learning techniques can help discriminate facial configurations associated with vocalisations across large datasets. We extracted and labelled frames of different primate species, trained deep-learning models to identify key points on their faces, and computed distances between them to identify facial gestures. We then used machine learning algorithms to classify vocalised and non-vocalised gestures across different species. The algorithms showed higher-than-chance correct classification rates, with some exceeding 90%. Our work employs deep learning to map primate facial gestures and offers an innovative application of pose estimation systems. Our approach facilitates the investigation of facial repertoires across primate species and behavioural contexts, enabling comparative research in primate communication.
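The pipeline described above (facial key points → pairwise distances → classification of vocalised vs non-vocalised frames) can be sketched as follows. This is a minimal illustration with synthetic data, not the authors' implementation: the key points, the two-class setup, and the nearest-centroid classifier standing in for the paper's machine learning algorithms are all assumptions made for the example.

```python
import numpy as np

def pairwise_distances(keypoints):
    """Euclidean distances between all pairs of facial key points.

    keypoints: (n_points, 2) array of (x, y) image coordinates,
    e.g. as output by a pose estimation model.
    Returns the upper-triangle distances as a flat feature vector.
    """
    diffs = keypoints[:, None, :] - keypoints[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    iu = np.triu_indices(len(keypoints), k=1)
    return dists[iu]

# Synthetic stand-in data: vocalised frames get a wider facial
# configuration (larger spread of key points) than non-vocalised ones.
rng = np.random.default_rng(0)
voc = [pairwise_distances(rng.normal(0.0, 2.0, (5, 2))) for _ in range(50)]
non = [pairwise_distances(rng.normal(0.0, 1.0, (5, 2))) for _ in range(50)]
X = np.vstack(voc + non)
y = np.array([1] * 50 + [0] * 50)  # 1 = vocalised, 0 = non-vocalised

# A minimal nearest-centroid classifier: each class is summarised by the
# mean of its distance features; a frame is assigned to the closer mean.
centroids = np.stack([X[y == c].mean(axis=0) for c in (0, 1)])

def predict(features):
    return int(np.argmin(np.linalg.norm(centroids - features, axis=1)))

accuracy = np.mean([predict(x) == t for x, t in zip(X, y)])
```

In practice the key points would come from a trained pose estimation model rather than random draws, and a stronger classifier would likely replace the nearest-centroid rule; the structure of the features (inter-keypoint distances) is the part taken from the abstract.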