Abstract

Brain and behavioural asymmetries have been documented in various taxa. Many of these asymmetries involve preferential left- or right-eye use. However, measuring eye use through manual frame-by-frame analysis of video recordings is laborious and prone to bias. Recent technological progress has enabled the development of accurate tracking techniques for measuring animal behaviour. Amongst these techniques, DeepLabCut, a Python-based tracking toolbox that uses transfer learning with deep neural networks, makes it possible to track different body parts with unprecedented accuracy. Exploiting the capabilities of DeepLabCut, we developed Visual Field Analysis, an additional open-source application for extracting eye use data. To our knowledge, this is the first application that can automatically quantify left–right preferences in eye use. Here we test the performance of our application in measuring preferential eye use in young domestic chicks. A comparison with manual scoring methods revealed a near-perfect correlation with the measures of eye use obtained by Visual Field Analysis. With our application, eye use can be analysed reliably, objectively and at a fine scale across different experimental paradigms.
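
To make the eye-use measure concrete, the sketch below shows the kind of computation involved: classifying each video frame as left-eye, right-eye or frontal viewing from two tracked head keypoints and a known stimulus position, then summarising the preference as a laterality index. This is an illustration only, not the actual Visual Field Analysis code; the keypoint names, the frontal-sector threshold and the (R − L)/(R + L) index are assumptions chosen for the example.

```python
# Illustrative sketch (not the Visual Field Analysis implementation): classify
# each frame as left- or right-eye viewing from two tracked head keypoints
# (e.g. head centre and beak tip, as a DeepLabCut model might export them)
# and a fixed stimulus position, then summarise the left-right preference.
import numpy as np

def eye_use_per_frame(head, beak, stimulus, frontal_deg=15.0):
    """Return 'L', 'R' or 'B' (frontal/binocular) for each frame.

    head, beak : (n_frames, 2) arrays of x, y keypoint coordinates
    stimulus   : (2,) array with the stimulus position
    frontal_deg: assumed half-width of the frontal sector, in degrees
    """
    gaze = beak - head                       # head axis, pointing forward
    target = stimulus - head                 # direction from head to stimulus
    # Signed angle between head axis and stimulus direction; the sign convention
    # (positive = stimulus on the animal's left) depends on image coordinates.
    cross = gaze[:, 0] * target[:, 1] - gaze[:, 1] * target[:, 0]
    dot = (gaze * target).sum(axis=1)
    angle = np.degrees(np.arctan2(cross, dot))
    return np.where(np.abs(angle) <= frontal_deg, "B",
                    np.where(angle > 0, "L", "R"))

def laterality_index(labels):
    """(R - L) / (R + L): +1 = exclusive right-eye use, -1 = exclusive left-eye use."""
    right = np.sum(labels == "R")
    left = np.sum(labels == "L")
    return (right - left) / (right + left) if (right + left) else 0.0

# Toy usage with random coordinates standing in for DeepLabCut tracking output.
rng = np.random.default_rng(0)
head = rng.uniform(0, 100, size=(500, 2))
beak = head + rng.normal(0, 5, size=(500, 2))
labels = eye_use_per_frame(head, beak, stimulus=np.array([50.0, 50.0]))
print(f"laterality index: {laterality_index(labels):+.2f}")
```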
