Background: Generating quantitative metrics of rodent locomotion and general behaviour from video footage is important in behavioural neuroscience studies. However, there is not yet a free software system that can process large amounts of video data with minimal user intervention.
New method: Here we propose a new, automated rodent tracker (ART) that uses a simple rule-based system to quickly and robustly track rodent nose and body points with minimal user input. Tracked points can then be used to identify behaviours, approximate body size and provide locomotion metrics, such as speed and distance.
Results: ART was demonstrated here on video recordings of a SOD1 mouse model of amyotrophic lateral sclerosis, aged 30, 60, 90 and 120 days. Results showed a robust decline in locomotion speed, as well as a reduction in object exploration and forward movement, with an increase in time spent still. Body size approximations (centroid width) showed a significant decrease from P30.
Comparison with existing method(s): ART performed to a very similar accuracy as manual tracking and Ethovision (a commercially available alternative), with average differences in coordinate points of 0.6 and 0.8 mm, respectively. However, it required much less user intervention than Ethovision (6 as opposed to 30 mouse clicks) and worked robustly over more videos.
Conclusions: ART provides an open-source option for behavioural analysis of rodents, performing to the same standards as commercially available software. It can be considered a validated and accessible alternative for researchers for whom non-invasive quantification of natural rodent behaviour is desirable.
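To illustrate how tracked points can yield locomotion metrics such as speed and distance, the sketch below computes both from a sequence of per-frame coordinates. This is a minimal illustration, not ART's actual implementation; the frame rate and millimetre-per-pixel calibration values are assumptions for the example.

```python
# Minimal sketch: distance and speed from a tracked body point.
# FRAME_RATE_HZ and MM_PER_PIXEL are illustrative assumptions, not ART parameters.
import numpy as np

FRAME_RATE_HZ = 25.0   # assumed camera frame rate
MM_PER_PIXEL = 0.5     # assumed spatial calibration

def locomotion_metrics(points_px: np.ndarray):
    """points_px: (n_frames, 2) array of a tracked point (e.g. body centroid) in pixels.
    Returns total distance travelled (mm) and mean speed (mm/s)."""
    # Frame-to-frame displacement, converted from pixels to millimetres.
    steps_mm = np.linalg.norm(np.diff(points_px, axis=0), axis=1) * MM_PER_PIXEL
    total_distance_mm = steps_mm.sum()
    # Instantaneous speed per frame = displacement per frame * frames per second.
    mean_speed_mm_s = (steps_mm * FRAME_RATE_HZ).mean()
    return total_distance_mm, mean_speed_mm_s

# Usage example: a point moving 2 px per frame along both axes for 100 frames.
track = np.cumsum(np.full((100, 2), 2.0), axis=0)
dist, speed = locomotion_metrics(track)
print(f"distance = {dist:.1f} mm, mean speed = {speed:.1f} mm/s")
```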