Abstract

A thorough analysis of animal behavior is essential for examining the relationship between the activation of specific neurons and elements of the external environment, behavior, or internal state. Machine learning methods have advanced the automatic segmentation of animal behavior from data on the location of animal body parts [1–3]. At present, however, these methods do not achieve the desired segmentation accuracy, nor do they relate an animal’s behavioral acts to key environmental factors. To address this issue, the authors have created a software package that extracts a variety of behavioral variables from video recordings of animals in experimental settings, enabling mathematical analysis of the continuum of behavioral acts. Reliable identification of specific body parts is crucial for extracting a wide range of behavioral variables. For this task, our team employed DeepLabCut, an open-source toolkit for tracking experimental animal behavior based on transfer learning with deep neural networks. We devised a procedure for determining the positions of animal body parts across diverse behavioral situations and selected a set of body parts that satisfies two criteria: high sensitivity to small motor movements of the animal and a high proportion of correctly located body parts. For top-view video recording, this set comprises the nose, ears, tail base, body center, forelimbs, hind limbs, and both flanks of the animal’s body. Next, we created software tools to extract and annotate behavioral variables from data on animal kinematics in various cognitive tasks. Our automated system comprises two main scripting modules: CreatePreset and BehaviorAnalyzer. The CreatePreset module interactively lets the user specify the arena geometry, object locations, and the temporal and spatial parameters required for analysis. Its output is saved as a mat-file and reused to analyze all videos of the experiment, provided that the relative position of the arena and the video camera, as well as the experimental design, remain constant. The BehaviorAnalyzer module preprocesses the time series of body-part coordinates into a kinematogram describing the kinematics of the body parts, then isolates individual behavioral acts and annotates the behavior with respect to motivational and environmental factors. Using mutual information-based methods, we analyzed the specialization of hippocampal CA1 neurons in animals as they explored arenas with varying degrees of novelty. This analysis identified neurons selective for continuous kinematic parameters of the animal’s posture and trajectory, including its location in the arena (X and Y coordinates), its speed, and the rotation angle of its head (i.e., absolute orientation in the arena). We also identified neurons specialized for discrete behavioral acts, including rest, locomotion, freezing, rearing, and interaction with objects. Furthermore, selective activation of neurons was found for an additional set of combined parameters joining the animal’s location in the arena with its speed.
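
To make the described analysis concrete, the sketch below (in Python, with hypothetical variable names; it is not the published package) illustrates how continuous kinematic variables such as speed and head orientation can be derived from tracked body-part coordinates, and how a simple histogram-based mutual information estimate can score a neuron's selectivity for such a variable.

import numpy as np

def head_angle(nose_xy, ears_mid_xy):
    # Absolute head orientation in the arena (radians), computed per frame
    # from the nose position and the midpoint between the ears.
    d = nose_xy - ears_mid_xy
    return np.arctan2(d[:, 1], d[:, 0])

def speed(center_xy, fps):
    # Instantaneous speed of the body center (arena units per second),
    # padded to the original number of frames.
    v = np.diff(center_xy, axis=0) * fps
    s = np.linalg.norm(v, axis=1)
    return np.concatenate(([s[0]], s))

def mutual_information(activity, kinematic_var, bins=16):
    # Plug-in histogram estimate of the mutual information (bits) between a
    # neuron's activity trace and a kinematic variable; a full analysis would
    # normally add bias correction and a shuffle-based significance test.
    joint, _, _ = np.histogram2d(activity, kinematic_var, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])))

# Hypothetical usage with DeepLabCut-tracked coordinates (T x 2 arrays) and a
# CA1 activity trace resampled to the video frame rate:
#   mi_speed = mutual_information(activity, speed(center_xy, fps=30.0))
#   mi_angle = mutual_information(activity, head_angle(nose_xy, ears_mid_xy))

Neurons whose mutual information with a given variable exceeds that expected by chance (for example, under a shuffling procedure) would then be treated as specialized for that variable, in the spirit of the analysis summarized above.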
