Event Abstract

Measuring behavioral performance in neurosensory experiments: the free tracking-framework (FTF)

Friedrich Kretschmer1*, Malte Ahlers2 and Jutta Kretzberg1

1 Carl-von-Ossietzky-Universität Oldenburg, Department of Computational Neuroscience, Germany
2 Carl-von-Ossietzky-Universität Oldenburg, Department of Neurobiology, Germany

One of the main goals of neuroscience is to understand the neuronal basis of animal behavior. To approach this question experimentally, it is crucial to measure the animal's behavior in response to well-defined stimuli. By using the same stimuli for behavioral and neurophysiological experiments, neuronal responses can be related to behavioral performance. This approach makes it possible to determine the performance limits of a sensory system and to test hypotheses about the neural codes used in neurosensory information processing. For example, the combination of behavioral and neurophysiological experiments is particularly fruitful for the analysis of genetically modified animals, because it allows the function of the protein encoded by the knocked-out gene to be characterized at the systems level.

For this purpose we have developed a universal single-camera recording system suitable for automated tracking of body, head and eye movements in small animals. Combined with well-defined sensory stimulation, this system allows the reconstruction of the stimulus situation present at the receptor neurons during the experiment, providing the basis for corresponding neurophysiological experiments. The system tracks artificial and/or natural markers on the animal and estimates their 3D positions. To do so, the framework combines several techniques from computer vision research, including contour-, region- and pattern-based tracking algorithms. For 3D estimation, the relative distances between at least four static markers attached to the animal are measured automatically in a series of different camera views. The markers' distances to the camera center can then be computed during an experiment from their perspective distortion (the Perspective-n-Point problem). The modular implementation of the software framework enables the use of different tracking algorithms and its application in many different experimental environments. Tracking runs online or offline, depending on the complexity of the applied algorithms.

The free tracking-framework is currently being evaluated in a series of eye- and head-tracking experiments in mice and turtles, with visual stimulation by user-defined moving 360° images presented through a panoramic mirror. At the conference, we will demonstrate the current version of the free tracking-framework. We will show how the camera can be calibrated automatically and how different artificial markers attached to moving objects are tracked and their 3D positions estimated. The resolution of a standard webcam provides sufficient precision for the 3D estimation in many applications, and the source code of this academic project will be released as open source in the future. The free tracking-framework is therefore designed to provide the basis for low-cost experimental setups for tracking animal behavior, complementing studies in neurosensory science.
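The abstract does not name the underlying libraries, so the following is only a minimal sketch of how the described Perspective-n-Point step could look in Python with OpenCV; the marker geometry, pixel coordinates and camera intrinsics are made-up placeholders, not values from the FTF.

```python
# Illustrative sketch (not the authors' code): estimating the 3D pose of a
# marker rig from a single camera view by solving the Perspective-n-Point
# problem with OpenCV. All numeric values below are placeholders.
import numpy as np
import cv2

# 3D coordinates of (at least) four static markers attached to the animal,
# expressed in the rig's own coordinate system (e.g. millimetres). In the
# framework these relative distances are measured automatically from a
# series of camera views; here they are simply assumed (coplanar markers).
object_points = np.array([
    [0.0,  0.0,  0.0],
    [20.0, 0.0,  0.0],
    [0.0,  20.0, 0.0],
    [20.0, 20.0, 0.0],
], dtype=np.float64)

# 2D pixel positions of the same markers in the current frame,
# as delivered by a contour-, region- or pattern-based tracker.
image_points = np.array([
    [320.0, 240.0],
    [400.0, 238.0],
    [322.0, 310.0],
    [398.0, 306.0],
], dtype=np.float64)

# Camera intrinsics from a prior calibration (focal lengths and principal
# point of a standard webcam); lens distortion is ignored for brevity.
camera_matrix = np.array([
    [800.0,   0.0, 320.0],
    [  0.0, 800.0, 240.0],
    [  0.0,   0.0,   1.0],
])
dist_coeffs = np.zeros(5)

# Solve the Perspective-n-Point problem: rvec/tvec map rig coordinates
# into the camera coordinate system.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
if ok:
    # tvec gives the rig origin's position relative to the camera center;
    # its third component is the distance along the optical axis.
    print("Marker rig position in camera coordinates:", tvec.ravel())
```

Given at least four tracked markers with known relative geometry, the translation vector returned by such a solver already contains the quantity the abstract refers to: the distance of the markers to the camera center during the experiment.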
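The demonstration also covers automatic camera calibration. A common way to obtain the intrinsic parameters needed for such 3D estimation is checkerboard calibration; the sketch below again assumes OpenCV and a hypothetical folder of calibration frames, and is not necessarily how the FTF implements this step.

```python
# Illustrative sketch (assumed tooling, not the FTF implementation):
# recovering camera intrinsics from several views of a printed checkerboard.
import glob
import numpy as np
import cv2

pattern_size = (9, 6)          # inner corners of the checkerboard
square_size = 25.0             # edge length of one square in millimetres

# 3D coordinates of the checkerboard corners in the board's own plane (z = 0).
board_points = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
board_points[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2)
board_points *= square_size

object_points, image_points = [], []
image_size = None
for path in glob.glob("calibration_frames/*.png"):   # hypothetical folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    image_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if found:
        object_points.append(board_points)
        image_points.append(corners)

# Estimate the camera matrix and lens distortion from all detected views.
rms, camera_matrix, dist_coeffs, rvecs, tvecs = cv2.calibrateCamera(
    object_points, image_points, image_size, None, None)
print("RMS reprojection error:", rms)
print("Camera matrix:\n", camera_matrix)
```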
Conference: Neuroinformatics 2010, Kobe, Japan, 30 Aug - 1 Sep, 2010.

Presentation Type: Oral Presentation

Topic: General neuroinformatics

Citation: Kretschmer F, Ahlers M and Kretzberg J (2010). Measuring behavioral performance in neurosensory experiments: the free tracking-framework (FTF). Front. Neurosci. Conference Abstract: Neuroinformatics 2010. doi: 10.3389/conf.fnins.2010.13.00078

Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters. The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated. Each abstract, as well as the collection of abstracts, is published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed. For Frontiers' terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.

Received: 14 Jun 2010; Published Online: 14 Jun 2010.

* Correspondence: Friedrich Kretschmer, Carl-von-Ossietzky-Universität Oldenburg, Department of Computational Neuroscience, Oldenburg, Germany, friedrich.kretschmer@informatik.uni-oldenburg.de