Decoding spatial information in population of neurons for the ocular following response

Laurent Perrinet1*
1 INCM, CNRS, France

A short presentation of a large moving pattern elicits an Ocular Following Response (OFR) that exhibits many of the properties attributed to low-level motion processing, such as spatial and temporal integration, contrast gain control and divisive interaction between competing motions. Similar mechanisms have been demonstrated in V1 cortical activity in response to center-surround grating patterns measured with real-time optical imaging in awake monkeys. More recent OFR experiments have used disk gratings and bipartite stimuli that are optimized to study the dynamics of center-surround integration. We quantified two main characteristics of the global spatial integration of motion from an intermediate map of possible local translation velocities: (i) a finite optimal stimulus size for driving the OFR, surrounded by an antagonistic modulation, and (ii) a direction-selective suppressive effect of the surround on the contrast gain control of the central stimuli. We explore here the hypothesis that these effects can be understood as the result of a recurrent prediction of information in the velocity map. Indeed, in previous models the integration step assumes that local information is independent, whereas natural scenes are highly predictable: because of the rigidity and inertia of physical objects in visual space, neighboring local spatiotemporal information is redundant, and this a priori knowledge of the input statistics may be introduced into the ideal observer model. We implement this in a realistic model of a layer representing velocities in a map of cortical columns, where predictions are implemented by lateral interactions within the cortical area. First, raw velocities are estimated locally from images and propagated to this area in a feed-forward manner. Results show that this simple model is sufficient to disambiguate characteristic patterns such as line endings in the Barber-Pole illusion. Because the recurrent network modulates the velocity map, the model also explains why the representation may exhibit some memory, for instance when an object suddenly disappears or when a dot is presented followed by a line (the line-motion illusion).

Conference: Neuroinformatics 2009, Pilsen, Czechia, 6 Sep - 8 Sep, 2009.
Presentation Type: Poster Presentation
Topic: Computational neuroscience
Citation: Perrinet L (2019). Decoding spatial information in population of neurons for the ocular following response. Front. Neuroinform. Conference Abstract: Neuroinformatics 2009. doi: 10.3389/conf.neuro.11.2009.08.119
Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters. The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated. Each abstract, as well as the collection of abstracts, is published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed. For Frontiers' terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.
Received: 25 May 2009; Published Online: 09 May 2019.
* Correspondence: Laurent Perrinet, INCM, CNRS, 13331 Marseille, France, laurent.perrinet@univ-amu.fr
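
Illustrative sketch. The abstract describes a layer of cortical columns representing local velocities, driven in a feed-forward manner by raw local motion estimates and modulated by recurrent lateral interactions that implement a prediction. A minimal numerical sketch of this idea is given below; it is not the code used in the study. The map size, the toy barber-pole-like input, the function relax_velocity_map and its confidence-weighted update rule are all illustrative assumptions, intended only to show how reliable measurements at line endings can propagate through lateral interactions and disambiguate the aperture-limited interior of a pattern.

# Minimal sketch (NOT the authors' model): a map of velocity estimates is
# recurrently relaxed so that each column blends its own estimate with a
# lateral "prediction" coming from its neighbors. Map size, confidence
# values and the update rule are illustrative assumptions.
import numpy as np

def relax_velocity_map(vx, vy, confidence, n_iter=500):
    """Blend each column's velocity with the mean of its 4 neighbors,
    weighted by the column's feed-forward confidence."""
    vx, vy = vx.astype(float).copy(), vy.astype(float).copy()
    for _ in range(n_iter):
        # lateral prediction: average velocity of the 4 nearest columns
        px = 0.25 * (np.roll(vx, 1, axis=0) + np.roll(vx, -1, axis=0)
                     + np.roll(vx, 1, axis=1) + np.roll(vx, -1, axis=1))
        py = 0.25 * (np.roll(vy, 1, axis=0) + np.roll(vy, -1, axis=0)
                     + np.roll(vy, 1, axis=1) + np.roll(vy, -1, axis=1))
        # unreliable columns (aperture problem) follow the prediction,
        # reliable columns (line endings) keep their feed-forward estimate
        vx = confidence * vx + (1.0 - confidence) * px
        vy = confidence * vy + (1.0 - confidence) * py
    return vx, vy

# Toy barber-pole-like input: interior columns only see the motion component
# normal to the bars (here at 45 degrees, low confidence), while the two
# border rows mimic line endings signalling the true horizontal motion.
n = 16
vx = np.full((n, n), np.cos(np.pi / 4))
vy = np.full((n, n), np.sin(np.pi / 4))
confidence = np.full((n, n), 0.1)
vx[[0, -1], :] = 1.0
vy[[0, -1], :] = 0.0
confidence[[0, -1], :] = 1.0

vx_out, vy_out = relax_velocity_map(vx, vy, confidence)
print("interior velocity after relaxation:",
      round(float(vx_out[1:-1].mean()), 2), round(float(vy_out[1:-1].mean()), 2))

In this toy setting the interior velocity ends up close to (1, 0): the unambiguous motion signalled at the line endings has been propagated across the map by the lateral interactions, in the spirit of the disambiguation of the Barber-Pole illusion described in the abstract.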