Abstract

A neural model of motion gradient detection for visual navigation

Florian Raudies1*, Stefan Ringbauer1 and Heiko Neumann1

1 University of Ulm, Institute of Neural Information Processing, Germany

Problem: Spatial navigation based on visual input (Fajen and Warren, TICS, 4, 2000) is important for tasks such as steering towards a goal or avoiding collisions with stationary obstacles and independently moving objects (IMOs). Observer movement induces global motion patterns, while obstacles and IMOs lead to local disturbances in the optical flow. How is this information about flow changes used to support navigation, and what are the neural mechanisms that produce this functionality?

Method: A biologically inspired model is proposed to estimate and integrate optical flow from a spatio-temporal sequence of images. The model employs a log-polar velocity space in which optical flow is represented by a population code (Raudies & Neumann, Neurocomp, 2008). Extending the model proposed by Ringbauer et al. (ICANN, 2007), motion gradients are calculated locally with respect to the flow direction (tangentially) on the basis of the population-encoded optical flow. The gradients themselves are encoded in a population of responses for angular and speed differences that are independent of the underlying flow direction (Tsotsos, CVIU, 100, 2005). For motion prediction, the estimated motion is modified according to the gradient responses and fed back into the motion processing loop. Local flow changes estimated in model area MT are further integrated in model area MSTd to represent global motion patterns (Graziano, J. of Neuroscience, 14, 1994).

Results: The proposed model was probed with several motion sequences, such as the flower garden sequence (http://www-bcs.mit.edu/people/jyawang/demos/garden-layer/orig-seq.html), which contains motion parallax at different spatial scales. It is shown that motion parallax occurs in conjunction with occlusions and disocclusions, e.g. when the foreground moves faster than the background. Employing motion gradients, disocclusions are detected as locations of local acceleration and occlusions as locations of local deceleration in model area MT (supplementary Fig. 1). More complex configurations occur at the motion boundaries of an IMO. A sequence is investigated that contains a rectangular IMO in front of a wall, observed during a forward movement that is slightly deflected sideways. As in the flower garden sequence, local occlusions and disocclusions are detected at the vertical boundaries of the IMO in model area MT. In addition, the gradients encode not only the discriminating speed but also the angular difference; thus, they encode how different parts of the foreground and background move relative to each other. Moreover, model area MST signals a global expansion pattern as an indicator of forward observer motion (supplementary Fig. 2).

Conclusion: The role of motion gradients in navigation is twofold: (i) in model area MT, local motion changes (e.g. accelerations and decelerations) are detected, indicating obstacle or IMO boundaries, while (ii) in model area MST, global motion patterns (e.g. expansion) are encoded. If an IMO is present in the input sequence, motion gradients always occur; however, motion gradients are also detected when no IMO is present, e.g. at depth discontinuities.

Acknowledgments: Supported by BMBF 01GW0763 (BPPL) and the Graduate School of the University of Ulm.
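Illustrative sketch: the following minimal Python example shows one way to compute a tangential motion gradient from a dense optical flow field, in the spirit of the occlusion/disocclusion analysis above. It is not the authors' model, which operates on a population code in a log-polar velocity space; the function name, variable names, and threshold are hypothetical.

```python
import numpy as np

def tangential_flow_gradients(u, v, eps=1e-9):
    """Speed and direction change along the local flow direction.

    u, v : 2-D arrays holding the horizontal/vertical components of a dense
           optical flow field (one value per pixel).
    Returns (speed_grad, angle_grad), the directional derivatives of flow
    speed and flow direction taken along the flow itself (tangentially).
    """
    # Spatial derivatives of the flow components; for image-like arrays
    # np.gradient returns (d/d rows, d/d columns), i.e. (d/dy, d/dx).
    du_dy, du_dx = np.gradient(u)
    dv_dy, dv_dx = np.gradient(v)

    speed = np.sqrt(u * u + v * v)
    sq = speed * speed + eps

    # Chain rule: d|w|/dx = (u*du/dx + v*dv/dx) / |w|, analogously for y.
    ds_dx = (u * du_dx + v * dv_dx) / (speed + eps)
    ds_dy = (u * du_dy + v * dv_dy) / (speed + eps)

    # Derivative of atan2(v, u): d(angle) = (u*dv - v*du) / (u^2 + v^2);
    # this avoids the 2*pi wrap-around of differentiating the angle directly.
    da_dx = (u * dv_dx - v * du_dx) / sq
    da_dy = (u * dv_dy - v * du_dy) / sq

    # Unit vector of the local flow direction (the tangential direction).
    tx, ty = u / (speed + eps), v / (speed + eps)

    speed_grad = tx * ds_dx + ty * ds_dy   # >0: acceleration, <0: deceleration
    angle_grad = tx * da_dx + ty * da_dy   # direction change along the flow
    return speed_grad, angle_grad


# Usage: flag candidate disocclusions (local acceleration) and occlusions
# (local deceleration); the threshold is an arbitrary illustrative value.
if __name__ == "__main__":
    u = np.random.randn(64, 64)
    v = np.random.randn(64, 64)
    speed_grad, angle_grad = tangential_flow_gradients(u, v)
    disocclusion = speed_grad > 0.1
    occlusion = speed_grad < -0.1
```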
Conference: Bernstein Conference on Computational Neuroscience, Frankfurt am Main, Germany, 30 Sep - 2 Oct, 2009.
Presentation Type: Poster Presentation
Topic: Abstracts
Citation: Raudies F, Ringbauer S and Neumann H (2009). A neural model of motion gradient detection for visual navigation. Front. Comput. Neurosci. Conference Abstract: Bernstein Conference on Computational Neuroscience. doi: 10.3389/conf.neuro.10.2009.14.018
Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters. The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated. Each abstract, as well as the collection of abstracts, are published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed. For Frontiers' terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.
Received: 25 Aug 2009; Published Online: 25 Aug 2009.
* Correspondence: Florian Raudies, University of Ulm, Institute of Neural Information Processing, Ulm, Germany, florian.raudies@uni-ulm.de


