Abstract

Optimal architectures for fast-learning, flexible networks

Vladimir Itskov 1*, Anda Degeratu 2 and Carina Curto 1

1 University of Nebraska-Lincoln, United States
2 Max Planck Institute, Germany

New memories (patterns) in some brain areas, such as the hippocampus, can be encoded quickly. Irrespective of the plasticity mechanism (or learning rule) used to encode patterns via changes in synaptic weights, rapid learning is perhaps most easily accomplished if new patterns can be learned through only small modifications of the initial synaptic weights. It may thus be desirable for fast-learning, flexible neural networks to have architectures that enable large numbers of patterns to be encoded by only small perturbations of the synaptic efficacies. What kinds of network architectures have this property? We define the perturbative capacity of a network to be the number of memory patterns that can be learned under small perturbations of the (effective) synaptic weights, and we propose that candidate architectures for fast-learning, flexible networks should be networks with high perturbative capacity.

What are the optimal architectures that maximize a network's perturbative capacity? We investigate this question for threshold-linear networks. Here the memory patterns encoded by the recurrent network are groups of neurons that co-fire at a stable fixed point for some external input. We prove that for an arbitrary matrix of effective synaptic weights, the network's memory patterns correspond to stable submatrices. This enables us to study the perturbative capacity of a network by analyzing the eigenvalues of submatrices under small perturbations of the synaptic weights.

In the case of symmetric threshold-linear networks, we find analytically the set of optimal network architectures that have maximal perturbative capacity. For these networks, any of the possible memory patterns can be selectively encoded via small perturbations of the synaptic weights. We show that these architectures correspond to a highly restricted set of possible sign patterns governing the effective interactions between principal neurons, and we describe these patterns completely. In particular, we find that at least half of the effective weights must be inhibitory; the optimal architectures thus reflect inhibition-stabilized networks with a significant level of inhibition. Finally, we study a larger class of threshold-linear networks, in which the matrices are no longer assumed symmetric, and find that our qualitative results continue to hold.

The optimal architectures we discover provide a benchmark for comparison with experimentally obtainable estimates of the effective interactions between pyramidal neurons in fast-learning networks. They also give clues as to the differences we might expect between recurrent networks in areas such as the hippocampus and more rigid networks in areas such as primary sensory cortices, where it may be undesirable for the allowed response patterns to be sensitive to small perturbations in synaptic weights.
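As a concrete illustration of the stable-submatrix characterization described above, the following Python sketch enumerates the candidate memory patterns of a small network and crudely estimates its perturbative capacity by sampling random weight perturbations. The assumed dynamics dx/dt = -x + [Wx + b]_+, the choice of -I + W as the effective interaction matrix, and all function names and parameters (eps, trials) are illustrative assumptions, not specified in the abstract.

```python
import itertools
import numpy as np

def stable_patterns(W, tol=1e-9):
    """Return subsets of neurons that can co-fire at a stable fixed point.

    Per the abstract, memory patterns of a threshold-linear network
    correspond to stable principal submatrices of the effective
    interaction matrix.  Here we take that matrix to be -I + W and
    call a subset sigma a pattern when every eigenvalue of the
    submatrix (-I + W)_sigma has negative real part.  The underlying
    dynamics dx/dt = -x + [W x + b]_+ motivating this test are an
    assumption; the abstract does not spell them out.
    """
    n = W.shape[0]
    A = -np.eye(n) + W                       # effective interaction matrix
    patterns = set()
    for k in range(1, n + 1):
        for sigma in itertools.combinations(range(n), k):
            sub = A[np.ix_(sigma, sigma)]    # principal submatrix on sigma
            if np.linalg.eigvals(sub).real.max() < -tol:
                patterns.add(sigma)
    return patterns

def perturbative_capacity(W, eps=0.05, trials=200, seed=0):
    """Monte-Carlo estimate of how many distinct patterns appear under
    small random perturbations of the synaptic weights -- a crude
    stand-in for the perturbative capacity defined in the abstract."""
    rng = np.random.default_rng(seed)
    reachable = set()
    for _ in range(trials):
        D = rng.normal(scale=eps, size=W.shape)
        D = (D + D.T) / 2                    # stay in the symmetric case
        reachable |= stable_patterns(W + D)
    return len(reachable)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    W = rng.normal(scale=0.3, size=(5, 5))
    W = (W + W.T) / 2                        # symmetric weights
    np.fill_diagonal(W, 0.0)                 # no self-connections
    print(len(stable_patterns(W)), "patterns;",
          perturbative_capacity(W), "reachable under small perturbations")
```

Note that the sampling explores only a finite set of perturbations, so the reported count is a lower bound on the number of patterns reachable in principle; the abstract's analytical results characterize the architectures for which this count is maximal.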
Conference: Computational and Systems Neuroscience 2010, Salt Lake City, UT, United States, 25 Feb - 2 Mar 2010.
Presentation Type: Poster Presentation
Topic: Poster session III
Citation: Itskov V, Degeratu A and Curto C (2010). Optimal architectures for fast-learning, flexible networks. Front. Neurosci. Conference Abstract: Computational and Systems Neuroscience 2010. doi: 10.3389/conf.fnins.2010.03.00214
Received: 04 Mar 2010; Published Online: 04 Mar 2010.
* Correspondence: Vladimir Itskov, University of Nebraska-Lincoln, Lincoln, United States, vitskov2@math.unl.edu
