Abstract

Modeling self-organization of neural networks for unsupervised learning using Hebbian and anti-Hebbian plasticity has a long history in neuroscience. Yet derivations of single-layer networks with such local learning rules from principled optimization objectives became possible only recently, with the introduction of similarity matching objectives. What explains the success of similarity matching objectives in deriving neural networks with local learning rules? Here, using dimensionality reduction as an example, we introduce several variable substitutions that illuminate the success of similarity matching. We show that the full network objective may be optimized separately for each synapse using local learning rules in both the offline and online settings. We formalize the long-standing intuition of the rivalry between Hebbian and anti-Hebbian rules by formulating a min-max optimization problem. We introduce a novel dimensionality reduction objective using fractional matrix exponents. To illustrate the generality of our approach, we apply it to a novel formulation of dimensionality reduction combined with whitening. We confirm numerically that the networks with learning rules derived from principled objectives perform better than those with heuristic learning rules.
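For orientation, the similarity matching objective referenced above takes the following form in the standard formulation (here $\mathbf{X} \in \mathbb{R}^{n \times T}$ stacks the inputs column-wise, $\mathbf{Y} \in \mathbb{R}^{k \times T}$ stacks the outputs, and $k < n$ for dimensionality reduction; the normalization is a common convention and may differ from the paper's):

```latex
\min_{\mathbf{Y} \in \mathbb{R}^{k \times T}} \;
\frac{1}{T^{2}} \left\lVert \mathbf{X}^{\top}\mathbf{X} - \mathbf{Y}^{\top}\mathbf{Y} \right\rVert_{F}^{2}
```

The objective asks that pairwise similarities of outputs, $\mathbf{y}_t^{\top}\mathbf{y}_{t'}$, match pairwise similarities of inputs, $\mathbf{x}_t^{\top}\mathbf{x}_{t'}$, which is what makes purely local (Hebbian/anti-Hebbian) learning rules possible.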

Highlights

  • The human brain generates complex behaviors via the dynamics of electrical activity in a network of ∼10^11 neurons each making ∼10^4 synaptic connections

  • The variable substitution method we introduced in the previous section can be applied to other computational objectives in order to derive neural networks with local learning rules

  • Through transparent variable substitutions, we demonstrated why biologically plausible neural networks can be derived from similarity matching objectives; mathematically formalized the adversarial relationship between Hebbian feedforward and anti-Hebbian lateral connections as a min-max optimization, lending itself to a game-theoretic interpretation; and formulated dimensionality reduction tasks as optimizations of fractional powers of matrices


Introduction

The human brain generates complex behaviors via the dynamics of electrical activity in a network of ∼10^11 neurons each making ∼10^4 synaptic connections. In most existing single-layer networks (Figure 1), Hebbian and anti-Hebbian learning rules were postulated rather than derived from a principled objective. Until recently, all derivations of single-layer networks from principled objectives led to biologically implausible non-local learning rules, where the weight of a synapse depends on the activities of neurons other than the two the synapse connects. Eliminating neural activity variables leads to a min-max objective in terms of feedforward and lateral synaptic weight matrices. This formalizes the long-held intuition about the adversarial relationship of Hebbian and anti-Hebbian learning rules.
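The online setting described above can be sketched in a few lines of NumPy. This is an illustrative sketch of a Hebbian/anti-Hebbian similarity matching network, not a verbatim reproduction of the paper's algorithm: the data, learning rate, and initialization below are assumptions chosen for demonstration. Each synaptic update uses only the activities of the two neurons the synapse connects.

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, T = 10, 3, 5000          # input dim, output dim, number of samples
eta = 0.01                      # learning rate (illustrative choice)

# Synthetic inputs dominated by a 3-dimensional subspace plus small noise.
C = rng.standard_normal((n, k))
X = C @ rng.standard_normal((k, T)) + 0.05 * rng.standard_normal((n, T))

W = rng.standard_normal((k, n)) / np.sqrt(n)  # feedforward (Hebbian) weights
M = np.eye(k)                                  # lateral (anti-Hebbian) weights

for t in range(T):
    x = X[:, t]
    y = np.linalg.solve(M, W @ x)          # fixed point of recurrent dynamics: M y = W x
    W += eta * (np.outer(y, x) - W)        # Hebbian: correlate output with input
    M += (eta / 2) * (np.outer(y, y) - M)  # anti-Hebbian: decorrelate outputs

# The learned filters F = M^{-1} W should approximately project the inputs
# onto their top-k principal subspace.
F = np.linalg.solve(M, W)
```

In the min-max reading, the feedforward update on W descends the objective while the lateral update on M ascends it, which is the adversarial rivalry the paper formalizes.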

Derivation of a mixed PSP from similarity matching
Offline PSP algorithm
Linearly stable fixed points of Algorithm 1 correspond to the PSP
Online neural min-max optimization algorithms
Derivation of PSW from constrained similarity matching
Offline PSW algorithm
Linearly stable fixed points of Algorithm 3 correspond to PSW
Online algorithm for PSW
Novel formulations of dimensionality reduction using fractional exponents
Numerical experiments
Discussion
A Proof of strong min-max property for PSP objective
B Taking a derivative using the chain rule
Proof of item 2
Proof of item 3
Proof of item 4
D Proof of strong min-max property for PSW objective
F Autapse-free similarity matching network with asymmetric lateral connectivity

