Abstract

Connectionist research today is inhibited by two kinds of rigidity. First, a rigidity in architecture: the connectivity of most networks is fixed at the start by the programmer, which limits the universality of learning procedures. Second, a mental rigidity: the widespread assumption that a node's activation must be a real number and that activations should be combined using weighted sums. This paper explores the consequences of relaxing these rigidities. I describe a neural network for unsupervised pattern learning. Given an arbitrary environment of input patterns, it grows into a configuration that allows it to represent the high-level regularities in the input. Like the Boltzmann machine, it runs in two phases: observing the environment and simulating the environment. It continually monitors its own performance and grows new nodes as the need for them is identified. Simulating the environment involves repeatedly choosing states to satisfy many constraints. The usual method is to maximize a ‘h...
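For context on the two-phase scheme the abstract compares itself to: the standard Boltzmann machine learns by contrasting unit correlations measured while the network is clamped to the data ("observing") with correlations from free-running samples ("simulating"), nudging weights toward the difference. The sketch below illustrates that generic rule only, not this paper's own growth procedure; the toy patterns, learning rate, and single Gibbs sweep per sample are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_sweep(s, W, rng):
    # One pass of Gibbs sampling: resample each binary unit
    # given the current states of the others.
    for i in range(len(s)):
        p = sigmoid(W[i] @ s - W[i, i] * s[i])  # exclude self-connection
        s[i] = 1.0 if rng.random() < p else 0.0
    return s

# Toy environment: input patterns the network should learn to reproduce.
data = np.array([[1, 1, 0, 0],
                 [0, 0, 1, 1]], dtype=float)
n = data.shape[1]
W = np.zeros((n, n))
lr = 0.05

for epoch in range(200):
    # 'Observing' phase: correlations with units clamped to the data.
    pos = data.T @ data / len(data)
    # 'Simulating' phase: correlations from free-running samples.
    samples = np.array([
        gibbs_sweep(rng.integers(0, 2, n).astype(float), W, rng)
        for _ in range(len(data))
    ])
    neg = samples.T @ samples / len(samples)
    # Contrastive update: move model statistics toward data statistics.
    W += lr * (pos - neg)
    np.fill_diagonal(W, 0.0)  # Boltzmann machines have no self-connections

print(W.shape)
```

Because both correlation matrices are of the form XᵀX, the learned weight matrix stays symmetric throughout, as the Boltzmann machine's energy function requires.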
