Abstract

APL functions provide concise, array-oriented expressions for describing connection-based computing and give insight into a variety of neural-network-like computer models. The inherently parallel operation of collective or memory-based computing is decomposed into several command-like APL simulation functions. These functions serve to: (a) generate and configure initial values and structures for network layers, connection weights, and thresholds; (b) strategically sample training sets to provide acyclic sequences of patterns; (c) update weights for memorizing or learning from the training set; (d) locate units effective for storing and retrieving memories; and (e) use a cue or noise-corrupted pattern to recall or reconstruct memory associations established during training. APL functions are also used to accommodate data values missing from the training patterns. Other simply described functions generate random fixed reference addresses and initialize modifiable weight values, and simple APL expressions generate test patterns with specified levels of noise contamination.
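
By way of illustration only, the kind of one-line expressions the abstract refers to might look like the following Dyalog-style APL sketch. The array sizes, noise level, and Hamming radius below are assumed for the example and are not taken from the paper, and the paper's actual functions may differ.

    ⍝ Illustrative sketch only; parameters are hypothetical, not from the paper.
    M←64 ⋄ N←256 ⋄ T←100 ⋄ P←0.1 ⋄ R←112   ⍝ units, pattern length, training-set size, noise level, radius
    A ← ¯1 + ? M N ⍴ 2                      ⍝ M random fixed binary reference addresses of length N
    W ← ¯0.5 + ? M N ⍴ 0                    ⍝ modifiable weights, uniform on (¯0.5,0.5); ?0 yields a float in (0,1)
    ORDER ← T ? T                           ⍝ acyclic presentation order: deal T pattern indices without replacement
    PAT ← ¯1 + ? N ⍴ 2                      ⍝ one binary training pattern
    NOISY ← PAT ≠ P > ? N ⍴ 0               ⍝ test cue: flip each bit of PAT with probability P
    SEL ← R ≥ A +.≠ NOISY                   ⍝ units whose reference address lies within Hamming radius R of the cue
    OUT ← 0 < SEL +.× W                     ⍝ recall sketch: threshold the summed weights of the selected units

Each expression operates on whole arrays at once, which is the sense in which the abstract describes the computation as inherently parallel.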

