Abstract

The neural network discussed in this paper is a self-trained network for LArge Memory STorage And Retrieval (LAMSTAR) of information. It employs features such as forgetting, interpolation, extrapolation, and filtering to enhance processing and memory efficiency and to allow zooming in and out of memories. The network is based on modified SOM (Self-Organizing Map) modules and on arrays of link-weight vectors that channel information vertically and horizontally throughout the network. Direct feedback and up/down counting serve to set these link weights; a higher-hierarchy performance-evaluation element also provides high-level interrupts. Pseudo-random modulation of the link weights prevents dogmatic network behavior. The input word is a coded vector composed of several sub-words (sub-vectors). These features facilitate very rapid, intelligent retrieval and diagnosis over very large memories, giving the network the properties of a self-adaptive expert system with continuously adjustable weights. The authors have applied the network to simple medical-diagnosis and fault-detection problems.
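To make the description above concrete, the following is a minimal sketch, not the authors' implementation, of how an input word split into sub-word vectors might be routed to one SOM module per sub-word, with link weights between winning neurons set by up/down counting (reward/punishment) and gradually decayed to model forgetting. All class names, dimensions, and rate parameters are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch of LAMSTAR-style processing (illustrative only):
# each sub-word of the input word goes to its own SOM module; link weights
# between co-winning neurons across modules are counted up or down and
# slowly decay to model forgetting.

rng = np.random.default_rng(0)

class SOMModule:
    """One self-organizing module storing neurons for a single sub-word."""
    def __init__(self, n_neurons, dim):
        self.w = rng.normal(size=(n_neurons, dim))

    def winner(self, subword):
        # Winning neuron = stored pattern closest to the incoming sub-word.
        return int(np.argmin(np.linalg.norm(self.w - subword, axis=1)))

class LamstarSketch:
    def __init__(self, subword_dims, n_neurons=8,
                 reward=0.05, punish=0.05, forget=0.995):
        self.modules = [SOMModule(n_neurons, d) for d in subword_dims]
        self.links = {}          # link weights between (module, neuron) pairs
        self.reward, self.punish, self.forget = reward, punish, forget

    def step(self, subwords, correct=True):
        winners = [m.winner(x) for m, x in zip(self.modules, subwords)]
        # Forgetting: every link weight decays slightly at each step.
        for key in self.links:
            self.links[key] *= self.forget
        # Up/down counting: links between co-winning neurons are rewarded
        # when the outcome was correct, punished otherwise.
        for i in range(len(winners)):
            for j in range(i + 1, len(winners)):
                key = (i, winners[i], j, winners[j])
                delta = self.reward if correct else -self.punish
                self.links[key] = self.links.get(key, 0.0) + delta
        return winners

# Usage: an input word made of two sub-words (e.g., a symptom vector and a test vector).
net = LamstarSketch(subword_dims=[3, 2])
word = [np.array([0.2, 0.9, 0.1]), np.array([1.0, 0.0])]
print(net.step(word, correct=True))
```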
