Abstract

This paper proposes a novel self-organising associative neural network model based on kernel memory. The objective is not to present a sophisticated learning scheme with a rigorous mathematical account, but rather to propose a paradigm shift that could potentially answer a number of critical issues in current artificial neural network architectures. In the new memory model, the notion of ‘weights’ between nodes differs fundamentally from that in ordinary neural network models: here the weights simply represent the strengths of the connections between pairs of nodes in the kernel memory, each node realised by a kernel unit. No arduous, iterative tuning of weight parameters is therefore involved, and the neural memory does not inherently suffer from the associated numerical problems. The associative memory is constructed via a simple unsupervised learning algorithm motivated by the traditional Hebbian principle. In the simulation study, both the plasticity and the performance of the novel neural network architecture are examined in a pattern classification context, through single-domain and simultaneous multi-domain classification tasks.
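The mechanism sketched above — kernel units whose pairwise connection strengths are reinforced by simple Hebbian co-activation, with no iterative weight optimisation — can be illustrated as follows. This is a minimal sketch under assumed details, not the paper's actual algorithm: the Gaussian kernel form, the activation threshold `theta`, the Hebbian increment `eta`, and the recruit-a-new-unit rule are all illustrative choices.

```python
import numpy as np


class KernelUnit:
    """A memory node holding a template vector; responds via a Gaussian
    kernel (an assumed form -- the paper does not fix the kernel here)."""

    def __init__(self, template, sigma=1.0):
        self.template = np.asarray(template, dtype=float)
        self.sigma = sigma

    def activate(self, x):
        d = np.linalg.norm(np.asarray(x, dtype=float) - self.template)
        return np.exp(-(d ** 2) / (2.0 * self.sigma ** 2))


class KernelMemory:
    """Kernel units plus scalar link strengths between pairs of units.

    Links are strengthened by a Hebbian rule (units active together get a
    stronger connection); there is no gradient-based weight tuning."""

    def __init__(self, theta=0.5, eta=0.1):
        self.units = []     # list of KernelUnit
        self.links = {}     # (i, j) with i < j -> connection strength
        self.theta = theta  # activation threshold (illustrative value)
        self.eta = eta      # Hebbian increment (illustrative value)

    def present(self, x):
        acts = [u.activate(x) for u in self.units]
        winners = [i for i, a in enumerate(acts) if a >= self.theta]
        if not winners:
            # Plasticity: no unit responds, so recruit a new kernel unit
            # centred on the novel input.
            self.units.append(KernelUnit(x))
            winners = [len(self.units) - 1]
        # Hebbian principle: strengthen the link between every pair of
        # simultaneously active units.
        for i in winners:
            for j in winners:
                if i < j:
                    self.links[(i, j)] = self.links.get((i, j), 0.0) + self.eta
        return winners


# Usage: two distinct inputs recruit two units; an input between them
# co-activates both, so their mutual link is strengthened.
mem = KernelMemory()
mem.present([0.0, 0.0])      # recruits unit 0
mem.present([1.5, 1.5])      # too far from unit 0 -> recruits unit 1
winners = mem.present([0.75, 0.75])  # activates both units
print(len(mem.units), winners, mem.links)
```

Note that "learning" here is only unit recruitment and link increments; no error signal is back-propagated and no weight is numerically optimised, which is the property the abstract emphasises.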
