Abstract

Complex-valued associative memories (CAMs) are among the most promising neural-network-based associative memory models. However, their low noise tolerance is often a serious problem. A projection learning rule with large constant terms improves the noise tolerance of CAMs, but the projection rule can be applied only to fully connected CAMs. In this paper, we propose a gradient descent learning rule with large constant terms that is not restricted by network topology. Large constant terms are realized by regularizing the connection weights. Computer simulations show that the proposed learning algorithm improves noise tolerance. © 2016 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc.
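The idea sketched in the abstract can be illustrated as follows, under stated assumptions: a complex-valued network with unit-magnitude phasor neurons is trained by gradient descent so that each stored pattern is a fixed point of the recall dynamics, while an L2 penalty on the off-diagonal connection weights shrinks them, leaving the diagonal (constant) terms relatively large. The loss, learning rate, and state quantization below are illustrative choices, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
K_states = 4  # number of phasor states per neuron (illustrative choice)

def train_cam(patterns, lam=0.01, eta=0.05, epochs=500):
    """Gradient-descent learning for a complex-valued associative memory.

    Minimizes sum_p ||x_p - W x_p||^2 plus an L2 penalty on the
    off-diagonal weights; regularizing the off-diagonal connections
    keeps the constant (diagonal) terms relatively large.
    Illustrative sketch only.
    """
    n = patterns.shape[1]
    W = np.zeros((n, n), dtype=complex)
    off = ~np.eye(n, dtype=bool)
    for _ in range(epochs):
        for x in patterns:
            e = x - W @ x                     # recall error for this pattern
            W += eta * np.outer(e, x.conj())  # Wirtinger (complex) gradient step
        W[off] -= eta * lam * W[off]          # L2 shrinkage on off-diagonal weights
    return W

def recall(W, x):
    """One synchronous update, quantizing each neuron to the nearest phasor state."""
    h = W @ x
    k = np.round(np.angle(h) * K_states / (2 * np.pi)) % K_states
    return np.exp(2j * np.pi * k / K_states)

# Store four random 4-state phasor patterns of dimension n = 16
n, P = 16, 4
phases = rng.integers(0, K_states, size=(P, n))
patterns = np.exp(2j * np.pi * phases / K_states)

W = train_cam(patterns)

# Stored patterns should be fixed points of the recall dynamics
print(all(np.allclose(recall(W, x), x) for x in patterns))
```

Because the update rule only ever touches `W` through the recall error and the weight penalty, the same training loop applies to sparsely connected networks by masking the disallowed weights, which is the topology freedom the abstract contrasts with the projection rule.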
