Abstract

Complex-valued neural networks, extensions of ordinary neural networks, have attracted the interest of many researchers. In particular, complex-valued Hopfield neural networks (CHNNs) have been used to process multilevel data, such as gray-scale images. CHNNs with Hermitian connection weights always converge under asynchronous update. However, the noise tolerance of CHNNs deteriorates severely as the resolution increases, and noise tolerance is one of the most serious problems for CHNNs. It is known that rotational invariance reduces noise tolerance. In this brief, we propose symmetric CHNNs (SCHNNs), which have symmetric connection weights. We define their energy function and prove that SCHNNs always converge. In addition, we show through computer simulations that SCHNNs improve noise tolerance, and we explain this improvement from the standpoint of rotational invariance.
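
The abstract does not give the network's equations, so the following is only an illustrative sketch of the kind of model it describes: a multistate CHNN whose neuron states are K-th roots of unity, with a Hebbian-style Hermitian weight matrix (the classical convergent case mentioned above) and asynchronous phase-quantizing updates. The resolution `K`, size `N`, learning rule, and corruption scheme are all assumptions for illustration, not the paper's actual construction; the proposed SCHNN would instead enforce a complex-symmetric constraint `W == W.T`.

```python
import numpy as np

K = 8   # phase resolution: number of states per neuron (assumed)
N = 16  # number of neurons (assumed)
rng = np.random.default_rng(0)

# Each neuron state is a K-th root of unity (multilevel, e.g. gray levels).
states = np.exp(2j * np.pi * np.arange(K) / K)
pattern = rng.choice(states, size=N)  # one hypothetical stored pattern

# Hebbian-style Hermitian weights: W == conj(W.T), the classical CHNN case
# that is known to converge under asynchronous update. A symmetric (SCHNN)
# variant would instead impose W == W.T.
W = np.outer(pattern, np.conj(pattern))
np.fill_diagonal(W, 0)

def quantize(z):
    """Snap a complex number to the nearest K-th root of unity."""
    k = int(np.round(np.angle(z) * K / (2 * np.pi))) % K
    return states[k]

def async_update(x, W, sweeps=10):
    """Asynchronous dynamics: neurons are updated one at a time."""
    x = x.copy()
    for _ in range(sweeps):
        for j in rng.permutation(len(x)):
            h = W[j] @ x          # local field of neuron j
            if h != 0:
                x[j] = quantize(h)
    return x

# Corrupt a few neurons and let the network relax back to the pattern.
noisy = pattern.copy()
noisy[:3] = states[0]
recovered = async_update(noisy, W)
```

With a single stored pattern and mild corruption, the asynchronous dynamics above settle back onto the stored pattern, which is the fixed-point behavior the abstract's convergence claim concerns.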
