Abstract

In a complex-valued phasor Hopfield neural network with a single training pattern, only the rotated patterns are fixed points, while a complex-valued K-state Hopfield neural network has K fixed points. In a quaternionic Hopfield neural network (QHNN) with a continuous activation function, again only the rotated patterns are fixed points. We consider a QHNN with a split activation function, which is a 16-state activation function. This type of QHNN is referred to as a split QHNN (SQHNN). An SQHNN is expected to have 16 fixed points, all of which are global minima. The rate at which the training pattern is recalled from random initial states would thus be expected to be 1/16. However, the rate was higher in our computer simulations. We investigate the reasons for this discrepancy.
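As a rough illustration of the 16-state activation, a split activation function is commonly taken to apply the sign function to each of the four quaternion components independently, so every quaternion is mapped to one of 2^4 = 16 states with components in {+1, -1}. The following is a minimal sketch under that assumption (the convention that a zero component maps to +1 is our own choice, not taken from the paper):

```python
import numpy as np

def split_activation(q):
    """Split (16-state) activation for a quaternion q = (a, b, c, d).

    The sign function is applied to each of the four components
    independently, so the output is one of the 2^4 = 16 quaternions
    whose components lie in {+1, -1}. A zero component is mapped to
    +1 here by convention.
    """
    return np.where(np.asarray(q, dtype=float) >= 0.0, 1.0, -1.0)

# Example: any quaternion is mapped to one of the 16 corner states.
print(split_activation([0.3, -1.2, 0.0, -0.5]))  # [ 1. -1.  1. -1.]
```

Because each component is thresholded separately, any training pattern with components in {+1, -1} has 16 sign-flipped/rotated variants that are candidate fixed points, which is where the expected recall rate of 1/16 comes from.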
