Abstract

A Hopfield neural network (HNN) is a neural network model with mutual connections. A quaternionic HNN (QHNN) is an extension of the HNN to quaternions, and several QHNN models have been proposed. The hybrid QHNN exploits the non-commutativity of quaternion multiplication. Hybrid QHNNs trained with the Hebbian learning rule have been shown to outperform QHNNs in noise tolerance. The Hebbian rule, however, is a primitive learning algorithm, and more advanced learning algorithms need to be studied. The projection rule is one of the few promising learning algorithms, but it is restricted by network topology and cannot be applied to hybrid QHNNs. In the present work, we propose a gradient descent learning rule that can be applied to hybrid QHNNs and compare its performance with that of the projection rule. In computer simulations, gradient descent learning outperformed the projection rule in noise tolerance. For small training-pattern sets, hybrid QHNNs with gradient descent learning performed best, whereas QHNNs performed best for large training-pattern sets. In future work, gradient descent learning will be extended to QHNNs with different network topologies and activation functions.
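
As a rough illustrative sketch only, and not the authors' formulation, the Python fragment below shows one common way a quaternionic Hopfield neuron update can be written, assuming quaternions stored as 4-component NumPy vectors, a component-wise sign activation, and left multiplication by the weights. The helper names qmul, qsign, and update_neuron are hypothetical, and the hybrid model's mix of left and right multiplication is not reproduced here.

import numpy as np

def qmul(a, b):
    # Hamilton product of quaternions a = (a0, a1, a2, a3) and b = (b0, b1, b2, b3).
    a0, a1, a2, a3 = a
    b0, b1, b2, b3 = b
    return np.array([
        a0*b0 - a1*b1 - a2*b2 - a3*b3,
        a0*b1 + a1*b0 + a2*b3 - a3*b2,
        a0*b2 - a1*b3 + a2*b0 + a3*b1,
        a0*b3 + a1*b2 - a2*b1 + a3*b0,
    ])

def qsign(q):
    # Component-wise sign activation; zero components are mapped to +1 so that
    # neuron states stay in {-1, +1}^4. Other activation functions appear in
    # the literature.
    return np.sign(q) + (q == 0)

def update_neuron(states, W, i):
    # Weighted input to neuron i using left multiplication by the quaternionic
    # weights W[i, j]; states has shape (N, 4) and W has shape (N, N, 4).
    s = np.zeros(4)
    for j in range(len(states)):
        if j != i:
            s = s + qmul(W[i, j], states[j])
    return qsign(s)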
