Abstract

For optimization algorithms of fully complex-valued neural networks, a complex-valued stepsize helps the training escape from saddle points. In this paper, an adaptive orthogonal gradient descent algorithm with a complex-valued stepsize is proposed for the efficient training of fully complex-valued neural networks. The basic idea is that, at each iteration, the search direction is constructed as a combination of two orthogonal gradient directions by using the algebraic representation of the complex-valued stepsize. It is then shown that a decoupling method facilitates the determination of a suitable complex-valued stepsize, so that the computational complexity of the training process is greatly reduced. Experiments on pattern classification, nonlinear channel equalization, and signal prediction confirm the advantages of the proposed algorithm.
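
To make the orthogonal decomposition concrete, the following minimal sketch applies gradient descent with a complex-valued stepsize mu = alpha + i*beta to a toy complex least-squares problem. This is not the paper's algorithm: the quadratic loss, the matrix `A`, the vector `b`, and the fixed value of `mu` are illustrative assumptions, and the paper's adaptive decoupling method for choosing the stepsize is not reproduced. The point is only that the update -mu*g splits algebraically into -alpha*g - beta*(i*g), and since g and i*g are orthogonal when C^n is identified with R^(2n), the search direction is a combination of two orthogonal gradient directions.

```python
import numpy as np

# Toy illustration (assumed setup, not the paper's experiments):
# minimize the real-valued loss L(w) = ||A w - b||^2 over complex w
# using a fixed complex stepsize mu = alpha + 1j * beta.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5)) + 1j * rng.standard_normal((20, 5))
b = rng.standard_normal(20) + 1j * rng.standard_normal(20)

def loss(w):
    r = A @ w - b
    return np.real(np.vdot(r, r))  # r^H r, real and nonnegative

def grad(w):
    # Wirtinger (conjugate) gradient of ||A w - b||^2 with respect to w*
    return A.conj().T @ (A @ w - b)

w = np.zeros(5, dtype=complex)
mu = 0.01 + 0.005j  # complex stepsize; alpha and beta chosen ad hoc here

for step in range(200):
    g = grad(w)
    # Algebraic decomposition of the complex-stepsize update:
    #   w - mu * g  ==  w - mu.real * g - mu.imag * (1j * g),
    # i.e. a combination of the two orthogonal directions -g and -1j*g.
    w = w - mu * g

print(f"final loss: {loss(w):.6f}")
```

Here the real part of `mu` controls descent along the Wirtinger gradient, while the imaginary part moves along the orthogonal direction i*g, which is what allows the iterate to step off a saddle point that traps a purely real stepsize.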
