Complex-valued neural networks (CVNNs) have become a powerful modelling tool for complex-valued data processing. Because most of the critical points of CVNNs are saddle points, gradient-based learning algorithms for CVNNs stand a good chance of reaching a global minimum but tend to converge slowly. To address this, we propose a hybrid complex spectral conjugate gradient learning algorithm for the fast training of CVNNs. The proposed algorithm combines the negative gradient, scaled by a Barzilai–Borwein stepsize, with an optimized conjugate term to define a new training direction, thereby providing an accurate approximation of the second-order curvature of the objective function. The complex Wolfe conditions are employed to adaptively determine the optimal training stepsize. Under mild conditions, the descent property of the training direction and the convergence of the proposed algorithm are established theoretically. Simulation results on a number of benchmark complex-valued data processing problems demonstrate the efficiency of the proposed algorithm.
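The sketch below illustrates the general flavour of a spectral conjugate gradient update in the complex domain, not the paper's exact method: it minimizes a simple complex least-squares objective f(w) = ||Aw - b||² using Wirtinger-style gradients, a Barzilai–Borwein spectral scaling, and a Polak–Ribière-style conjugate coefficient restricted to its real part. The Armijo backtracking line search is a simplified stand-in for the complex Wolfe conditions described in the abstract, and all names and parameter values are assumptions for illustration.

```python
# Minimal sketch (assumptions throughout): spectral CG step for a complex
# least-squares problem, NOT the paper's exact algorithm.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((40, 10)) + 1j * rng.standard_normal((40, 10))
b = rng.standard_normal(40) + 1j * rng.standard_normal(40)

def f(w):
    r = A @ w - b
    return np.real(r.conj() @ r)          # real-valued objective ||Aw - b||^2

def grad(w):
    # Wirtinger gradient with respect to conj(w); -grad is steepest descent.
    return A.conj().T @ (A @ w - b)

w = np.zeros(10, dtype=complex)
g = grad(w)
d = -g                                    # first direction: plain negative gradient
for k in range(100):
    slope = 2.0 * np.real(g.conj() @ d)   # directional derivative of f along d
    if slope >= 0:                        # safeguard: fall back to steepest descent
        d = -g
        slope = 2.0 * np.real(g.conj() @ d)
    # Armijo backtracking: a simplification of the complex Wolfe conditions.
    alpha = 1.0
    while f(w + alpha * d) > f(w) + 1e-4 * alpha * slope and alpha > 1e-12:
        alpha *= 0.5
    w_new = w + alpha * d
    g_new = grad(w_new)
    s, y = w_new - w, g_new - g
    # Barzilai-Borwein spectral scaling; real parts keep theta a real scalar.
    theta = np.real(s.conj() @ s) / max(np.real(s.conj() @ y), 1e-12)
    # Polak-Ribiere-style conjugate coefficient, clipped at zero (PR+).
    beta = max(np.real(g_new.conj() @ y) / np.real(g.conj() @ g), 0.0)
    d = -theta * g_new + beta * d         # hybrid spectral CG direction
    w, g = w_new, g_new
    if np.linalg.norm(g) < 1e-8:
        break
print(f"converged after {k + 1} iterations, f(w) = {f(w):.3e}")
```

The key design point this sketch mirrors is that the search direction mixes a spectrally scaled negative gradient with a conjugate memory term, so each step carries cheap second-order curvature information without forming a Hessian.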