Abstract

Complex-valued neural networks are learning models that can handle problems in the complex domain. The fully complex extreme learning machine (CELM) trains much faster than the complex backpropagation (CBP) scheme, but at the cost of requiring more hidden nodes to achieve comparable performance. An upper-layer-solution-aware algorithm has been proposed for training single-hidden-layer feedforward neural networks, and it performs much better than its counterparts, pseudo-inverse learning (PIL)/extreme learning machine and gradient-descent-based backpropagation neural networks. Consequently, two challenges remain: 1) how to combine the advantages of CBP and CELM into a novel complex learning algorithm, and 2) what the convergence behavior of such an algorithm is. In this article, an input-weights-dependent complex-valued (IWDCV) learning algorithm based on Wirtinger calculus is proposed, which effectively resolves the nonanalyticity of common activation functions during network training. In addition, the monotonic decrease of the error function and the deterministic convergence of the proposed model are rigorously proved, which theoretically guarantee the efficiency and effectiveness of the given model, IWDCV. Finally, a variety of simulations on real- and complex-valued problems demonstrate the competitive performance of the proposed algorithm and support the theoretical results.
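For context, the nonanalyticity issue the abstract refers to arises because typical activation functions $f(z)$, with $z = x + iy$, are not holomorphic, so the ordinary complex derivative does not exist. Wirtinger calculus sidesteps this by treating $z$ and its conjugate $\bar{z}$ as independent variables; the standard identities (a textbook fact, not specific to this paper's derivation) are

\[
\frac{\partial f}{\partial z} = \frac{1}{2}\left(\frac{\partial f}{\partial x} - i\,\frac{\partial f}{\partial y}\right),
\qquad
\frac{\partial f}{\partial \bar{z}} = \frac{1}{2}\left(\frac{\partial f}{\partial x} + i\,\frac{\partial f}{\partial y}\right),
\]

which are well defined whenever $f$ is real-differentiable, even when it is not analytic.

For the ELM side of the comparison, the sketch below shows the generic one-shot recipe that makes (C)ELM fast: fix random complex input weights, then solve for the output weights with a pseudo-inverse, $\beta = H^{\dagger}T$. This is a minimal illustration of the baseline the abstract contrasts against, not the authors' IWDCV algorithm; the function names and the choice of tanh activation are assumptions for illustration.

```python
import numpy as np

def celm_fit(X, T, n_hidden, seed=0):
    """Illustrative complex ELM: random hidden layer, pseudo-inverse readout."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Fixed random complex input weights and biases (never trained in ELM).
    W = rng.standard_normal((d, n_hidden)) + 1j * rng.standard_normal((d, n_hidden))
    b = rng.standard_normal(n_hidden) + 1j * rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)           # complex hidden-layer output matrix
    beta = np.linalg.pinv(H) @ T     # one-shot least-squares output weights
    return W, b, beta

def celm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

Because only the linear readout is fit, training reduces to a single pseudo-inverse solve, which is why ELM-style methods are fast but typically need more hidden nodes than gradient-trained (CBP-style) networks to reach comparable accuracy.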
