Abstract

Artificial neural networks in general, and radial basis function (RBF) networks in particular, are known for high accuracy in function approximation, nonlinear system identification, and pattern classification problems; however, they pose numerous challenges regarding the optimality of the parameters involved. This paper proposes the use of the classical Nelder–Mead simplex method to optimize the parameters of the activation functions implicit in the design of a radial basis function network. The key advantage of the Nelder–Mead simplex method is that it provides a simple yet effective derivative-free approach to the numerical optimization of scalar variables, such as the spread and learning rate of the kernels of the RBF network. We thus present a novel hybrid algorithm in which the weights of the neurons are updated using a gradient descent approach, while the spread and learning rate are updated via the Nelder–Mead simplex method. In the results, the efficiency of the proposed algorithm is statistically compared with existing algorithms in different applications, such as the classification of digital signals in a noise-limited wireless communication system, the synthesis of a microstrip patch antenna, and a curve-fitting problem. Lastly, we consider a two-variable function approximation problem to pedagogically illustrate the contrasting features of the hybrid algorithm, thereby pointing toward its potential use in engineering design problems.
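The hybrid scheme described above can be sketched roughly as a two-level optimization: an inner loop that trains the RBF output weights by gradient descent, and an outer derivative-free Nelder–Mead search over the scalar spread and learning rate. The sketch below is a minimal illustration on a one-dimensional curve-fitting task, not the authors' implementation; the Gaussian kernel form, fixed centres, iteration counts, and all variable names are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative toy problem (assumption): fit y = sin(3x) with a Gaussian RBF net.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (80, 1))            # training inputs
y = np.sin(3 * X[:, 0])                    # target curve
centers = np.linspace(-1, 1, 10)[:, None]  # fixed RBF centres (assumed)

def rbf_design(X, spread):
    """Gaussian kernel matrix: Phi[i, j] = exp(-||x_i - c_j||^2 / (2*spread^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * spread ** 2))

def train_mse(params):
    """Inner loop: gradient descent on output weights; returns final MSE."""
    spread, lr = np.abs(params)            # keep both scalars positive
    Phi = rbf_design(X, spread + 1e-6)
    w = np.zeros(centers.shape[0])
    for _ in range(300):
        err = Phi @ w - y
        if not np.isfinite(err).all():     # diverged -> heavy penalty
            return 1e6
        w -= lr * (Phi.T @ err) / len(y)   # gradient of 0.5 * mean(err^2)
    return float(np.mean((Phi @ w - y) ** 2))

# Outer loop: derivative-free Nelder-Mead over (spread, learning rate).
res = minimize(train_mse, x0=[0.5, 0.1], method="Nelder-Mead")
print("best (spread, lr):", np.abs(res.x), "final MSE:", res.fun)
```

Because the outer objective (final training error as a function of spread and learning rate) is noisy and has no convenient analytic gradient, a simplex search that needs only function evaluations is a natural fit, which is the motivation the abstract gives for choosing Nelder–Mead.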
