Abstract

In this research, we propose a novel fractional gradient descent-based learning algorithm (FGD) for radial basis function neural networks (RBF-NN). The proposed FGD is a convex combination of the conventional and the modified Riemann–Liouville derivative-based fractional gradient descent methods. The FGD method is analyzed for an optimal solution in a system identification problem, and a closed-form Wiener solution of a least-squares problem is obtained. Using the FGD, the weight update rule for the proposed fractional RBF-NN (FRBF-NN) is derived. The FRBF-NN is shown to outperform the conventional RBF-NN on four major estimation problems, namely nonlinear system identification, pattern classification, time series prediction, and function approximation.
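To make the convex-combination idea concrete, the sketch below shows one plausible form of such an update on a toy least-squares problem. It is a hypothetical illustration, not the paper's exact rule: the fractional term uses a common modified Riemann–Liouville-style approximation, grad · |w|^(1−ν) / Γ(2−ν), and a mixing coefficient `alpha` blends it with the conventional gradient; the names `fgd_update`, `nu`, and `alpha` are assumptions.

```python
import math
import numpy as np

def fgd_update(w, grad, nu=0.5, alpha=0.5, lr=0.005):
    """One step of a convex-combination fractional gradient descent update.

    Hedged sketch: the fractional gradient is approximated per-element as
    grad * |w|**(1 - nu) / Gamma(2 - nu), a form common in fractional LMS
    literature; alpha in [0, 1] mixes conventional and fractional terms.
    """
    frac_grad = grad * np.abs(w) ** (1.0 - nu) / math.gamma(2.0 - nu)
    return w - lr * ((1.0 - alpha) * grad + alpha * frac_grad)

# Toy least-squares problem: J(w) = 0.5 * ||X w - y||^2
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true

w = np.zeros(3)
for _ in range(500):
    grad = X.T @ (X @ w - y)  # gradient of the least-squares cost
    w = fgd_update(w, grad)
print(np.round(w, 2))
```

At a zero initialization the fractional term vanishes (|w|^(1−ν) = 0), so early steps are driven by the conventional gradient alone; the fractional contribution grows as the weights move away from zero.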
