Abstract
This paper proposes supervised learning algorithms based on gradient descent for training reformulated radial basis function (RBF) neural networks. Such RBF models employ radial basis functions whose form is determined by admissible generator functions; in particular, Gaussian radial basis functions arise from exponential generator functions. A sensitivity analysis provides the basis for selecting generator functions by investigating the effect of linear, exponential, and logarithmic generator functions on gradient descent learning. Experiments involving reformulated RBF networks indicate that the proposed gradient descent algorithms achieve fast learning and very satisfactory function approximation.
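The abstract does not include an implementation. As a rough illustration of the generator-function idea, the sketch below assumes the parameterization common in the reformulated-RBF literature, phi(v) = g(v)^(1/(1-m)) with m > 1, under which the exponential generator g(v) = exp(beta*v) reduces to the Gaussian basis function. All function names, hyperparameters, and training details here are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

# Assumed parameterization: phi(v) = g(v)^(1/(1-m)), m > 1, v = ||x - c||^2.
# Exponential generator g(v) = exp(beta*v)  ->  phi(v) = exp(-beta*v/(m-1)) (Gaussian).
def make_gaussian(beta=1.0, m=3.0):
    s = beta / (m - 1.0)
    phi = lambda v: np.exp(-s * v)
    dphi = lambda v: -s * np.exp(-s * v)
    return phi, dphi

# Linear generator g(v) = a*v + b  ->  phi(v) = (a*v + b)^(1/(1-m)).
def make_linear(a=1.0, b=1.0, m=3.0):
    p = 1.0 / (1.0 - m)
    phi = lambda v: (a * v + b) ** p
    dphi = lambda v: a * p * (a * v + b) ** (p - 1.0)
    return phi, dphi

def train_rbf(X, y, n_centers=10, generator=make_gaussian, lr=0.05, epochs=2000, seed=0):
    """Full-batch gradient descent on squared error, adapting output weights
    and prototype centers jointly (one common reading of gradient descent
    training for RBF networks; the details are illustrative)."""
    rng = np.random.default_rng(seed)
    phi, dphi = generator()
    C = X[rng.choice(len(X), n_centers, replace=False)].copy()  # centers (J, d)
    w = rng.normal(scale=0.1, size=n_centers)                   # output weights (J,)
    for _ in range(epochs):
        diff = X[:, None, :] - C[None, :, :]      # (N, J, d)
        v = np.sum(diff ** 2, axis=-1)            # squared distances (N, J)
        H = phi(v)                                # hidden-layer responses (N, J)
        err = H @ w - y                           # residuals (N,)
        grad_w = H.T @ err / len(X)
        # Chain rule through v: dv/dC_j = -2 * (x_i - c_j)
        grad_C = np.einsum('i,ij,ijk->jk', err, dphi(v) * w[None, :], -2.0 * diff) / len(X)
        w -= lr * grad_w
        C -= lr * grad_C
    return w, C, phi

# Toy usage: approximate sin(x) on [0, 2*pi] with Gaussian basis functions.
X = np.linspace(0.0, 2.0 * np.pi, 100)[:, None]
y = np.sin(X).ravel()
w, C, phi = train_rbf(X, y, n_centers=8, generator=make_gaussian)
```

Swapping `generator=make_linear` into the same training loop shows how a different admissible generator function changes only the basis function and its derivative, leaving the gradient descent updates otherwise unchanged.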