Abstract

In this paper, a new class of learning models, the additive radial basis function networks (ARBFNs), is proposed for general nonlinear regression problems. This class of learning machines combines the radial basis function networks (RBFNs) commonly used in general machine learning with the additive models (AMs) frequently encountered in semiparametric regression. In statistical regression theory, the AM is a good compromise between the linear parametric model and the fully nonparametric model. Simulation results show that, for a given learning problem, ARBFNs usually need fewer hidden nodes than RBFNs to reach the same level of accuracy.

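The abstract does not spell out the model details, but an additive RBF network can plausibly be read as a sum of per-coordinate one-dimensional RBF expansions, f(x) = b + Σ_j Σ_k w_{jk} φ(|x_j − c_{jk}|), with the output-layer weights fitted by least squares. The sketch below illustrates that reading on a toy additive target; the Gaussian kernel, evenly spaced centers, fixed width, and ridge-regularized fit are illustrative assumptions, not the authors' method.

```python
import numpy as np

def gaussian_rbf(r, width):
    """1-D Gaussian radial basis function."""
    return np.exp(-(r / width) ** 2)

def arbf_design_matrix(X, centers, width):
    """Design matrix for an additive RBF model: a bias column plus one
    block of 1-D RBF features per input coordinate.
    `centers` holds one array of scalar centers per coordinate."""
    n, d = X.shape
    blocks = [np.ones((n, 1))]  # bias term
    for j in range(d):
        # Pairwise 1-D distances between x_j and that coordinate's centers.
        r = np.abs(X[:, j:j + 1] - centers[j][None, :])
        blocks.append(gaussian_rbf(r, width))
    return np.hstack(blocks)

# Toy additive target: y = sin(3 x1) + x2**2 + noise.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = np.sin(3 * X[:, 0]) + X[:, 1] ** 2 + 0.05 * rng.standard_normal(200)

# Hypothetical choices: 8 evenly spaced centers per coordinate, fixed width.
centers = [np.linspace(-1, 1, 8) for _ in range(X.shape[1])]
width = 0.4
Phi = arbf_design_matrix(X, centers, width)

# Ridge-regularized least-squares fit of the output-layer weights.
lam = 1e-3
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(Phi.shape[1]), Phi.T @ y)

y_hat = Phi @ w
print("training RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```

In this reading, the number of hidden nodes grows with the number of centers per coordinate (here 2 × 8) rather than with a grid over the full input space, which is consistent with the abstract's claim that ARBFNs tend to need fewer hidden nodes than ordinary RBFNs for comparable accuracy.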