Abstract
A new learning algorithm called extreme learning machine (ELM) has recently been proposed for single-hidden-layer feedforward neural networks (SLFNs) to achieve good generalization performance at extremely fast learning speed. ELM randomly chooses the input weights and analytically determines the output weights of SLFNs. This paper shows that ELM can be extended to the radial basis function (RBF) network case, allowing the centers and impact widths of the RBF kernels to be randomly generated and the output weights to be calculated analytically rather than tuned iteratively. Interestingly, the experimental results show that the ELM algorithm for RBF networks can complete learning at extremely fast speed and produce generalization performance very close to that of SVM on many artificial and real-world benchmark function approximation and classification problems. Since ELM does not require validation or human-tuned parameters for a given network architecture, it is easy to use.
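The procedure the abstract describes can be sketched in a few lines: draw RBF centers and impact widths at random, build the hidden-layer output matrix, and solve for the output weights in closed form via the Moore-Penrose pseudoinverse. The following is a minimal illustrative sketch, not the paper's reference implementation; the function names, the choice of sampling centers from the training data, and the width range are assumptions for the example.

```python
import numpy as np

def elm_rbf_fit(X, T, n_hidden=30, rng=None):
    """Train an ELM-style RBF network: random centers/widths, analytic output weights."""
    rng = np.random.default_rng(rng)
    # Randomly generated centers (here sampled from the training inputs -- an assumption)
    centers = X[rng.choice(len(X), size=n_hidden, replace=False)]
    # Randomly generated impact widths (range is an illustrative assumption)
    widths = rng.uniform(0.5, 2.0, size=n_hidden)
    # Hidden-layer output matrix H: Gaussian kernel responses of each sample to each center
    sq_dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    H = np.exp(-sq_dist / widths ** 2)
    # Output weights solved analytically via the Moore-Penrose pseudoinverse, no iteration
    beta = np.linalg.pinv(H) @ T
    return centers, widths, beta

def elm_rbf_predict(X, centers, widths, beta):
    """Evaluate the trained RBF network on new inputs."""
    sq_dist = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    H = np.exp(-sq_dist / widths ** 2)
    return H @ beta
```

The only training cost is one pseudoinverse of the hidden-layer matrix, which is what makes the learning speed "extremely fast" relative to iteratively tuning centers, widths, and weights.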