Abstract

This paper concerns estimating the generalization performance of the multi-output extreme learning machine classifier (M-ELM) in the framework of statistical learning theory. The performance bound is derived under the assumption that the expectation of the extreme learning machine kernel exists. We first show that minimizing the least-squares error is equivalent to minimizing an upper bound of the margin-based error of M-ELM on the training set, which implies that M-ELM attains high confidence after training. We then derive the bound from the margin of M-ELM and the empirical Rademacher complexity. The bound not only gives a theoretical explanation of the good performance of M-ELM, especially in small-sample cases, but also shows that the performance of M-ELM is insensitive to the number of hidden nodes, which is consistent with previous experimental results. The bound also offers the insight that the performance of M-ELM is not significantly affected by the number of classes, which supports the effectiveness of the learning process of M-ELM.
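The least-squares training the abstract refers to can be illustrated with a minimal sketch of an M-ELM classifier: the hidden layer is random and fixed, and only the output weights are fit by least squares against one-hot targets. All names and parameter choices below are illustrative assumptions, not taken from the paper.

```python
# Minimal M-ELM sketch: random sigmoid hidden layer, least-squares
# output weights against one-hot class targets (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

def train_melm(X, y, n_hidden=20, n_classes=None):
    """Fit output weights beta by least squares; W and b stay random."""
    if n_classes is None:
        n_classes = int(y.max()) + 1
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights (fixed)
    b = rng.normal(size=n_hidden)                 # random biases (fixed)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))        # hidden-layer activations
    T = np.eye(n_classes)[y]                      # one-hot target matrix
    beta = np.linalg.pinv(H) @ T                  # least-squares solution
    return W, b, beta

def predict_melm(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.argmax(H @ beta, axis=1)            # class with largest output

# Usage: two well-separated Gaussian blobs in 2-D.
X = np.vstack([rng.normal(-2, 0.3, (30, 2)), rng.normal(2, 0.3, (30, 2))])
y = np.array([0] * 30 + [1] * 30)
W, b, beta = train_melm(X, y)
acc = (predict_melm(X, W, b, beta) == y).mean()
```

Because the hidden layer is never trained, the only optimization is the pseudoinverse solve, which is what makes the margin argument about the least-squares objective applicable.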
