Abstract

In ensemble learning, how to select and how to combine the candidate classifiers are two key issues that dramatically influence the performance of the ensemble system. The random vector functional link network (RVFL) without direct input-to-output links is a suitable base classifier for ensemble systems because of its fast learning speed, simple structure and good generalization performance. In this paper, to obtain a more compact ensemble system with improved convergence performance, an improved ensemble of RVFLs based on attractive and repulsive particle swarm optimization (ARPSO) with a double optimization strategy is proposed. In the proposed method, ARPSO is applied both to select and to combine the candidate RVFLs. When selecting the optimal base RVFLs, ARPSO considers both the convergence accuracy on the validation data and the diversity of the candidate ensemble system. When combining the RVFLs, the ensemble weights corresponding to the base RVFLs are initialized by the minimum-norm least-squares method and then further optimized by ARPSO. Finally, a few redundant RVFLs are pruned, yielding a more compact ensemble of RVFLs. Moreover, this paper presents a theoretical analysis and justification of how to prune base classifiers for classification problems, and proposes a simple, practically feasible strategy for pruning redundant base classifiers for both classification and regression problems. Since the double optimization is performed on the basis of the single optimization, the ensemble of RVFLs built by the proposed method outperforms those built by single-optimization methods. Experimental results on function approximation and classification problems verify that the proposed method improves convergence accuracy while reducing the complexity of the ensemble system.
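The minimum-norm least-squares initialization of the ensemble weights described above can be sketched as follows. This is an illustrative reconstruction, not the paper's code: `H` stands for the base models' outputs on the validation set (one column per base RVFL) and `t` for the targets, both filled with stand-in random data.

```python
import numpy as np

# Sketch: initialize ensemble weights by the minimum-norm least-squares
# solution via the Moore-Penrose pseudo-inverse, as the abstract describes.
rng = np.random.default_rng(0)
n_samples, n_base = 50, 5
H = rng.standard_normal((n_samples, n_base))  # base-RVFL outputs (stand-in data)
t = rng.standard_normal(n_samples)            # validation targets (stand-in data)

# beta = H^+ t is the smallest-norm weight vector minimizing ||H beta - t||;
# in the proposed method this vector would then be refined by ARPSO.
beta = np.linalg.pinv(H) @ t
ensemble_pred = H @ beta
```

For a full-column-rank `H` this coincides with the ordinary least-squares solution; the pseudo-inverse form additionally picks the minimum-norm solution when `H` is rank-deficient.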

Highlights

  • A neural network ensemble (NNE) is a learning mechanism in which a collection of a finite number of neural networks is trained for the same task [1]

  • Based on the E-ARPSOELM, we proposed a diversity-guided ensemble of extreme learning machines (ELM) based on ARPSO (DGEELMBARPSO) [38], which uses ARPSO to select base ELMs from the initial ELM pool by considering both the classification accuracy and the diversity of the ensemble system represented by each particle

  • To verify the effectiveness of the proposed approach, the DO-EELM is compared with the ensemble of ELM (E-ELM) [18], the ensemble of online sequential extreme learning machines (EOS-ELM) [20], E-PSOELM, E-ARPSOELM [37] and DGEELMBARPSO [38] on seven datasets
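The diversity-guided selection in the highlights scores a candidate ensemble by both its accuracy and its diversity. A minimal sketch of such a fitness function, assuming majority voting and pairwise disagreement as the diversity measure (the weighting factor `alpha` is an illustrative assumption, not a value from the paper):

```python
import numpy as np

def pairwise_disagreement(preds):
    """Mean fraction of samples on which each pair of base models disagrees."""
    n = len(preds)
    total, pairs = 0.0, 0
    for i in range(n):
        for j in range(i + 1, n):
            total += np.mean(preds[i] != preds[j])
            pairs += 1
    return total / pairs if pairs else 0.0

def ensemble_fitness(preds, targets, alpha=0.5):
    """Accuracy of the majority vote plus alpha times the diversity term.

    preds: int array of shape (n_models, n_samples) with predicted labels.
    """
    votes = np.apply_along_axis(lambda c: np.bincount(c).argmax(), 0, preds)
    acc = np.mean(votes == targets)
    return acc + alpha * pairwise_disagreement(preds)
```

In the selection step, each ARPSO particle would encode a subset of the candidate pool, and a fitness of this shape would be evaluated on the validation data.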


Introduction

A neural network ensemble (NNE) is a learning mechanism in which a collection of a finite number of neural networks is trained for the same task [1]. A modified ARPSO is employed to select base ELMs from the initial ELM pool by considering the convergence accuracy and the diversity of the candidate ensemble system, as in the DGEELMBARPSO method. To obtain the optimal ensemble weights, the initial weights of the selected base ELMs are determined by the minimum-norm LS method and then further optimized by ARPSO.
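The ARPSO optimizer used throughout the method can be sketched as a standard PSO whose social and cognitive pull flips to repulsion when swarm diversity collapses, and back to attraction once diversity recovers. The thresholds and coefficients below are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def arpso_minimize(f, dim, n_particles=20, iters=100,
                   w=0.7, c1=1.5, c2=1.5, d_low=1e-3, d_high=0.25, seed=0):
    """Minimal attractive-repulsive PSO sketch: the velocity update's
    attraction terms change sign (repulsion) when mean swarm diversity
    drops below d_low, and revert once it exceeds d_high."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.array([f(p) for p in x])
    gbest = pbest[pbest_val.argmin()].copy()
    sign = 1.0                                   # +1 attract, -1 repel
    for _ in range(iters):
        diversity = np.mean(np.linalg.norm(x - x.mean(axis=0), axis=1))
        if diversity < d_low:
            sign = -1.0
        elif diversity > d_high:
            sign = 1.0
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + sign * (c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x))
        x = x + v
        vals = np.array([f(p) for p in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        if pbest_val.min() < f(gbest):
            gbest = pbest[pbest_val.argmin()].copy()
    return gbest, f(gbest)

# Example: minimizing the sphere function in 3 dimensions.
best, val = arpso_minimize(lambda z: float(np.sum(z * z)), dim=3)
```

In the weight-refinement step, `f` would be the validation error of the ensemble under a candidate weight vector, starting the swarm near the minimum-norm LS solution.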

