Abstract

The existing inverse-free incremental learning algorithm for the regularized extreme learning machine (ELM) is based on an inverse-free algorithm that updates the regularized pseudo-inverse, deduced from an inverse-free recursive algorithm for updating the inverse of a Hermitian matrix. Before that recursive algorithm was applied in the existing inverse-free ELM, its improved version had already been utilized in the previous literature. From the improved recursive algorithm for updating the inverse, we deduce a more efficient inverse-free algorithm to update the regularized pseudo-inverse, from which we propose an inverse-free incremental ELM algorithm based on the regularized pseudo-inverse. Usually the above-mentioned inverse is smaller than the regularized pseudo-inverse, but on processor units with limited precision, the recursive algorithm for updating the inverse may introduce numerical instabilities. Thus, to further reduce the computational complexity, we also propose an inverse-free incremental ELM algorithm based on the ${\mathrm {LDL}}^{T}$ factors of the inverse, where the ${\mathrm {LDL}}^{T}$ factors are updated iteratively by the inverse ${\mathrm {LDL}}^{T}$ factorization. With respect to the existing inverse-free ELM, the proposed ELM based on the regularized pseudo-inverse and that based on the ${\mathrm {LDL}}^{T}$ factors are expected to require only $\frac {3}{8+M}$ and $\frac {1}{8+M}$ of its complexity, respectively, where $M$ is the number of output nodes. The numerical experiments show that both proposed ELM algorithms significantly accelerate the existing inverse-free ELM, with a speedup in training time of no less than 1.41. On the Modified National Institute of Standards and Technology (MNIST) dataset, the proposed algorithm based on ${\mathrm {LDL}}^{T}$ factors is usually much faster than that based on the regularized pseudo-inverse.
On the other hand, in the numerical experiments, the original ELM, the existing inverse-free ELM, and the two proposed ELM algorithms achieve the same regression and classification performance and produce the same solutions, i.e., the same output weights and the same output sequence for the same input sequence.
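As a concrete illustration of the solution that all of these variants compute, the following NumPy sketch obtains the regularized ELM output weights once via the regularized pseudo-inverse and once via ${\mathrm {LDL}}^{T}$ factors of the same Hermitian matrix, and checks that both routes agree. The sizes, the tanh activation, and the derivation of the ${\mathrm {LDL}}^{T}$ factors from a Cholesky factor are our illustrative assumptions, not details taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes (illustrative only): N samples, n input features,
# l hidden nodes, M output nodes.
N, n, l, M = 200, 5, 20, 3
X = rng.standard_normal((N, n))
T = rng.standard_normal((N, M))          # target outputs

# Random input weights and biases are fixed in an ELM;
# only the output weights are learned.
A = rng.standard_normal((n, l))
b = rng.standard_normal(l)
H = np.tanh(X @ A + b)                   # hidden-layer output matrix

lam = 1e-2                               # regularization factor
K = H.T @ H + lam * np.eye(l)            # Hermitian (here: symmetric) matrix

# (1) Output weights via the regularized pseudo-inverse K^{-1} H^T.
W_pinv = np.linalg.solve(K, H.T @ T)

# (2) The same weights via LDL^T factors of K. Since K is symmetric
# positive definite, we derive L and D from its Cholesky factor:
# K = C C^T = L D L^T with L unit lower triangular and D diagonal.
C = np.linalg.cholesky(K)
d = np.diag(C)
L = C / d                                # unit lower triangular
D = np.diag(d ** 2)

# Solve L D L^T W = H^T T by a forward substitution, a diagonal
# scaling, and a back substitution -- no matrix inversion needed.
Z = np.linalg.solve(L, H.T @ T)
W_ldl = np.linalg.solve(L.T, Z / np.diag(D)[:, None])

print(np.allclose(W_pinv, W_ldl))        # both routes give the same weights
```

This mirrors the abstract's observation that the pseudo-inverse-based and ${\mathrm {LDL}}^{T}$-based formulations are algebraically equivalent and differ only in cost and numerical behavior.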

Highlights

  • The extreme learning machine (ELM) [1] is an effective solution for single-hidden-layer feedforward networks (SLFNs) due to its unique characteristics, i.e., extremely fast learning speed, good generalization performance, and universal approximation capability [2].

  • We utilize the improved recursive algorithm [15], [16] that updates the inverse to deduce an inverse-free algorithm for updating the regularized pseudo-inverse, which is more efficient than the corresponding algorithm utilized in the existing inverse-free incremental learning algorithm [4] for the regularized ELM.

  • We propose the inverse-free incremental ELM algorithm based on the regularized pseudo-inverse, which reduces the computational complexity of the existing inverse-free incremental learning algorithm.


Summary

INTRODUCTION

The extreme learning machine (ELM) [1] is an effective solution for single-hidden-layer feedforward networks (SLFNs) due to its unique characteristics, i.e., extremely fast learning speed, good generalization performance, and universal approximation capability [2]. The inverse-free incremental ELM algorithm based on the regularized pseudo-inverse was proposed in [4] to update the output weights of the added node and the existing nodes; it is based on an inverse-free algorithm that updates the regularized pseudo-inverse of the hidden-layer output matrix. The incremental ELM algorithm based on the generalized inverse (i.e., the pseudo-inverse) was proposed in [5]; it can only update the pseudo-inverse of the hidden-layer output matrix recursively, and cannot be applied to the regularized ELM. Throughout, $\mathbf{0}_{l}$ denotes the $l \times 1$ zero column vector, and $\mathbf{I}_{l}$ the identity matrix of size $l$.
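The core of such recursive updates can be sketched as follows: when one hidden node is added, the inverse of the regularized Hermitian matrix can be refreshed by the standard block-inversion identity at the cost of a single scalar division, instead of being recomputed from scratch. The NumPy sketch below uses illustrative sizes and the generic block-inversion form, not the paper's exact recursion:

```python
import numpy as np

rng = np.random.default_rng(1)
N, l, lam = 100, 10, 1e-2
H = rng.standard_normal((N, l))          # current hidden-layer output matrix
h = rng.standard_normal((N, 1))          # output column of the newly added node

# Inverse Q = (H^T H + lam*I)^{-1} for the current l hidden nodes.
Q = np.linalg.inv(H.T @ H + lam * np.eye(l))

# Block-inversion update for the enlarged matrix: only a scalar
# division (by the Schur complement s), no new matrix inversion.
b = H.T @ h                              # (l, 1)
c = (h.T @ h).item() + lam               # bottom-right scalar entry
s = c - (b.T @ Q @ b).item()             # Schur complement
Qb = Q @ b
Q_new = np.block([
    [Q + (Qb @ Qb.T) / s, -Qb / s],
    [-Qb.T / s,           np.array([[1.0 / s]])],
])

# Cross-check against direct recomputation with the enlarged matrix.
H_new = np.hstack([H, h])
Q_direct = np.linalg.inv(H_new.T @ H_new + lam * np.eye(l + 1))
print(np.allclose(Q_new, Q_direct))
```

The recursive route replaces an $O(l^{3})$ inversion with $O(l^{2})$ work per added node, which is the efficiency gain that the inverse-free incremental algorithms exploit.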

ARCHITECTURE OF THE ELM
THE PROPOSED INCREMENTAL ELM UPDATING REGULARIZED PSEUDO-INVERSE
THE PROPOSED INCREMENTAL ELM ALGORITHM BASED ON LDLT FACTORS
THE ELM ALGORITHM UPDATING THE UNIQUE INVERSE
THE PROPOSED INCREMENTAL ELM UPDATING LDLT FACTORS OF THE UNIQUE INVERSE
COMPLEXITY ANALYSIS AND NUMERICAL EXPERIMENTS
COMPLEXITY ANALYSIS OF THE PRESENTED ELM ALGORITHMS
CONCLUSION