Abstract

The theory and implementation of the extreme learning machine (ELM) show that it is a simple, efficient, and accurate machine learning method. Compared with other single-hidden-layer feedforward neural network algorithms, ELM offers simpler parameter-selection rules, faster convergence, and less human intervention. The multiple hidden layer regularized extreme learning machine (MRELM) inherits these advantages of ELM and achieves higher prediction accuracy. In the MRELM model, the number of hidden layers is fixed at random initialization, with no iterative tuning process. However, the number of hidden layers is the key factor determining the generalization ability of MRELM, so it is clearly unreasonable to choose it by trial and error or random initialization. In this paper, an incremental MRELM training algorithm based on forced positive-definite Cholesky factorization (FC-IMRELM) is put forward to solve the network-structure design problem of MRELM. First, an MRELM-based prediction model with one hidden layer is constructed; a new hidden layer is then added to the prediction model in each training step until its generalization performance reaches its peak, thereby determining the optimal network structure. In the training procedure, forced positive-definite Cholesky factorization is used to calculate the output weights of MRELM, avoiding the matrix inverse and Moore-Penrose generalized inverse computations otherwise involved in training the hidden-layer parameters. The FC-IMRELM prediction model can therefore effectively reduce the computational cost incurred by increasing the number of hidden layers.
Experiments on classification and regression problems indicate that the algorithm can effectively determine the optimal network structure of MRELM, and that prediction models trained with it perform excellently in both prediction accuracy and computational cost.
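The output-weight computation described above can be sketched as follows. This is a simplified illustration, not the paper's exact FC-IMRELM derivation: the ridge parameter `C`, the diagonal `jitter` safeguard, and all variable names are assumptions, and a retry-with-jitter stands in for the paper's forced positive-definite factorization.

```python
import numpy as np

def relm_output_weights(H, T, C=1e3, jitter=1e-8):
    """Regularized ELM output weights: solve (H^T H + I/C) beta = H^T T.

    The ridge term I/C makes the coefficient matrix symmetric positive
    definite, so a Cholesky factorization A = L L^T plus two triangular
    solves replaces the explicit inverse (or Moore-Penrose pseudoinverse).
    If round-off spoils positive definiteness, a small diagonal jitter is
    added and the factorization retried ("forcing" definiteness).
    """
    n_hidden = H.shape[1]
    A = H.T @ H + np.eye(n_hidden) / C
    b = H.T @ T
    try:
        L = np.linalg.cholesky(A)           # A = L @ L.T
    except np.linalg.LinAlgError:
        L = np.linalg.cholesky(A + jitter * np.eye(n_hidden))
    y = np.linalg.solve(L, b)               # forward solve: L y = b
    return np.linalg.solve(L.T, y)          # back solve: L^T beta = y

# Single-hidden-layer ELM with random input weights (hypothetical demo data).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
T = np.sin(X.sum(axis=1, keepdims=True))
W, bias = rng.standard_normal((5, 40)), rng.standard_normal(40)
H = np.tanh(X @ W + bias)                   # hidden-layer output matrix
beta = relm_output_weights(H, T)
print(beta.shape)                           # (40, 1)
```

The same solve is reused each time a hidden layer is added, which is where the savings over repeated pseudoinverse computation accrue.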

Highlights

  • A neural network is a complex nonlinear system of a large number of interconnected neurons, grounded in modern neurobiology's and cognitive science's research on human brain information processing

  • The above prediction models are all based on traditional neural networks, whose training requires repeatedly modifying the network weights according to the training objectives and gradient information

  • In order to design the network structure of the multiple hidden layer regularized extreme learning machine (MRELM) effectively, select the number of hidden layers reasonably, and achieve the desired accuracy, an incremental MRELM training algorithm based on forced positive-definite Cholesky factorization (FC-IMRELM) [23, 24] is put forward in this paper


Introduction

A neural network is a complex nonlinear system of a large number of interconnected neurons, grounded in modern neurobiology's and cognitive science's research on human brain information processing. Compared with the traditional RELM model, MRELM can effectively improve prediction accuracy through layer-by-layer optimization of the network parameters between different hidden layers. It offers strong generalization ability and fast computation and does not easily fall into local optima [19,20,21,22]. In order to design the MRELM network structure effectively, select the number of hidden layers reasonably, and achieve the desired accuracy, an incremental MRELM training algorithm based on forced positive-definite Cholesky factorization (FC-IMRELM) [23, 24] is put forward in this paper. The rest of this paper is organized as follows: Section 2 presents a brief review of the basic concepts and related work of multiple hidden layer RELM, Section 3 describes the proposed incremental FC-IMRELM technique, Section 4 reports and analyzes the experimental results, and Section 5 summarizes the key conclusions of the present study.
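The incremental design idea, growing the network one hidden layer at a time until validation performance peaks, can be sketched as follows. This is a deliberately simplified stand-in: random-feature layers replace MRELM's analytic layer-wise parameter calculation, and the layer width, ridge parameter, and synthetic data are all assumptions.

```python
import numpy as np

def add_hidden_layer(X_prev, width, rng):
    """Append one random-feature hidden layer (a simplified stand-in for
    MRELM's layer-wise parameter calculation)."""
    W = rng.standard_normal((X_prev.shape[1], width))
    b = rng.standard_normal(width)
    return np.tanh(X_prev @ W + b), (W, b)

def solve_output_weights(H, T, C=1e3):
    # Ridge solution beta = (H^T H + I/C)^{-1} H^T T via a linear solve.
    A = H.T @ H + np.eye(H.shape[1]) / C
    return np.linalg.solve(A, H.T @ T)

def rmse(y, t):
    return float(np.sqrt(np.mean((y - t) ** 2)))

# Hypothetical regression data with a held-out validation split.
rng = np.random.default_rng(1)
X = rng.standard_normal((300, 6))
T = np.cos(X @ rng.standard_normal((6, 1)))
split = 200
Xtr, Ttr, Xva, Tva = X[:split], T[:split], X[split:], T[split:]

# Grow the network one hidden layer per training step; stop when the
# validation error stops improving (the generalization-performance peak).
Htr, Hva = Xtr, Xva
best_err, best_depth = np.inf, 0
for depth in range(1, 11):
    Htr, params = add_hidden_layer(Htr, 30, rng)
    Hva = np.tanh(Hva @ params[0] + params[1])
    beta = solve_output_weights(Htr, Ttr)
    err = rmse(Hva @ beta, Tva)
    if err < best_err:
        best_err, best_depth = err, depth
    else:
        break  # performance peaked; keep best_depth hidden layers
print(best_depth)
```

In the paper's algorithm, each such step reuses the forced positive-definite Cholesky solve for the output weights rather than the general linear solve shown here.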

Brief Review of Multiple Hidden Layers Regularized Extreme Learning Machine
Solutions of IMRELM by the Forced Positive-Definite Cholesky Factorization
Results and Discussion
Classification Problems
[Figure: classification results on the Banknote, Blood, Diabetic, Image, Wilt, Coal, and Iron datasets]
Conclusions