Abstract

Ensemble deep learning combines the strengths of neural networks and ensemble learning, and has gradually become an emerging research direction. However, existing methods either lack theoretical support or require large integrated models. To address these problems, this paper proposes Ensembles of Gradient Boosting Recurrent Neural Network (EGB-RNN), which combines the gradient boosting ensemble framework with three types of recurrent neural network models, namely Minimal Gated Unit (MGU), Gated Recurrent Unit (GRU), and Long Short-Term Memory (LSTM). An RNN model serves as the base learner, and base learners are combined into an ensemble learner via gradient boosting. Meanwhile, to ensure that the ensemble model fits the data better, a Step Iteration Algorithm is designed to find an appropriate learning rate before each model is integrated. Comparative trials are carried out on four time-series data sets. Experimental results demonstrate that, as the number of integrated base learners increases, the performance of the three types of EGB-RNN models tends to converge, and that the best EGB-RNN model and the best degree of ensemble vary across data sets. Statistical results also show that the designed EGB-RNN models outperform six baselines.
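The boosting scheme described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the RNN base learner is replaced by a least-squares linear model as a hypothetical stand-in, and `choose_learning_rate` is only an assumed grid-search approximation of the paper's Step Iteration Algorithm (which is not specified in the abstract).

```python
import numpy as np

def fit_base_learner(x, y):
    """Stand-in for the paper's RNN base learner: an ordinary
    least-squares linear model (hypothetical substitute)."""
    X = np.column_stack([x, np.ones_like(x)])
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return lambda xq: np.column_stack([xq, np.ones_like(xq)]) @ w

def choose_learning_rate(h, F, x, y, candidates):
    """Assumed approximation of the Step Iteration Algorithm: pick,
    from a candidate grid, the step size that minimizes training MSE
    before the new base learner is integrated."""
    errors = [np.mean((y - (F + lr * h(x))) ** 2) for lr in candidates]
    return candidates[int(np.argmin(errors))]

def gradient_boost(x, y, n_rounds=10, lr_grid=(0.1, 0.3, 0.5, 1.0)):
    """Gradient boosting for squared loss: each round fits a base
    learner to the current residuals and adds it with a tuned step."""
    F = np.full_like(y, y.mean())           # F_0: constant initial model
    learners, rates = [], []
    for _ in range(n_rounds):
        residual = y - F                    # negative gradient of squared loss
        h = fit_base_learner(x, residual)   # base learner fit to residuals
        lr = choose_learning_rate(h, F, x, y, lr_grid)
        F = F + lr * h(x)                   # integrate the new base learner
        learners.append(h)
        rates.append(lr)
    return F, learners, rates
```

With an RNN in place of the linear model, each round would train a fresh recurrent network on the residual sequence, which matches the abstract's description of integrating RNN base learners through gradient boosting.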
