Abstract

As an extensively used model for time series prediction, the Long Short-Term Memory (LSTM) neural network suffers from shortcomings such as high computational cost and large memory requirements due to its complex structure. To address these problems, a PLS-based pruning algorithm for a simplified LSTM (PSLSTM) is proposed. First, a hybrid strategy is designed to simplify the internal structure of the LSTM, combining structure simplification with parameter reduction for the gates. Second, partial least squares (PLS) regression coefficients are used as the metric to evaluate the importance of the memory blocks, and the redundant hidden layer is pruned by merging unimportant blocks into the blocks they are most correlated with. The backpropagation through time (BPTT) algorithm is used to update the network parameters. Finally, several benchmark and practical time series prediction datasets are used to evaluate the performance of the proposed PSLSTM. The experimental results demonstrate that the PLS-based pruning algorithm achieves a good trade-off between generalization ability and a compact network structure: the simplified internal structure and the reduced hidden layer size lower the computational cost without sacrificing prediction accuracy.
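To make the pruning criterion concrete, the following is a minimal sketch (not the authors' code) of how PLS regression coefficients might score memory-block importance and select a merge partner. It assumes hidden-state activations H collected over training data, a 1-D prediction target y, and scikit-learn's PLSRegression; the function names and the averaging-style merge are illustrative assumptions, not the paper's exact procedure.

```python
# Hypothetical sketch of PLS-coefficient-based block importance and merge selection.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

def pls_block_importance(H, y, n_components=2):
    """Score each memory block by the magnitude of its PLS regression coefficient."""
    pls = PLSRegression(n_components=n_components)
    pls.fit(H, y)
    return np.abs(np.ravel(pls.coef_))  # one importance score per block

def select_merge_pair(H, importance):
    """Pick the least important block and the surviving block it correlates with most."""
    victim = int(np.argmin(importance))
    corr = np.abs(np.corrcoef(H, rowvar=False))  # block-to-block correlation matrix
    corr[victim, victim] = -1.0                  # exclude self-correlation
    partner = int(np.argmax(corr[victim]))
    return victim, partner                       # merge `victim` into `partner`

# Example on synthetic activations (hypothetical data):
rng = np.random.default_rng(0)
H = rng.standard_normal((500, 16))               # 500 time steps, 16 memory blocks
y = 0.8 * H[:, 0] + 0.5 * H[:, 3] + 0.1 * rng.standard_normal(500)
scores = pls_block_importance(H, y)
victim, partner = select_merge_pair(H, scores)
print(f"merge block {victim} into block {partner}")
```

In this sketch, a low PLS coefficient marks a block as contributing little to the prediction, and merging it into its most correlated surviving block is one plausible way to shrink the hidden layer while preserving the information the pruned block carried.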
