Abstract

Time series forecasting can provide more extensive data support for good decision making. Many deep learning based forecasting models have recently been proposed, so the main problems become how to learn effective historical information and how to alleviate error propagation, especially for long-term prediction. In this paper, we present a new attention model based on an LSTM encoder–decoder architecture to predict long-term time series. We define similar scenes of a time series, which include a periodic pattern and a time-nearest pattern, and provide a similar-scene search method. Based on this, we design a hybrid time-aligned and context attention model (HTC-Attn): the former attention focuses on the characteristics of the alignment position, while the latter focuses on the context features of specific locations in similar scenes. An attention gate is designed to control the degree to which the prediction model absorbs each of the two attention types. Furthermore, the proposed model uses a double-layer encoder–decoder structure to learn the trend term and the time dependence of the time series. Experimental results show that HTC-Attn can effectively maintain long-term dependence and learn detailed features in single-factor time series prediction tasks, and its accuracy consistently outperforms state-of-the-art baselines by at least 2%.
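The gated combination of the two attention branches can be illustrated with a minimal sketch. This is not the authors' exact formulation: the attention function, the gate parameterization (a sigmoid of a linear map over the concatenated contexts), and all array names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attend(query, keys, values):
    """Scaled dot-product attention, returning a single context vector."""
    scores = keys @ query / np.sqrt(len(query))
    return softmax(scores) @ values

d = 8
query = rng.standard_normal(d)  # current decoder state (hypothetical)

# Hypothetical similar-scene features: time-aligned positions vs. their contexts.
aligned_keys = rng.standard_normal((5, d)); aligned_vals = rng.standard_normal((5, d))
context_keys = rng.standard_normal((5, d)); context_vals = rng.standard_normal((5, d))

c_time = attend(query, aligned_keys, aligned_vals)  # time-aligned attention branch
c_ctx = attend(query, context_keys, context_vals)   # context attention branch

# Attention gate: elementwise sigmoid gate controlling how much of each
# branch the prediction model absorbs (illustrative parameterization).
W_g = rng.standard_normal((d, 2 * d))
g = 1.0 / (1.0 + np.exp(-(W_g @ np.concatenate([c_time, c_ctx]))))
context = g * c_time + (1.0 - g) * c_ctx  # fused context fed to the decoder
```

The gate lets the model lean on the periodic (time-aligned) pattern when it is informative and fall back on local context features otherwise.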
