Abstract

An accurate temperature time-series prediction model can help us sense changes in temperature levels in advance, which is important for human life. However, random fluctuations in a temperature time series can reduce the accuracy of a prediction model. Decomposing the time-series data before prediction can effectively reduce the influence of these random fluctuations and thereby improve prediction accuracy. In the present study, we propose a temperature time-series prediction model that combines the seasonal-trend decomposition procedure based on loess (STL), the jumps upon spectrum and trend (JUST) algorithm, and the bidirectional long short-term memory (Bi-LSTM) network. The model predicts the daily average temperature for cities in China. First, we decompose the time series into trend, seasonal, and residual components using the JUST and STL algorithms, and the components determined by the two methods are combined. Second, the three components and the original data are fed into a two-layer Bi-LSTM model for training. Finally, the predictions for the components and the original data are merged by learnable weights and output as the final result. The experimental results show that the average root mean square error (RMSE) and mean absolute error (MAE) of our proposed model on the dataset are 0.2187 and 0.1737, respectively, which are lower than the values of 4.3997 and 3.3349 attained by the Bi-LSTM model, 2.5343 and 1.9265 by the EMD-LSTM model, and 0.9336 and 0.7066 by the STL-LSTM model.
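The following is a minimal sketch of the decompose-then-predict pipeline described above, using STL from statsmodels and two-layer Bi-LSTM branches in PyTorch with a learnable weighted merge. The JUST step is omitted, and the window length, hidden size, and seasonal period are illustrative assumptions rather than the authors' exact configuration.

```python
# Sketch of the STL + Bi-LSTM fusion idea (assumed hyperparameters, JUST step omitted).
import numpy as np
import torch
import torch.nn as nn
from statsmodels.tsa.seasonal import STL


def decompose(series: np.ndarray, period: int = 365):
    """Split a daily temperature series into trend, seasonal, and residual parts with STL."""
    result = STL(series, period=period).fit()
    return result.trend, result.seasonal, result.resid


class BiLSTMPredictor(nn.Module):
    """Two-layer Bi-LSTM that maps a window of past values to the next value."""

    def __init__(self, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(1, hidden, num_layers=2,
                            bidirectional=True, batch_first=True)
        self.head = nn.Linear(2 * hidden, 1)

    def forward(self, x):                   # x: (batch, window, 1)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])     # one-step-ahead prediction


class FusionModel(nn.Module):
    """One Bi-LSTM branch per component plus the raw series, merged by learnable weights."""

    def __init__(self, n_streams: int = 4, hidden: int = 64):
        super().__init__()
        self.branches = nn.ModuleList(
            [BiLSTMPredictor(hidden) for _ in range(n_streams)])
        self.weights = nn.Parameter(torch.ones(n_streams))

    def forward(self, streams):              # streams: list of (batch, window, 1) tensors
        preds = torch.cat([b(s) for b, s in zip(self.branches, streams)], dim=1)
        w = torch.softmax(self.weights, dim=0)   # normalized merge weights
        return (preds * w).sum(dim=1, keepdim=True)
```

In this sketch, the trend, seasonal, and residual series from `decompose` and the original series would each be windowed into (batch, window, 1) tensors and passed as the four streams of `FusionModel`, which is trained end to end so the merge weights are learned jointly with the Bi-LSTM branches.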
