Abstract

The study presented in this paper aims to improve the accuracy of meteorological time series predictions made with the recurrent neural network known as Long Short-Term Memory (LSTM). Rather than only adjusting the LSTM architecture, as is done in related work, we propose adjusting the LSTM results using the univariate time series imputation algorithm known as Local Average of Nearest Neighbors (LANN), together with LANNc, a variant of LANN that avoids the leftward bias of the synthetic data generated by LANN. The results show that both LANN and LANNc improve the accuracy of the predictions generated by LSTM, with LANN outperforming LANNc. Moreover, on average the best LANN and LANNc configurations outperform the predictions obtained with another recurrent neural network, the Gated Recurrent Unit (GRU).

Highlights

  • Forecasting is one of the most exciting subfields in the field of time series

  • The Local Average of Nearest Neighbors (LANN) and LANNc algorithms were implemented with configurations of 1 to 11 NA values, as shown in Table II and Table III with the corresponding Root Mean Squared Error (RMSE) values

  • The use of imputation techniques based on Local Average of Nearest Neighbors improved the prediction results of Long Short-Term Memory (LSTM), exceeding on average the prediction results of the Gated Recurrent Unit (GRU) and other state-of-the-art techniques
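The configurations in Table II and Table III are compared by their Root Mean Squared Error (RMSE), the standard error measure for forecasts. A minimal sketch of the metric (the function name is ours, not from the paper):

```python
import math

def rmse(y_true, y_pred):
    """Root Mean Squared Error between observed and predicted values:
    the square root of the mean of the squared residuals."""
    assert len(y_true) == len(y_pred) and len(y_true) > 0
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

Lower RMSE means predictions closer to the observed series; a perfect forecast has RMSE 0.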


Introduction

Forecasting is one of the most exciting subfields in the field of time series. Since the beginning, forecasting techniques have evolved greatly, from simple linear regressions, through moving averages and autoregressive models, to machine learning and Deep Learning [1] techniques. Within Deep Learning, recurrent neural networks are very common for forecasting, notably Long Short-Term Memory [2] and the Gated Recurrent Unit [3]. LSTM has been used successfully in many forecasting works, and the changes implemented to improve accuracy or reduce the error rate mainly include input adjustments, parameter tuning, the number of layers, the number of training epochs, etc. After analyzing and evaluating the prediction results of an LSTM model on a 4-year meteorological time series of maximum temperatures, it was observed that various synthetic values could better approximate their real values through imputation processes.

A recurrent neural network is a neural network model for modeling time series [2]. The structure of this type of network is very similar to that of a standard multilayer perceptron (MLP), with the difference that it allows connections between hidden units associated with a time delay. Through these connections, the model can retain information from the past [18], allowing it to discover temporal correlations between events that may be far apart. Recurrent neural networks are difficult to train [2] due to the vanishing and exploding gradient problems; these problems led to the creation of LSTM networks.
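The LSTM predictions are adjusted with LANN imputation. Since the source does not include the algorithm's code, the following is a minimal sketch of the LANN idea as commonly described for univariate series: each missing (NA) value is replaced by the average of the nearest known values to its left and right. The function name `lann_impute` and the boundary handling are assumptions, not the paper's implementation.

```python
import math

def _is_na(x):
    """Treat None and float NaN as missing values."""
    return x is None or (isinstance(x, float) and math.isnan(x))

def lann_impute(series):
    """Sketch of Local Average of Nearest Neighbors (LANN) imputation:
    each NA value becomes the average of the nearest known value to its
    left and the nearest known value to its right in the original series."""
    orig = list(series)
    out = list(series)
    n = len(orig)
    for i in range(n):
        if _is_na(orig[i]):
            # nearest known values on each side, searched in the original series
            left = next((orig[j] for j in range(i - 1, -1, -1) if not _is_na(orig[j])), None)
            right = next((orig[j] for j in range(i + 1, n) if not _is_na(orig[j])), None)
            if left is not None and right is not None:
                out[i] = (left + right) / 2.0
            else:
                # at the boundaries only one neighbor exists (an assumption)
                out[i] = left if left is not None else right
    return out
```

LANNc, the variant the paper uses to avoid the leftward bias of LANN's synthetic values, is not sketched here, as the source does not describe its correction step in this excerpt.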
