Abstract

Deep learning models have demonstrated strong forecasting performance and are widely used in forecasting problems across many scenarios. The long short-term memory network (LSTM), a deep learning variant, has great potential for forecasting future values of a given pattern from historical time series data. Recently, the forecasting task has attracted the attention of deep learning researchers as a way to address the limitations of traditional statistical approaches. Owing to the pressing need for accurate forecasts, and because technological advances have made time series data easier to collect, the way has been paved for deep forecasting models. In this paper, we propose the Stacked Bi-LSTM (SBiLSTM) architecture, an adaptation of the traditional deep long short-term memory (DLSTM) model. The approach is evaluated on two oilfield production time series. The performance of the proposed SBiLSTM model is compared with recurrent neural networks (RNNs), multi-layer RNNs, the deep gated recurrent unit (DGRU), and the deep long short-term memory (DLSTM). Under several evaluation criteria, the empirical results show that the proposed SBiLSTM model outperforms these standard approaches. The proposed SBiLSTM model is observed to capture long- and short-range dependencies in univariate time series data without a large memory requirement.
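To make the stacked bidirectional architecture concrete, the sketch below shows a minimal stacked Bi-LSTM forecaster in PyTorch for one-step-ahead univariate forecasting. The library choice, layer count, hidden size, and window length are illustrative assumptions and are not taken from the paper.

```python
import torch
import torch.nn as nn

class SBiLSTMSketch(nn.Module):
    """Minimal stacked bidirectional LSTM for univariate forecasting.
    Hyperparameters here are assumptions, not the paper's configuration."""
    def __init__(self, hidden_size=64, num_layers=2):
        super().__init__()
        # num_layers > 1 stacks LSTM layers; bidirectional=True processes
        # each input window in both the forward and backward directions.
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size,
                            num_layers=num_layers, batch_first=True,
                            bidirectional=True)
        # Forward and backward states are concatenated (2 * hidden_size)
        # and projected to a single one-step-ahead forecast.
        self.head = nn.Linear(2 * hidden_size, 1)

    def forward(self, x):
        # x: (batch, window_length, 1) -- a sliding window of past values
        out, _ = self.lstm(x)
        # Use the representation at the last time step for the prediction.
        return self.head(out[:, -1, :])

# Usage: forecast the next value from a window of 30 past observations.
model = SBiLSTMSketch()
window = torch.randn(8, 30, 1)   # batch of 8 synthetic windows
prediction = model(window)       # shape: (8, 1)
```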
