Abstract

The combination of Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs) has played an important role in deep learning in recent years, improving prediction performance, especially in temporal data analysis. Previous research has shown that certain time series can share common time-dependent characteristics; to make good predictions, a model must therefore account for the correlations between different temporal data. However, general RNN models have serious limitations in achieving this goal. In this paper, a new architecture, Deep and Wide Neural Networks (DWNN), is proposed, in which a CNN convolution layer is added to the RNN's hidden-state transfer process. The CNN is combined with the RNNs to extract the correlation characteristics of the different RNN models while the RNNs run along the time steps. This new architecture has not only the depth of an RNN in the time dimension but also the width of the number of temporal series. The intuition behind the DWNN model, as well as several DWNN model structures, is discussed in this paper. We use stock data from the sandstorm sector of the Shanghai Stock Exchange for our experiment. As the results show, our proposed DWNN model reduces the prediction mean squared error by 30% compared with a general RNN model.
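The core idea, as the abstract describes it, is that several RNNs run in parallel over different series (the "wide" dimension) while a convolution mixes their hidden states at each time step. The following is a minimal NumPy sketch of that idea, not the paper's actual implementation; all function names, weight shapes, and the smoothing kernel are illustrative assumptions.

```python
import numpy as np

# Hypothetical sketch of the DWNN idea: n_series parallel RNN cells,
# with a 1-D convolution across the series axis at every time step so
# that cross-series correlation features enter each hidden-state update.

def dwnn_step(xs, hs, W_x, W_h, conv_kernel):
    """One time step for n_series parallel RNN cells.

    xs: (n_series,) inputs at this step; hs: (n_series, hidden) states.
    """
    n_series, hidden = hs.shape
    k = len(conv_kernel)
    pad = k // 2
    # Convolve across the series axis ("width") to share correlation
    # features between the parallel RNNs.
    padded = np.pad(hs, ((pad, pad), (0, 0)), mode="edge")
    mixed = np.zeros_like(hs)
    for i in range(n_series):
        for j in range(k):
            mixed[i] += conv_kernel[j] * padded[i + j]
    # Standard tanh RNN update, fed the convolution-mixed hidden state.
    return np.tanh(xs[:, None] * W_x + mixed @ W_h)

rng = np.random.default_rng(0)
n_series, hidden, T = 4, 8, 10          # illustrative sizes
W_x = rng.normal(size=(n_series, hidden)) * 0.1
W_h = rng.normal(size=(hidden, hidden)) * 0.1
kernel = np.array([0.25, 0.5, 0.25])    # assumed small mixing kernel

hs = np.zeros((n_series, hidden))
series = rng.normal(size=(T, n_series))  # T steps for n_series inputs
for t in range(T):
    hs = dwnn_step(series[t], hs, W_x, W_h, kernel)

print(hs.shape)  # one hidden state per series after T steps
```

Unrolled over T steps, the recurrence is "deep" in time while the per-step convolution spans the "width" of the series, matching the depth-plus-width framing above.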
