Abstract

In this article we present a method to discretize multivariate time series using a recurrent neural network (RNN). Time series discretization converts a real-valued time series into a sequence of discrete elements, which is essential for downstream applications that require a finite input alphabet, such as sequence prediction and classification. Although several sophisticated discretization algorithms already exist, we investigate the potential of the RNN architecture for this problem. We employ a Long Short-Term Memory (LSTM) network, which can memorize information and carry it over an arbitrary number of time steps by introducing a cell state into the recursion. This makes the final cell state a plausible representation of the time series: a vector of arbitrary dimension, decoupled from time. The cell state can serve as input to a subsequent clustering algorithm; the resulting clusters act as discrete elements, so the original time series can be represented as a sequence of cluster labels. The learning procedure is as follows. From a time series of length N, a collection of segments of length t is extracted. The segments are fed to an LSTM RNN trained to predict the value at time $t+1$. After training, the segments are fed through the network again and the final cell state of each segment is collected; this state acts as a representative vector of the segment. k-means clustering then assigns each state to one of a finite number of clusters. The output for a time series of length N is a sequence of categorical elements of length $N-t+1$. We demonstrate this operation by discretizing artificial datasets and evaluate the performance of the method in two variants.
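The pipeline above can be sketched in a few lines. This is a minimal stand-in, not the authors' implementation: the sliding-window segmentation and the output length $N-t+1$ follow the abstract, but the LSTM encoder is replaced here by the raw segment vector (a hypothetical placeholder for the final cell state), and the k-means step is a bare-bones pure-Python version rather than a library call.

```python
import random

def segments(series, t):
    """Sliding windows of length t over a series of length N.

    Produces N - t + 1 segments, matching the output length
    stated in the abstract."""
    return [series[i:i + t] for i in range(len(series) - t + 1)]

def kmeans(vectors, k, iters=20, seed=0):
    """Minimal k-means: assign each vector to one of k cluster labels."""
    rng = random.Random(seed)
    centers = rng.sample(vectors, k)
    for _ in range(iters):
        # Assign each vector to its nearest center (squared Euclidean).
        labels = [
            min(range(k),
                key=lambda j: sum((a - b) ** 2
                                  for a, b in zip(v, centers[j])))
            for v in vectors
        ]
        # Recompute each center as the mean of its members.
        for j in range(k):
            members = [v for v, lab in zip(vectors, labels) if lab == j]
            if members:
                centers[j] = [sum(col) / len(members)
                              for col in zip(*members)]
    return labels

series = [0.1, 0.5, 0.9, 0.4, 0.2, 0.8, 0.6, 0.3]  # toy series, N = 8
t = 3
segs = segments(series, t)  # N - t + 1 = 6 segments
# In the full method each segment would be encoded by the trained
# LSTM's final cell state before clustering; the raw segment is used
# here only as a placeholder representation.
codes = kmeans(segs, k=2)   # discretized series: 6 categorical labels
```

A series of length 8 with t = 3 thus yields 6 labels, one per overlapping segment, which is the discretized representation described above.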
