Abstract

In this work we consider the problem of analyzing and predicting time series data using a bag-of-functions approach realized by a self-supervised autoencoder. In particular, by means of deep neural networks, we define a latent space for multivariate time series data that serves as the parameterization of a bag of multivariate functions. Specifically, the latent encoding comprises a set of parameters for the bag of functions together with a top-k distribution that selects the functions most likely to represent the data sequence. The approach bears intended similarities to well-known approaches from natural language processing and machine translation, where a sparse representation of words is first learned and these sparse representations are then stored as a bag of words or as embeddings. To underline the performance of the approach and its ability to adapt quickly, we first perform a pretraining task on synthetic data. Afterwards we use transfer learning to apply the network to the M4 benchmark dataset and obtain suitable reconstructions over multiple horizons, without any significant loss of performance relative to certain forecasters. Tests on a new energy supply dataset show promising results for unsupervised time series analysis and decomposition, while the trajectories always remain fully interpretable. In all cases the approach learns its own way of decomposing and describing time series and adapts easily to very different signal shapes.
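The decoding step described above can be illustrated with a minimal sketch: a latent code holds one parameter vector per candidate function plus selection logits, and a top-k choice picks the functions whose sum reconstructs the series. The function names, the choice of basis functions, and all parameter shapes below are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def bag_of_functions_decode(latent, t, k=2):
    """Reconstruct a series from a bag-of-functions latent code (sketch).

    latent: dict with 'params' of shape (K, P) and 'logits' of shape (K,).
    t:      1-D time grid on which the selected functions are evaluated.
    """
    params, logits = latent["params"], latent["logits"]
    # Hypothetical bag of candidate functions: trend, seasonality, decay.
    bases = [
        lambda p, t: p[0] + p[1] * t,                  # linear trend
        lambda p, t: p[0] * np.sin(p[1] * t + p[2]),   # sinusoidal season
        lambda p, t: p[0] * np.exp(-abs(p[1]) * t),    # exponential decay
    ]
    top = np.argsort(logits)[-k:]                      # top-k selection
    # The reconstruction is the sum of the k selected parameterized functions.
    recon = sum(bases[i](params[i], t) for i in top)
    return recon, top

t = np.linspace(0.0, 1.0, 100)
latent = {
    "params": np.array([[0.5, 1.0, 0.0],    # trend: offset 0.5, slope 1.0
                        [1.0, 6.28, 0.0],   # season: amp 1.0, freq ~2*pi
                        [0.2, 3.0, 0.0]]),  # decay: amp 0.2, rate 3.0
    "logits": np.array([2.0, 1.5, -1.0]),   # decay scores lowest, is dropped
}
recon, chosen = bag_of_functions_decode(latent, t, k=2)
```

In the full model, a neural encoder would produce `params` and `logits` from the input series, and the top-k selection would be made differentiable (e.g. via a relaxed categorical distribution) so the autoencoder can be trained end to end; the interpretability claim follows from each latent component mapping to a named function and its parameters.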
