Abstract

Context. Neural Ordinary Differential Equations are a family of deep neural networks that leverage numerical methods to solve the problem of time series reconstruction from a small number of unevenly distributed samples.
Objective. The goal of this research is to design a deep neural network capable of solving the input signal reconstruction and time series extrapolation tasks.
Method. The proposed method demonstrates the benefits of framing the problem as time series extrapolation rather than forecasting. A model implementing an encoder-decoder architecture with differential equation solving in the latent space is proposed. This approach has been shown to perform strongly on time series reconstruction when only a small fraction of noisy, unevenly distributed input samples is available. The proposed Latent Ordinary Differential Equations Variational Autoencoder (LODE-VAE) model was benchmarked on synthetic non-stationary data with added white noise, sampled at random intervals.
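To illustrate the encoder-decoder-with-latent-ODE idea described above, a minimal sketch follows. It assumes PyTorch and the torchdiffeq package; the class names, layer sizes, and toy sine-wave data are illustrative assumptions, not the authors' exact LODE-VAE implementation.

```python
# Minimal latent-ODE VAE sketch (illustrative assumptions, not the paper's exact model).
import torch
import torch.nn as nn
from torchdiffeq import odeint  # assumes torchdiffeq is installed


class ODEFunc(nn.Module):
    """Parameterizes the latent dynamics dz/dt = f(z, t)."""
    def __init__(self, latent_dim=4, hidden_dim=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, latent_dim),
        )

    def forward(self, t, z):
        return self.net(z)


class LatentODEVAE(nn.Module):
    """Encoder-decoder with an ODE solver evolving the latent state."""
    def __init__(self, obs_dim=1, latent_dim=4, hidden_dim=32):
        super().__init__()
        # Encoder: summarizes the irregularly sampled sequence into q(z0 | x).
        self.encoder_rnn = nn.GRU(obs_dim + 1, hidden_dim, batch_first=True)
        self.to_mu = nn.Linear(hidden_dim, latent_dim)
        self.to_logvar = nn.Linear(hidden_dim, latent_dim)
        # Latent dynamics and decoder back to observation space.
        self.ode_func = ODEFunc(latent_dim, hidden_dim)
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, hidden_dim), nn.Tanh(),
            nn.Linear(hidden_dim, obs_dim),
        )

    def forward(self, x, obs_times, target_times):
        # Concatenate values with their timestamps so the encoder
        # sees the irregular sampling pattern.
        inp = torch.cat([x, obs_times.unsqueeze(-1)], dim=-1)
        _, h = self.encoder_rnn(inp)
        mu, logvar = self.to_mu(h[-1]), self.to_logvar(h[-1])
        # Reparameterization trick for the initial latent state z0.
        z0 = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        # Solve the latent ODE over the target grid; extrapolation means
        # target_times extends past the observed range.
        z_t = odeint(self.ode_func, z0, target_times)   # (T, batch, latent_dim)
        x_hat = self.decoder(z_t).permute(1, 0, 2)      # (batch, T, obs_dim)
        return x_hat, mu, logvar


# Usage: reconstruct and extrapolate a noisy, irregularly sampled signal.
model = LatentODEVAE()
obs_times = torch.sort(torch.rand(16, 20), dim=1).values   # irregular timestamps in [0, 1]
x = torch.sin(2 * torch.pi * obs_times).unsqueeze(-1)
x = x + 0.1 * torch.randn_like(x)                          # added white noise
target_times = torch.linspace(0.0, 1.5, 40)                # extends beyond the observed range
x_hat, mu, logvar = model(x, obs_times, target_times)
```

In this setting the reconstruction loss on observed points plus the KL term on q(z0 | x) would form the usual VAE objective; the ODE solver is what lets the same latent trajectory be evaluated at arbitrary future times.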
Results. The proposed method was implemented as a deep neural network that solves the time series extrapolation task.
Conclusions. The conducted experiments confirm that the proposed model solves the given task effectively and can be recommended for real-world problems that require reconstructing the dynamics of non-stationary processes. Prospects for further research include computational optimization of the proposed models, as well as additional experiments with other baselines, e.g. Generative Adversarial Networks or attention networks.
