Abstract

A time series contains a large amount of information suitable for forecasting. Classical statistical and recent deep learning models have been widely used in a variety of forecasting applications. During the training data preparation stage, most models collect samples by sliding a fixed-size window over the time axis of the input time series. We refer to this conventional method as “sparse sampling” because it cannot extract sufficient samples: it ignores another important axis representing the window size. In this study, a dense sampling method is proposed that extends the sampling space from one to two dimensions. The new space consists of the time and window axes. Dense sampling provides several desirable effects, such as a larger training dataset, an intra-model ensemble, model-agnosticism, and easier selection of the optimal window size. The experiments were conducted using four real datasets: Bitcoin price, influenza-like illness, household electric power consumption, and wind speed. The mean absolute percentage error was measured extensively across varying window sizes, horizons, and lengths of time series. The results showed that dense sampling significantly and consistently outperformed sparse sampling. The source code and datasets are available at https://github.com/isoh24/Dense-sampling-time-series.
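
To illustrate the distinction the abstract draws, the following is a minimal sketch (not the authors' implementation; function names such as `sparse_sampling` and `dense_sampling` and the toy data are illustrative assumptions) of conventional single-window sampling versus sampling over both the time axis and a range of window sizes:

```python
import numpy as np

def sparse_sampling(series, window, horizon):
    """Conventional sampling: slide one fixed-size window along the time axis."""
    X, y = [], []
    for t in range(len(series) - window - horizon + 1):
        X.append(series[t:t + window])                    # input window
        y.append(series[t + window:t + window + horizon]) # forecast target
    return np.array(X), np.array(y)

def dense_sampling(series, windows, horizon):
    """Dense sampling (sketch): sample over the time axis for every window
    size in `windows`, producing a larger, two-dimensional sampling space."""
    return {w: sparse_sampling(series, w, horizon) for w in windows}

# Usage on a toy series of length 100 with a forecast horizon of 5
series = np.sin(np.linspace(0, 10, 100))
dense = dense_sampling(series, windows=range(8, 25), horizon=5)
for w, (X, y) in list(dense.items())[:3]:
    print(f"window={w}: X{X.shape}, y{y.shape}")
```

Under these assumptions, each window size yields its own sample set, so the combined training data is substantially larger than with a single fixed window, and predictions made from different window sizes can be combined as an intra-model ensemble.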
