Abstract

The k‐nearest neighbors algorithm is one of the prominent techniques used in classification and regression. Despite its simplicity, k‐nearest neighbors has been successfully applied to time series forecasting. However, selecting the number of neighbors and the features remains a daunting task. In this paper, we introduce two methodologies for forecasting time series, which we refer to as Classical Parameters Tuning in Weighted Nearest Neighbors and Fast Parameters Tuning in Weighted Nearest Neighbors. The first approach uses classical parameter tuning, comparing the most recent subsequence with every possible subsequence of the same length from the past. The second approach reduces the neighbors' search set, which significantly reduces the grid size and hence the computational time. To tune the models' parameters, both methods implement an approach inspired by cross‐validation for weighted nearest neighbors. We evaluate the forecasting performance and accuracy of our models and compare them to other approaches, namely Seasonal Autoregressive Integrated Moving Average, Holt‐Winters, and the Exponential Smoothing State Space Model. Real data examples on retail and food services sales in the United States and milk production in the United Kingdom are analyzed to demonstrate the application and efficiency of the proposed approaches.
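
The abstract gives no pseudocode, so the following is only a minimal illustrative sketch of the generic weighted k‐nearest‐neighbors forecasting idea it describes: compare the most recent subsequence with every earlier subsequence of the same length and combine, with inverse‐distance weights, the values that followed the closest matches. The function name `knn_forecast` and the parameters `window`, `horizon`, and `k` are assumptions for illustration; this is not the authors' Classical or Fast Parameters Tuning procedure.

```python
import numpy as np

def knn_forecast(series, k=3, window=12, horizon=1):
    """Forecast the next `horizon` values of `series` by weighted k-NN.

    The most recent `window`-length subsequence is compared against every
    earlier subsequence of the same length; the `horizon` values that
    followed the k closest matches are averaged with inverse-distance
    weights.
    """
    series = np.asarray(series, dtype=float)
    query = series[-window:]                      # most recent subsequence

    # Candidate subsequences must leave room for `horizon` known successor values.
    starts = range(len(series) - window - horizon + 1)
    candidates = np.array([series[s:s + window] for s in starts])
    futures = np.array([series[s + window:s + window + horizon] for s in starts])

    # Euclidean distance between the query and every candidate subsequence.
    dists = np.linalg.norm(candidates - query, axis=1)
    nearest = np.argsort(dists)[:k]

    # Inverse-distance weights; a small epsilon guards against exact matches.
    weights = 1.0 / (dists[nearest] + 1e-8)
    weights /= weights.sum()

    # Weighted average of the values that followed the k nearest subsequences.
    return weights @ futures[nearest]

# Toy usage: a noisy monthly-seasonal signal.
rng = np.random.default_rng(0)
t = np.arange(240)
y = 10 + 5 * np.sin(2 * np.pi * t / 12) + rng.normal(scale=0.5, size=t.size)
print(knn_forecast(y, k=5, window=12, horizon=3))
```

In this sketch the Euclidean distance and inverse-distance weighting are placeholder choices; the paper's methods instead tune the weighting and the neighborhood via the cross‐validation‐inspired procedure described above.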
