Abstract

A self-organizing map (SOM) is a competitive learning neural network architecture in which a set of classificatory neurons self-organizes spatially according to the input patterns. The application of the standard SOM and its variants to time series prediction has been studied in the literature under different conditions and techniques. In this paper, we focus on the standard self-organizing map and its application to time series forecasting. We provide an extensive numerical analysis considering regular and complex topologies, as well as real time series. Specifically, we study the impact of the number of neurons, the effect of the best-matching unit on its neighborhood, the use of nonlinear learning rate functions, and the importance of proportional training together with an input space sampled as uniformly as possible. We find that probabilistic updating of neighborhoods can act as a second learning rate parameter within the SOM. We also find that small-world and scale-free topologies can reduce the error of regular lattices, depending on their own parameters, the number of neurons, and the length of the training set. Finally, we find that the results for real time series are consistent with those of analytic forecasting benchmarks.
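The ingredients the abstract refers to (best-matching unit, neighborhood influence, decaying learning rate) can be sketched with a minimal standard SOM training loop. This is an illustrative sketch only: the grid size, exponential decay schedules, and parameter values below are assumptions for demonstration, not the configuration studied in the paper.

```python
import numpy as np

def train_som(data, grid_shape=(10, 10), n_iters=1000,
              lr0=0.5, sigma0=3.0, seed=0):
    """Minimal standard SOM on a rectangular grid (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    rows, cols = grid_shape
    dim = data.shape[1]
    # One weight vector per neuron, randomly initialized in [0, 1).
    weights = rng.random((rows * cols, dim))
    # Grid coordinates of each neuron, used for neighborhood distances.
    coords = np.array([(r, c) for r in range(rows)
                       for c in range(cols)], dtype=float)
    for t in range(n_iters):
        x = data[rng.integers(len(data))]  # random input sample
        # Best-matching unit: neuron whose weights are closest to x.
        bmu = np.argmin(np.linalg.norm(weights - x, axis=1))
        # Exponentially decaying learning rate and neighborhood radius
        # (assumed schedules; the paper studies nonlinear alternatives).
        frac = t / n_iters
        lr = lr0 * np.exp(-3.0 * frac)
        sigma = sigma0 * np.exp(-3.0 * frac)
        # Gaussian neighborhood of the BMU on the grid.
        d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
        h = np.exp(-d2 / (2.0 * sigma ** 2))
        # Pull every neuron toward x, weighted by neighborhood strength.
        weights += lr * h[:, None] * (x - weights)
    return weights.reshape(rows, cols, dim)

# Example: organize 500 random 2-D points onto a 10x10 grid.
pts = np.random.default_rng(1).random((500, 2))
w = train_som(pts)
```

A probabilistic variant, as suggested in the abstract, would update each neighbor only with probability `h` rather than scaling the update by `h`, turning the neighborhood function into a second learning rate parameter.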
