Abstract

The prediction of future values of a time series generated by a chaotic dynamical system is an extremely challenging task. Among the many nonlinear models employed for the prediction of chaotic time series, artificial neural networks (ANNs) have gained major attention in the past decade. The structure of the network is one widely recognized factor in achieving sufficient prediction performance. We automate this aspect of ANN design by evolving ANN topologies of low complexity, guiding the evolutionary process towards ANNs of increased generalization ability. Specifically, a genetic algorithm (GA) is used to construct the architecture of generalized multi-layer perceptrons (GMPs) trained by error backpropagation. Another less investigated but important factor in ANN prediction quality is the size and composition of the training data set (TDS). We therefore subject the selection of training data to artificial evolution in the environment of an ANN with fixed structure. A natural way to exploit the mutual dependencies of ANN structures and TDSs is symbiotic (cooperative) coevolution, in which the fitness of an ANN is equally credited to the TDS it has been trained with. We compare these methods (ANN evolution, TDS evolution, and coevolution) with a standard ANN architecture from the literature by predicting the Mackey-Glass time series.
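The Mackey-Glass benchmark mentioned above is generated by the delay differential equation dx/dt = beta*x(t-tau)/(1 + x(t-tau)^n) - gamma*x(t). The sketch below integrates it with a simple Euler scheme; the parameter values (tau=17, beta=0.2, gamma=0.1, n=10) are the commonly used chaotic setting and are an assumption, as the abstract does not state them.

```python
import numpy as np

def mackey_glass(n_samples=1000, tau=17.0, beta=0.2, gamma=0.1, n=10,
                 dt=0.1, x0=1.2, discard=5000):
    """Integrate the Mackey-Glass delay differential equation
        dx/dt = beta * x(t - tau) / (1 + x(t - tau)**n) - gamma * x(t)
    with a plain Euler scheme of step dt (an assumed, illustrative choice).
    The first `discard` steps are dropped to remove the transient."""
    delay_steps = int(tau / dt)              # length of the delay buffer
    x = np.full(discard + n_samples + delay_steps, x0)
    for t in range(delay_steps, len(x) - 1):
        x_tau = x[t - delay_steps]           # delayed state x(t - tau)
        x[t + 1] = x[t] + dt * (beta * x_tau / (1.0 + x_tau**n)
                                - gamma * x[t])
    return x[-n_samples:]
```

A training data set for one-step-ahead prediction can then be built by pairing windows of past samples with the next value of the returned series.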
