Abstract

Numerical prediction models are essential to modern society. Data assimilation aims to increase prediction accuracy by combining a model output with observational data, producing a state closer to the true state of the system. Depending on the size of the model output and the number of observations to assimilate, combining these two sources of information may require intensive computing and become a challenge even for the supercomputers used in this type of application. Neural networks have therefore been proposed as an alternative for performing high-quality data assimilation at lower computational cost. This paper investigates the use of NeuroEvolution of Augmenting Topologies (NEAT) in data assimilation. NEAT adapts both the connection weights and the network topology using principles of evolutionary computation, searching for a minimal topology with the best performance. Two models were used for testing: the Lorenz Attractor and the Shallow Water model. The experiments compared the results obtained with NEAT and with backpropagation neural networks, using the Best Linear Unbiased Estimator (BLUE) as the benchmark. In the experiment with the Lorenz Attractor, NEAT was able to emulate the data assimilation task with smaller error at lower computational cost. For the Shallow Water model, tested with different grid sizes, the errors obtained with both neural networks were small, although NEAT's were higher. On the other hand, NEAT always produced a topology with significantly fewer operations, and the computational cost difference increases with the grid size.
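The BLUE benchmark mentioned above combines a background (model forecast) state with observations weighted by their error covariances. A minimal sketch of one BLUE analysis step, using the standard formula x_a = x_b + K(y - Hx_b) with gain K = BHᵀ(HBHᵀ + R)⁻¹; the toy 3-variable state and covariance values below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def blue_analysis(xb, y, H, B, R):
    """One BLUE analysis step: xa = xb + K (y - H xb),
    with Kalman-type gain K = B H^T (H B H^T + R)^{-1}."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)

# Toy example: a 3-variable state (comparable in size to a Lorenz state),
# observing only the first component.
xb = np.array([1.0, 2.0, 3.0])    # background (model forecast)
H  = np.array([[1.0, 0.0, 0.0]])  # observation operator: observe x1 only
B  = 0.5 * np.eye(3)              # background error covariance (assumed)
R  = np.array([[0.5]])            # observation error covariance (assumed)
y  = np.array([2.0])              # observation of x1

xa = blue_analysis(xb, y, H, B, R)
print(xa)  # -> [1.5 2.  3. ]: equal error variances pull x1 halfway to y
```

The matrix inverse in the gain is what makes BLUE expensive for large grids: its cost grows with the number of observations, which motivates the cheaper neural-network emulation studied in the paper.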

