Abstract

This paper implements the inverse approach to forecasting hydrological time series efficiently using a micro-GA (mGA) search engine. The inverse approach is based on chaos theory and involves: (1) calibrating the delay time (τ), embedding dimension (m), and number of nearest neighbors (k) simultaneously using a single definite criterion, namely optimum prediction accuracy; (2) verifying that the optimal parameters have wider applicability outside the scope of calibration; and (3) demonstrating that chaotic behavior is present when the optimal parameters are used in conjunction with existing system characterization tools. The first stage is performed efficiently by coupling the Nonlinear Prediction (NLP) method with the mGA through a lookup facility that eliminates costly duplicate NLP evaluations. The mGA-NLP algorithm is applied to a theoretical chaotic time series (Mackey–Glass) and a real hydrological time series (Mississippi river flow at Vicksburg) to examine its efficiency. Results show that: (1) the mGA produces comparable or superior triplets using at most 5% of the computational effort needed to evaluate every point in the search space; (2) the lookup facility is very cost-effective because only about 50% of the triplets generated by the mGA are distinct; (3) the mGA appears to produce more robust solutions, in the sense that the record length required to reach a stable optimum triplet is much shorter; and (4) prediction accuracy is not sensitive to the parameter k, so it is sufficient to use k = 10 in future studies. In this way, the 3D search space of (τ, m, k) can be reduced to a much smaller 2D search space of m and τ.
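To make the procedure concrete, the sketch below illustrates the core of the mGA-NLP coupling under stated assumptions: a (τ, m, k) triplet is scored by the accuracy of one-step nearest-neighbor (NLP) prediction in the delay-embedded phase space, a lookup table memoizes triplets so duplicate evaluations cost nothing, and a toy micro-GA with a tiny elitist population searches the triplet space. All names (embed, nlp_rmse, micro_ga), the GA operators, and the parameter defaults are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of the mGA-NLP idea for a univariate series x.
# Assumed/illustrative throughout; not the paper's actual code.
import numpy as np

rng = np.random.default_rng(0)

def embed(series, m, tau):
    """Delay-coordinate embedding: row i is (x[i], x[i+tau], ..., x[i+(m-1)tau])."""
    n = len(series) - (m - 1) * tau
    return np.column_stack([series[i * tau : i * tau + n] for i in range(m)])

def nlp_rmse(series, tau, m, k, n_test=200):
    """NLP fitness: RMSE of one-step-ahead k-nearest-neighbor prediction."""
    X = embed(series, m, tau)[:-1]            # states with a known successor
    y = series[(m - 1) * tau + 1 :]           # one-step-ahead targets
    split = len(X) - n_test                   # predict over the last n_test states
    errs = []
    for i in range(split, len(X)):
        d = np.linalg.norm(X[:split] - X[i], axis=1)
        nbrs = np.argsort(d)[:k]              # k nearest neighbors in phase space
        errs.append(np.mean(y[nbrs]) - y[i])  # zeroth-order local prediction
    return float(np.sqrt(np.mean(np.square(errs))))

cache = {}                                    # the lookup facility

def fitness(series, triplet):
    """Memoized NLP evaluation: duplicate (tau, m, k) triplets are free."""
    if triplet not in cache:
        cache[triplet] = nlp_rmse(series, *triplet)
    return cache[triplet]

def micro_ga(series, bounds, pop_size=5, generations=30):
    """Toy micro-GA: tiny population, elitism, restart around the elite on convergence."""
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    pop = rng.integers(lo, hi + 1, size=(pop_size, 3))
    best = min(map(tuple, pop), key=lambda t: fitness(series, t))
    for _ in range(generations):
        children = [best]                     # keep the elite
        while len(children) < pop_size:       # uniform crossover: elite x random member
            mate = pop[rng.integers(pop_size)]
            mask = rng.integers(0, 2, size=3).astype(bool)
            children.append(tuple(int(v) for v in np.where(mask, best, mate)))
        pop = np.array(children)
        if len(set(map(tuple, pop))) == 1:    # population converged: restart it
            pop = rng.integers(lo, hi + 1, size=(pop_size, 3))
            pop[0] = best
        best = min(map(tuple, pop), key=lambda t: fitness(series, t))
    return best, cache[best]
```

As a usage example, micro_ga(x, bounds=[(1, 20), (2, 10), (1, 50)]) would search τ ∈ [1, 20], m ∈ [2, 10], and k ∈ [1, 50]; afterwards, len(cache) gives the number of distinct NLP evaluations actually paid for, which is the quantity behind the roughly 50% duplicate rate reported in the abstract.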
