Reservoir computing, a relatively new machine-learning method, has recently been used to predict the state evolution of various chaotic dynamical systems. It offers significant advantages in training cost and in the number of adjustable parameters; however, its prediction horizon is limited: classic reservoir computing typically reaches only five to six Lyapunov times. Here, we modified reservoir computing by adding feedback, either continuous or discrete, to “calibrate” the input of the reservoir and then reconstruct the entire dynamical system. The reconstruction length increased appreciably while the required training length decreased markedly. We studied the reconstruction of dynamical systems in detail under this scheme and found that it can be significantly improved in both length and accuracy. We also summarized the effect of feeding back different input variables: the more a variable interacts with the others in the governing equations, the better the reconstruction, and when the number of interaction terms is equal, nonlinear terms reveal more information than linear terms. The method proved effective on several classical chaotic systems. It can outperform traditional reservoir computing in reconstruction, offers new hints for improving such computations, and may find use in practical applications.
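To make the general setup concrete, the following is a minimal sketch of a standard echo state network driven by the Lorenz system, with the prediction fed back as the next input and a periodic re-injection of the true state standing in for the “discrete feedback” calibration described above. The reservoir size, scalings, ridge parameter, and the variable `calib_every` are illustrative assumptions, not the authors' exact formulation.

```python
# Minimal echo state network on the Lorenz system with feedback-calibrated input.
# Hyperparameters and the calibration scheme are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# --- generate Lorenz training data by simple Euler integration ---
def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    return state + dt * np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

steps = 6000
data = np.empty((steps, 3))
data[0] = [1.0, 1.0, 1.0]
for t in range(1, steps):
    data[t] = lorenz_step(data[t - 1])

# --- reservoir setup (assumed size and scalings) ---
n_res, n_in = 300, 3
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # rescale to spectral radius 0.9

def update(r, u):
    # One reservoir step: nonlinear mix of previous state and current input.
    return np.tanh(W @ r + W_in @ u)

# --- drive the reservoir with training data and collect states ---
washout, train_len = 500, 4000
r = np.zeros(n_res)
states, targets = [], []
for t in range(train_len):
    r = update(r, data[t])
    if t >= washout:
        states.append(r.copy())
        targets.append(data[t + 1])        # one-step-ahead target
R, Y = np.array(states), np.array(targets)

# --- linear readout trained by ridge regression ---
ridge = 1e-6
W_out = Y.T @ R @ np.linalg.inv(R.T @ R + ridge * np.eye(n_res))

# --- autonomous prediction with feedback-calibrated input ---
# The readout is fed back as the next input; every `calib_every` steps the
# true state is re-injected as an assumed stand-in for the calibration idea.
calib_every = 200
u = data[train_len]
pred = []
for k in range(1500):
    r = update(r, u)
    u = W_out @ r
    if (k + 1) % calib_every == 0:
        u = data[train_len + k + 1]        # calibrate with a true measurement
    pred.append(u)
pred = np.array(pred)
print("prediction array shape:", pred.shape)
```

In this sketch the quality of the autonomous run can be gauged by comparing `pred` against the corresponding slice of `data`; tightening or loosening `calib_every` trades measurement effort against how far the reconstruction tracks the true trajectory.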