Abstract

In many cases, the equations underlying dynamical systems are unknown and hard to model and predict. Machine learning algorithms, on the other hand, operate on data describing a solution as it evolves and do not need the governing equations. In the era of abundant data, using machine learning to discover accurate mathematical models of dynamical systems directly from time-series data is becoming increasingly important. Recently, a multi-step deep neural network (multi-step DNN) model that does not require direct access to temporal gradients was proposed; it can accurately learn the evolution from a given set of observed data, identify nonlinear dynamical systems, and forecast future states. However, that architecture lacks the capability to capture long-term temporal dependencies in dynamical time-series data. In this paper, building on multi-step time-stepping schemes, we propose a new CLDNN model that combines a convolutional layer, a long short-term memory layer, and a fully connected layer to address this weakness. The effectiveness of our model is tested on several benchmark problems involving the identification and prediction of complex, nonlinear, and chaotic dynamics. The experimental results show that the multi-step CLDNN achieves better identification and prediction performance than the multi-step DNN. This research provides supporting evidence for developing new deep-learning-based algorithms for nonlinear system identification.
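As a rough illustration of the architecture described above, the following is a minimal sketch of a CLDNN that maps a short window of observed states to an estimate of the system's time derivative. The class name, layer sizes, kernel width, and window length are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

class CLDNN(nn.Module):
    """Sketch of a CNN -> LSTM -> fully connected model (sizes are assumptions)."""
    def __init__(self, state_dim: int, hidden: int = 64):
        super().__init__()
        # Convolution over the time axis extracts local features from the window.
        self.conv = nn.Conv1d(in_channels=state_dim, out_channels=hidden,
                              kernel_size=3, padding=1)
        # LSTM models longer-range temporal dependencies in the feature sequence.
        self.lstm = nn.LSTM(input_size=hidden, hidden_size=hidden, batch_first=True)
        # Fully connected layers map the last hidden state to an estimate of f(x).
        self.fc = nn.Sequential(nn.Linear(hidden, hidden), nn.Tanh(),
                                nn.Linear(hidden, state_dim))

    def forward(self, x):
        # x: (batch, window_length, state_dim)
        h = torch.tanh(self.conv(x.transpose(1, 2)))   # (batch, hidden, window_length)
        h, _ = self.lstm(h.transpose(1, 2))            # (batch, window_length, hidden)
        return self.fc(h[:, -1, :])                    # (batch, state_dim)

# Example: estimate f(x) for a batch of 3-step windows of a 2-D state.
model = CLDNN(state_dim=2)
window = torch.randn(8, 3, 2)    # placeholder data, batch of 8 windows
f_hat = model(window)            # shape (8, 2)
```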

Highlights

  • Dynamical systems, which encompass many mathematical models, play a key role in deepening our understanding of the complex physical world and improving our ability to forecast the future trajectory of a given process

  • Inspired by the multi-step deep neural network (DNN) and the CLDNN, we propose a new method for nonlinear system identification that combines a convolutional neural network (CNN), long short-term memory (LSTM), a DNN, and a classical linear multi-step method from numerical analysis into one unified architecture

  • The Hopf bifurcation occurs at a critical point, where the system switches from a stable state to an unstable one and a limit cycle appears, pictured as a closed curve in phase space (a standard normal form is sketched after this list)
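For concreteness, the classic supercritical Hopf normal form below illustrates this behaviour; it is a standard textbook system, given here as an assumed stand-in rather than necessarily the exact equations studied in the paper.

```latex
% Supercritical Hopf normal form (standard textbook system, assumed for illustration):
% for \mu < 0 the origin is a stable fixed point; for \mu > 0 it loses stability and
% a stable limit cycle of radius \sqrt{\mu} emerges.
\begin{aligned}
\dot{x} &= \mu x - \omega y - x\,(x^{2} + y^{2}),\\
\dot{y} &= \omega x + \mu y - y\,(x^{2} + y^{2}).
\end{aligned}
```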

Summary

INTRODUCTION

Dynamical systems, which encompass many mathematical models, play a key role in deepening our understanding of the complex physical world and improving our ability to forecast the future trajectory of a given process. Many white-box approaches, including symbolic regression, compressive sensing, and sparse regression, have been proposed to determine the underlying structure of a nonlinear dynamical system from data; these interpretable models can detect the overall parametric form of the governing equations. More recently, multi-step time-stepping schemes combined with deep neural networks (multi-step DNN) have been used to approximate dynamical systems. This is an important innovation in practice, as the approach need not commit to a particular class of basis functions and instead draws on a richer family of function approximators. However, the traditional multi-step DNN architecture lacks the capability to capture temporal dependencies in nonlinear dynamical time-series data. We use a CNN to automatically extract abstract features, an LSTM to model temporal dependencies in the time-series data, and a DNN to map the function f representing the evolution of the dynamical system.
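The multi-step training idea referenced above is commonly written as a residual of a linear multi-step scheme evaluated on the observed trajectory. The sketch below assumes the one-step Adams-Moulton (trapezoidal) member of that family and a generic network f_theta mapping states to estimated derivatives; the paper may use a different scheme, and the helper name and placeholder data are assumptions for illustration.

```python
import torch
import torch.nn as nn

def multistep_residual_loss(f_theta, x, dt):
    """Residual of the trapezoidal (one-step Adams-Moulton) rule,
    x_{n+1} - x_n - (dt/2) * (f(x_{n+1}) + f(x_n)) ~ 0.
    x: (N, state_dim) observed trajectory sampled at uniform spacing dt.
    The particular scheme is an illustrative assumption; any linear
    multi-step method yields an analogous residual."""
    fx = f_theta(x)                                   # network estimate of dx/dt at each sample
    res = x[1:] - x[:-1] - 0.5 * dt * (fx[1:] + fx[:-1])
    return (res ** 2).mean()

# Usage with a placeholder network mapping 2-D states to derivatives.
f_theta = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 2))
x = torch.randn(100, 2)                               # placeholder 100-sample trajectory
loss = multistep_residual_loss(f_theta, x, dt=0.01)
loss.backward()                                       # gradients for training f_theta
```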

MODEL ARCHITECTURE
Two-dimensional damped oscillator
Chaotic Lorenz system
Hopf bifurcation
Fluid flow behind a cylinder
CONCLUSION