Abstract

This paper proposes a technique to identify nonlinear dynamical systems with time delay, extending the sparse optimization algorithm to this class of systems. The proposed algorithm combines cross-validation techniques from machine learning for automatic model selection with an algebraic operation that preprocesses the signals to filter noise and remove the dependence on initial conditions. We further integrate the bootstrapping resampling technique with the sparse regression to obtain the statistical properties of the estimates, and use a Taylor expansion to parameterize the time delay. The proposed algorithm is computationally efficient and robust to noise. A simulated nonlinear Duffing oscillator demonstrates the efficiency and accuracy of the proposed technique, and an experimental example of a nonlinear rotary flexible joint further validates the method.
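To make the main identification step concrete, the following is a minimal sketch of a SINDy-style sequentially thresholded least-squares regression in which the delayed state is handled by a first-order Taylor expansion, x(t − τ) ≈ x(t) − τẋ(t). The library terms, threshold value, and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Minimal sketch (not the authors' code) of the core sparse regression:
# sequentially thresholded least squares over a library of candidate terms.
# The delayed state is parameterized by a first-order Taylor expansion,
#   x(t - tau) ~= x(t) - tau * dx/dt,
# so the delay only reweights the x and dx/dt columns, and tau can be read
# off from ratios of the identified coefficients. The library terms and the
# threshold value below are illustrative assumptions.

def build_library(x, xdot):
    """Candidate terms for a Duffing-type oscillator (columns = features)."""
    return np.column_stack([x, x**3, xdot, x**2 * xdot])

def stlsq(Theta, target, threshold=0.05, n_iter=10):
    """Sequentially thresholded least squares (SINDy-style sparse regression)."""
    xi = np.linalg.lstsq(Theta, target, rcond=None)[0]
    for _ in range(n_iter):
        small = np.abs(xi) < threshold            # prune negligible terms
        xi[small] = 0.0
        keep = ~small
        if keep.any():                            # refit only the retained terms
            xi[keep] = np.linalg.lstsq(Theta[:, keep], target, rcond=None)[0]
    return xi
```

In practice the threshold would be selected by cross-validation, and refitting `stlsq` on bootstrap resamples of the library rows gives the coefficient statistics mentioned above.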

Highlights

  • Time delay exists in many systems in engineering, physics, chemistry, biology, and economics

  • This paper presents a nonparametric identification technique to identify nonlinear dynamical systems and estimate the time delay introduced by feedback control

  • We extend the sparse identification of nonlinear dynamics (SINDy) approach by combining it with the algebraic signal processing method to deal with the issues of measurement noise, initial conditions, and numerical derivatives (see the sketch after this list)
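
To illustrate how an algebraic-style preprocessing step can eliminate both numerical differentiation and the dependence on initial conditions, here is a minimal sketch based on a modulating-function construction. This is an assumption-based illustration of the idea rather than necessarily the exact algebraic operation used in the paper; the window weight φ(t) = t²(T − t)² and the Duffing-type model form are chosen only for demonstration.

```python
import numpy as np

# Illustrative sketch of derivative-free preprocessing via integration by parts
# (a modulating-function construction; possibly not the paper's exact algebraic
# operation). Over a window [0, T], the weight phi(t) = t^2 (T - t)^2 and its
# first derivative vanish at both endpoints, so for the oscillator
#   xddot + c*xdot + k*x + eps*x**3 = f(t)
# two integrations by parts give one linear equation in (c, k, eps) per window:
#   -c*int(phi' x) + k*int(phi x) + eps*int(phi x^3) = int(phi f) - int(phi'' x)
# Only integrals of the measured x(t) and f(t) appear: no numerical
# differentiation and no initial conditions are needed.

def window_equation(t, x, f):
    """Return (row, rhs) of the regression equation for one data window."""
    T = t[-1] - t[0]
    s = t - t[0]
    phi   = s**2 * (T - s)**2
    dphi  = 2*s*(T - s)**2 - 2*s**2*(T - s)
    ddphi = 2*(T - s)**2 - 8*s*(T - s) + 2*s**2
    row = np.array([
        -np.trapz(dphi * x, t),      # multiplies c
         np.trapz(phi * x, t),       # multiplies k
         np.trapz(phi * x**3, t),    # multiplies eps
    ])
    rhs = np.trapz(phi * f, t) - np.trapz(ddphi * x, t)
    return row, rhs
```

Stacking one such (row, rhs) pair per data window over many windows yields an overdetermined linear system in the unknown coefficients, which the sparse regression sketched above can then solve without ever differentiating the measured signal.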


Summary

Introduction

Time delay exists in many systems in engineering, physics, chemistry, biology, and economics. This paper presents a nonparametric identification technique to identify nonlinear dynamical systems and estimate the time delay introduced by feedback control. The work in [20] presents a nonlinear least-squares-based algorithm in which the instrumental variable method estimates the parameters of the transfer function of the system, while an adaptive gradient-based iteration finds the optimal time delay from the filtered, irregularly sampled data. Because the cost function is generally nonconvex in the time delay, such a gradient iteration can become trapped in local minima; to deal with this issue, the authors use a low-pass filter to widen the convergence region around the global minimum. Another nonlinear recursive optimization algorithm is proposed in [21], which combines the Levenberg-Marquardt method to compute the plant parameters with a modified Gauss-Newton algorithm to estimate the time delays.

Problem definition and assumptions
The proposed method
Algebraic operation
Sparse representation
Cross validation and bootstrapping
Simulated example
Experimental example
Findings
Conclusions