Abstract
This paper considers the robust identification of dual-rate input nonlinear equation-error systems subject to outliers and random time delays. To suppress the negative influence of outliers on identification accuracy, the noise is modelled by a t-distribution rather than a Gaussian distribution. A random time delay is also considered in the dual-rate input nonlinear system. By treating the unknown time delay as a latent variable, an expectation-maximization algorithm is derived for identifying the system. Two numerical simulation examples demonstrate that the proposed algorithm yields accurate identification results when the measurements are contaminated by outliers.
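To illustrate the general idea of replacing Gaussian noise with a t-distribution for outlier robustness, the following minimal sketch shows EM-style estimation of a linear-in-parameters equation-error model with Student's t noise, treated as a Gaussian scale mixture. This is not the paper's algorithm (it ignores the dual-rate sampling and the random time delay); the regressor matrix `Phi`, the degrees of freedom `nu`, and the iteration count are illustrative assumptions.

```python
# Sketch only: robust EM estimation for y = Phi @ theta + v, v ~ t(nu).
# The latent per-sample scale w_i (E-step) down-weights outlying
# measurements in the weighted least-squares M-step.
import numpy as np

def robust_em_t(Phi, y, nu=3.0, n_iter=50):
    N = len(y)
    theta = np.linalg.lstsq(Phi, y, rcond=None)[0]   # Gaussian LS start
    sigma2 = np.mean((y - Phi @ theta) ** 2)
    for _ in range(n_iter):
        # E-step: posterior mean of the latent scale for each sample
        r = y - Phi @ theta
        w = (nu + 1.0) / (nu + r ** 2 / sigma2)
        # M-step: weighted least squares and noise-scale update
        W = np.diag(w)
        theta = np.linalg.solve(Phi.T @ W @ Phi, Phi.T @ W @ y)
        sigma2 = np.sum(w * (y - Phi @ theta) ** 2) / N
    return theta, sigma2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_theta = np.array([1.5, -0.7])
    Phi = rng.normal(size=(200, 2))
    y = Phi @ true_theta + 0.1 * rng.normal(size=200)
    y[::25] += 5.0                                   # inject outliers
    print(robust_em_t(Phi, y)[0])                    # close to true_theta
```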