Abstract

This paper is concerned with estimating the parameters of single-input, continuous-time systems from sampled input–output data, where any input time delay may be a fraction of the sampling interval. The proposed method estimates the parameters by minimizing a cost function defined as the sum of squared errors between the predicted and measured outputs, using the optimal Refined Instrumental Variable method for Continuous-time models (RIVC). Difficulties can be encountered when standard gradient-based optimization is applied to this problem directly. They arise because the cost function is normally multi-modal with respect to the fractional time delay, a feature that can have several causes, such as aliasing errors, the nature of the system dynamics, and the choice of excitation signal. In order to avoid the problems caused by these local minima, a two-stage procedure is proposed that combines RIVC estimation of the model parameters with a grid search over the fractional time delay. Although this algorithm is not quite as computationally efficient as standard gradient-based optimization, it is simple, robust, and more widely applicable, as illustrated by numerical examples.
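To make the two-stage idea concrete, the sketch below pairs an outer grid search over candidate fractional delays with an inner parameter fit, selecting the delay that minimizes the sum of squared prediction errors. This is only an illustrative sketch under simplifying assumptions: the inner estimator is an ordinary least-squares ARX fit used as a stand-in for the RIVC method of the paper, the fractional delay is applied by linear interpolation of the input, and all function names and data are hypothetical.

```python
import numpy as np

def shift_input_fractional(u, tau, dt):
    """Delay the input u by tau seconds (possibly a fraction of the
    sampling interval dt) using linear interpolation between samples."""
    t = np.arange(len(u)) * dt
    return np.interp(t - tau, t, u, left=0.0)

def fit_arx_least_squares(u, y, na=2, nb=2):
    """Inner stage (stand-in for RIVC): ordinary least-squares ARX fit.
    Returns the parameter vector and the sum of squared prediction errors."""
    n = max(na, nb)
    rows, targets = [], []
    for k in range(n, len(y)):
        rows.append(np.concatenate([-y[k - na:k][::-1], u[k - nb:k][::-1]]))
        targets.append(y[k])
    Phi, Y = np.asarray(rows), np.asarray(targets)
    theta, *_ = np.linalg.lstsq(Phi, Y, rcond=None)
    residuals = Y - Phi @ theta
    return theta, float(residuals @ residuals)

def grid_search_delay(u, y, dt, tau_grid):
    """Outer stage: evaluate the prediction-error cost for each candidate
    (fractional) delay on the grid and keep the minimizing one."""
    best = (np.inf, None, None)
    for tau in tau_grid:
        u_shifted = shift_input_fractional(u, tau, dt)
        theta, cost = fit_arx_least_squares(u_shifted, y)
        if cost < best[0]:
            best = (cost, tau, theta)
    return best  # (minimum cost, estimated delay, parameter vector)

if __name__ == "__main__":
    # Hypothetical data: first-order discrete-time system, delay 0.35 s, dt = 0.1 s.
    rng = np.random.default_rng(0)
    dt, a, tau_true = 0.1, 0.8, 0.35
    u = rng.standard_normal(500)
    u_delayed = shift_input_fractional(u, tau_true, dt)
    y = np.zeros_like(u)
    for k in range(1, len(u)):
        y[k] = a * y[k - 1] + 0.5 * u_delayed[k - 1]
    y += 0.01 * rng.standard_normal(len(y))

    cost, tau_hat, theta_hat = grid_search_delay(u, y, dt, np.arange(0.0, 1.0, 0.05))
    print(f"estimated delay: {tau_hat:.2f} s (true {tau_true} s)")
```

Because the cost surface is typically multi-modal only in the delay, the grid handles that single awkward dimension while each inner fit remains a well-behaved estimation problem.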
