Abstract

A hybrid discretization scheme that combines the virtues of the Taylor series and matrix exponential integration methods is proposed. In the algorithm, each sampling interval is divided into two subintervals according to the time delay and the sampling period. The algorithm is computationally inexpensive and lends itself to easy insertion into large simulation packages. The mathematical structure of the new discretization scheme is explored and described in detail. The performance of the proposed procedure is evaluated through case studies covering various input signals, sampling rates, and time-delay values. The results demonstrate that the proposed discretization scheme outperforms the previous Taylor series method for nonlinear time-delay systems, especially when a large sampling period is unavoidable.
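To make the interval splitting concrete, the following is a minimal sketch, not the paper's exact algorithm: it assumes a semilinear test system dx/dt = A x(t) + g(x(t - tau)), holds the delayed state constant on each subinterval in place of the paper's Taylor series expansion, and integrates the resulting affine dynamics exactly with a matrix exponential. The names exp_step and hybrid_step and the toy parameters are illustrative only.

    import numpy as np
    from scipy.linalg import expm

    def exp_step(A, c, x, h):
        # Exact solution of dx/dt = A x + c over a subinterval of length h,
        # computed with one augmented matrix exponential (no inversion of A).
        n = len(x)
        M = np.zeros((n + 1, n + 1))
        M[:n, :n] = A * h
        M[:n, n] = c * h
        Phi = expm(M)
        return Phi[:n, :n] @ x + Phi[:n, n]

    def hybrid_step(A, g, xs, k, T, tau):
        # One step from x(kT) to x((k+1)T) for dx/dt = A x(t) + g(x(t - tau)).
        # Writing tau = (d + frac)*T, the sampling interval is split at
        # kT + frac*T, where the delayed argument t - tau crosses a sampling
        # instant and the stored delayed sample changes.
        d, frac = int(tau // T), (tau % T) / T
        x = xs[k]
        if frac > 0:
            # Subinterval 1: delayed state held at sample k - d - 1.
            x = exp_step(A, g(xs[k - d - 1]), x, frac * T)
        # Subinterval 2: delayed state held at sample k - d.
        return exp_step(A, g(xs[k - d]), x, (1.0 - frac) * T)

    # Toy run: scalar dx/dt = -x(t) + 0.5*sin(x(t - 0.3)) sampled at T = 0.25.
    A = np.array([[-1.0]])
    g = lambda xd: 0.5 * np.sin(xd)
    T, tau, N = 0.25, 0.3, 40
    xs = [np.array([1.0])] * 3          # constant initial samples (sketch only)
    for k in range(2, N):
        xs.append(hybrid_step(A, g, xs, k, T, tau))

The split point at kT + frac*T is exactly where the most recent delayed sample switches, which is the role the abstract assigns to the two subintervals; the paper's hybrid scheme replaces the constant-hold treatment of g used here with a Taylor series approximation on each subinterval.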

