Abstract

While time delays typically degrade control performance, and can even cause instability, previous research shows that time delays can, in some cases, be beneficial. This paper presents a new benefit of time-delayed control (TDC) for single-input single-output (SISO) linear time-invariant (LTI) systems: it can be used to improve robustness. Time delays can be used to approximate state derivative feedback (SDF), which together with state feedback (SF) can reduce sensitivity and improve stability margins. Additional sensors are not required, since the state derivatives are approximated using available measurements and time delays. A systematic design approach, based on the solution of delay differential equations (DDEs) using the Lambert W method, is presented using a scalar example. The method is then applied to both single- and two-degree-of-freedom (DOF) mechanical systems. The simulation results demonstrate good performance with improved stability margins.
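A minimal sketch of the two ideas summarized above, under assumptions not taken from the paper: a scalar plant of the assumed form dx/dt = a*x(t) + b*x(t-h), a backward-difference use of a delayed measurement to approximate the state derivative, and the rightmost characteristic root of the resulting scalar DDE computed with the principal branch of the Lambert W function (s = W_0(b*h*e^{-a*h})/h + a). The gains a, b and delay h are illustrative placeholders, not values from the paper.

```python
# Sketch only: scalar DDE  dx/dt = a*x(t) + b*x(t-h)  with assumed coefficients.
import numpy as np
from scipy.special import lambertw

a, b, h = -1.0, -0.5, 0.1   # illustrative plant gain, delayed-feedback gain, delay


def approx_state_derivative(x_now, x_delayed, h):
    """Delay-based backward-difference estimate: xdot(t) ~ (x(t) - x(t-h)) / h,
    using only measured (current and delayed) states, no extra sensors."""
    return (x_now - x_delayed) / h


# Characteristic equation of the scalar DDE: s = a + b*exp(-s*h).
# Rearranging gives (s - a)*h*exp((s - a)*h) = b*h*exp(-a*h), so the rightmost
# root follows from the principal branch W_0 of the Lambert W function.
s0 = lambertw(b * h * np.exp(-a * h), k=0) / h + a
print("rightmost characteristic root:", s0)
print("stable" if s0.real < 0 else "unstable")
```

With the values above, the rightmost root is real and negative, so the delayed-feedback loop in this toy example is stable; in the paper's framework this root (and hence the stability margin) is shaped by choosing the feedback gains and delay.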
