Abstract
While time delays typically degrade control performance, and can even cause instability, previous research shows that time delays can, in some cases, be beneficial. This paper presents a new benefit of time-delayed control (TDC) for single-input single-output (SISO) linear time-invariant (LTI) systems: it can be used to improve robustness. Time delays can be used to approximate state-derivative feedback (SDF), which, together with state feedback (SF), can reduce sensitivity and improve stability margins. Additional sensors are not required, since the state derivatives are approximated using available measurements and time delays. A systematic design approach, based on the solution of delay differential equations (DDEs) using the Lambert W function method, is presented using a scalar example. The method is then applied to both single- and two-degree-of-freedom (DOF) mechanical systems. Simulation results demonstrate excellent performance with improved stability margins.
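The core idea, approximating a state derivative from delayed measurements, can be illustrated with a minimal sketch. This is not the paper's Lambert W based design; it only shows the underlying finite-difference idea, ẋ(t) ≈ (x(t) − x(t − h))/h, on a scalar plant ẋ = ax + u. All numerical values (a, k, kd, h, dt) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def simulate(a=1.0, k=4.0, kd=0.5, h=0.01, dt=0.001, t_end=5.0, x0=1.0):
    """Euler simulation of a scalar plant xdot = a*x + u under
    u = -k*x - kd*xdot_est, where xdot_est is built from a time delay.
    Gains and delay are illustrative, not from the paper."""
    n = int(t_end / dt)
    d = int(h / dt)                              # delay expressed in time steps
    x = np.empty(n + 1)
    x[0] = x0
    for i in range(n):
        x_delayed = x[i - d] if i >= d else x0   # constant initial history
        xdot_est = (x[i] - x_delayed) / h        # delay-based derivative estimate
        u = -k * x[i] - kd * xdot_est            # approximates SF + SDF
        x[i + 1] = x[i] + dt * (a * x[i] + u)
    return x

x = simulate()
print(abs(x[-1]))
```

For small h the closed loop behaves approximately like (1 + kd)ẋ = (a − k)x, so the open-loop-unstable plant (a > 0) is stabilized using only current and delayed measurements of x, with no derivative sensor.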
Journal of Dynamic Systems, Measurement, and Control