Abstract

Wide-area control has been proposed to enhance the dynamic performance of interconnected power systems. However, the time delays introduced by wide-area signal transmission can degrade the performance of a damping controller for interarea oscillations. Based on the Padé approximation, the authors convert the time delay into a state-space expression and build a linear model of the closed-loop power system with time delay; they then construct a model that relates the time delay to the critical modes of the power system. The Jacobi-Davidson method is employed to study the influence of the time delay on the interarea, local, and time-delay control modes. Simulation results demonstrate the validity of this method in characterizing the influence of time delay and provide theoretical support for the design of interarea damping controllers in large-scale power systems.
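The core modeling step named in the abstract can be illustrated with a minimal sketch (not taken from the paper): a first-order Padé approximation of a transport delay, e^{-sT} ≈ (1 − sT/2)/(1 + sT/2), realized as a single-state state-space system. The function names and the choice of a first-order approximant are illustrative assumptions; the paper may use a higher-order Padé expansion.

```python
import numpy as np

def pade1_delay_ss(T):
    """First-order Pade approximation of a delay e^{-sT}.

    H(s) = (1 - sT/2) / (1 + sT/2), realized as
    x' = A x + B u,  y = C x + D u.
    (Illustrative sketch; the paper may use a higher-order approximant.)
    """
    A = np.array([[-2.0 / T]])
    B = np.array([[1.0]])
    C = np.array([[4.0 / T]])
    D = np.array([[-1.0]])
    return A, B, C, D

def tf_eval(A, B, C, D, s):
    """Evaluate the transfer function H(s) = C (sI - A)^{-1} B + D."""
    n = A.shape[0]
    return (C @ np.linalg.solve(s * np.eye(n) - A, B) + D).item()

A, B, C, D = pade1_delay_ss(0.1)            # 100 ms transmission delay
print(abs(tf_eval(A, B, C, D, 0.0)))        # DC gain of a pure delay is 1
print(abs(tf_eval(A, B, C, D, 1j * 10.0)))  # all-pass: |H(jw)| = 1
```

Because the delay block is now a finite-dimensional linear system, its state matrix can be appended to the linearized power-system model, so the combined closed-loop eigenvalues (including the time-delay modes mentioned in the abstract) can be computed with standard eigensolvers such as Jacobi-Davidson.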
