Abstract

Distributed Energy Resources (DERs) installed on low-voltage distribution systems, both utility-owned and non-utility-owned, can be employed to improve system resiliency and power quality in addition to serving load demand. By selectively dispatching active and reactive power from these existing DERs, voltage profiles and service availability during adverse events can be improved. Further benefits can be gained by optimally choosing which DERs to dispatch power from and how power is distributed among them. For a given operating condition, Linear Programming methods can determine optimal power dispatch values for individual DERs while accounting for variables such as voltage gains, losses in the lines and the DERs, and DER reserve capacities. One such optimization technique is developed in this paper, and results from a real-time simulation study on a modified IEEE 13-bus distribution system with three DERs are presented. The operation of the optimized DER selection and dispatch algorithm is shown during normal operation of the transmission system, when the DERs operate in grid-connected mode. Similar analyses can drive decision making during transmission system failure, when the DERs operate in grid-forming or islanded mode.
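To illustrate the kind of Linear Programming formulation the abstract describes, the following is a minimal sketch of loss-minimizing active-power dispatch across three DERs. All numbers (reserve capacities, loss coefficients, load demand) are hypothetical placeholders, not values from the paper, and the single equality constraint stands in for the full set of network constraints (voltage gains, line and converter losses) that the paper's formulation would include.

```python
# Minimal LP sketch of DER active-power dispatch, assuming hypothetical
# per-DER loss coefficients and reserve capacities (NOT from the paper).
import numpy as np
from scipy.optimize import linprog

# Three DERs, as in the modified IEEE 13-bus study.
capacity = np.array([100.0, 150.0, 80.0])   # kW reserve capacity per DER (hypothetical)
loss_coeff = np.array([0.04, 0.02, 0.05])   # kW lost per kW dispatched (hypothetical)
demand = 200.0                              # kW of load the DERs must serve together

# Minimize total losses sum(loss_coeff[i] * P[i])
# subject to sum(P) == demand and 0 <= P[i] <= capacity[i].
res = linprog(
    c=loss_coeff,                 # linear loss cost per DER
    A_eq=np.ones((1, 3)),         # power balance: dispatched power meets demand
    b_eq=[demand],
    bounds=list(zip(np.zeros(3), capacity)),  # respect each DER's reserve capacity
    method="highs",
)

dispatch = res.x  # optimal active-power set-point for each DER (kW)
```

As expected for a pure linear cost, the solver saturates the lowest-loss DER first and fills the remainder from the next cheapest, leaving the lossiest DER idle. A real formulation would add reactive-power variables and linearized voltage constraints per bus.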
