Abstract

Over the last several years, many utilities have installed systems, as part of a smart grid program, to perform voltage reduction or conservation voltage reduction (also known as Integrated Volt VAr Control). The goal of these programs is to reduce load, and therefore generation, in order to reduce carbon emissions. The basic concept is to lower the voltage across the distribution system in order to lower demand while keeping the voltage within ANSI ranges for every customer. ANSI standards require that the voltage the utility provides at the point of common coupling (typically the revenue meter) be within 5 percent of the nominal voltage, either above or below. A typical 120 VAC connection is therefore allowed to vary between 114 and 126 VAC. The ratio of load reduction to voltage reduction is referred to as the CVR factor; most utilities strive for a CVR factor of 0.8-1.0 but typically obtain actual CVR factors of 0.6-0.8, which makes it harder to cost-justify system-wide deployments of the technology. To keep all voltages within these limits at all times, utilities typically deploy LTC transformers, bus regulators, or feeder regulators at the source, and then use line regulators along with fixed and switched capacitors downstream to maintain the voltage across the entire feeder circuit. This paper explores the coordination of these devices by assigning each a zone, and points out a deficiency in current systems: one zone is not being supported. By adding regulation to the missing zone, additional voltage reduction can be obtained, increasing the CVR factor.
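The two quantities the abstract relies on, the ANSI service-voltage band and the CVR factor, can be illustrated with a short sketch. This is not code from the paper; the function and variable names are illustrative, and the 2.5 percent / 2.0 percent figures below are hypothetical numbers chosen to land in the typical observed range.

```python
NOMINAL_V = 120.0   # nominal service voltage (VAC)
TOLERANCE = 0.05    # +/- 5 percent allowed at the point of common coupling

def ansi_band(nominal=NOMINAL_V, tol=TOLERANCE):
    """Return (low, high) allowable service voltage under the +/-5% rule."""
    return nominal * (1.0 - tol), nominal * (1.0 + tol)

def cvr_factor(pct_load_reduction, pct_voltage_reduction):
    """CVR factor = percent load reduction / percent voltage reduction."""
    return pct_load_reduction / pct_voltage_reduction

low, high = ansi_band()        # 114.0 and (approximately) 126.0 VAC
f = cvr_factor(2.0, 2.5)       # a 2.5% voltage cut yielding a 2.0% load cut
print(f"band: {low:.0f}-{high:.0f} VAC, CVR factor: {f:.2f}")
```

With these hypothetical numbers the CVR factor comes out to 0.8, the top of the range utilities typically obtain; reaching the 1.0 end of the aspirational range would require the same voltage reduction to produce a matching percentage load reduction.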
