Abstract

In a residential multicarrier energy system (RMES), autonomous energy management (AEM) that actively controls generation, energy conversion, and storage in real time can reduce prosumers' costs. Conventional model-based AEM methods rely on forecast models of distributed energy resources and accurate system parameters, which are difficult to obtain in practice. This article proposes a novel model-free two-timescale real-time AEM strategy for the RMES. The energy management problem in the RMES is decomposed into two timescales (hour-ahead external energy trading and 15-minute-ahead internal energy conversion) and formulated as Markov games. Defining a multi-agent system not only enables separate energy management schemes but also accelerates learning. Intelligent agents are responsible for optimizing energy trading and energy conversion to minimize daily costs, and are trained with the deep deterministic policy gradient (DDPG) algorithm. To learn a collaborative strategy, all agents are trained centrally but executed in a decentralized manner based on local information for fast response. A deterministic study indicates that the proposed AEM strategy can effectively and flexibly schedule the operation of system components under different price signals and load profiles. In a stochastic study that considers error thresholds for solar panel generation and loads, the energy cost of the trained strategy on test scenarios differs from that of the no-regret learning method by only 0.32%, which is acceptable. In addition, compared with various benchmark methods, the proposed method reduces power imbalance and achieves lower energy costs.
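As a rough illustration of the centralized-training, decentralized-execution layout described above, the sketch below sets up two DDPG-style actors (an hour-ahead trading agent and a 15-minute-ahead conversion agent) and a centralized critic over the joint observation-action space. This is not the authors' implementation: the network sizes, observation contents, and action dimensions are illustrative assumptions, and PyTorch is used only for convenience.

```python
# Minimal sketch (assumed, not the paper's code) of a two-agent actor-critic
# layout: each agent has its own actor for decentralized execution, while a
# centralized critic scores the joint observation-action pair during training.
import torch
import torch.nn as nn

class Actor(nn.Module):
    """Maps a local observation to a bounded continuous action (decentralized execution)."""
    def __init__(self, obs_dim, act_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(obs_dim, 64), nn.ReLU(),
            nn.Linear(64, act_dim), nn.Tanh(),  # actions scaled to [-1, 1]
        )

    def forward(self, obs):
        return self.net(obs)

class CentralCritic(nn.Module):
    """Evaluates the joint observation-action pair (centralized training)."""
    def __init__(self, joint_obs_dim, joint_act_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(joint_obs_dim + joint_act_dim, 64), nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, joint_obs, joint_act):
        return self.net(torch.cat([joint_obs, joint_act], dim=-1))

# Hypothetical dimensions: the hour-ahead trading agent might observe prices
# and storage state; the 15-minute conversion agent might observe PV output,
# loads, and storage state.
trading_actor = Actor(obs_dim=4, act_dim=1)      # e.g., grid import/export set-point
conversion_actor = Actor(obs_dim=6, act_dim=2)   # e.g., converter and storage set-points
critic = CentralCritic(joint_obs_dim=4 + 6, joint_act_dim=1 + 2)

# Decentralized execution: each agent acts on local information only.
obs_trading = torch.randn(1, 4)
obs_conversion = torch.randn(1, 6)
with torch.no_grad():
    a_trading = trading_actor(obs_trading)
    a_conversion = conversion_actor(obs_conversion)

# Centralized critic sees the joint observations and actions during training.
q_value = critic(torch.cat([obs_trading, obs_conversion], dim=-1),
                 torch.cat([a_trading, a_conversion], dim=-1))
```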
