Abstract

Energy management plays a pivotal role in ensuring the stable operation of microgrids, facilitating enhanced monitoring, analysis, and optimization of energy utilization for end-users. The implementation of an energy management system (EMS) offers invaluable assistance to microgrid users, enabling them to cut electricity costs, ensure reliability, and provide an elevated level of comfort. However, as the scope of microgrid management grows, balancing the comfort of community microgrid users, who operate an increasing number of devices, against profitability becomes increasingly challenging. To address this issue, this paper proposes a novel community control approach for EMS, called CuEMS, that utilizes deep reinforcement learning (DRL) to manage multiple appliances and energy storage systems (ESSs). The proposed approach optimizes energy profitability for operators while taking user comfort into account. It reformulates energy scheduling as a Markov decision process and learns the optimal scheduling strategy through deep Q-networks. The presented CuEMS approach is evaluated using real-world data from Finland, and the results demonstrate that it can optimize the operation of multiple appliances and improve profitability and user comfort compared to other methods.
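To make the Markov-decision-process formulation concrete, the sketch below trains a tabular Q-learning agent on a toy single-appliance scheduling problem. This is only an illustration of the cost-versus-comfort trade-off described above, not the paper's method: CuEMS uses deep Q-networks over multiple appliances and an ESS, and all prices, penalties, and hyperparameters here are assumed values.

```python
import random

# Toy MDP: schedule one shiftable appliance over four price slots.
# State = (time slot, has the appliance already run?); actions: 0 = wait, 1 = run.
PRICES = [5.0, 1.0, 3.0, 2.0]   # assumed electricity prices per slot
COMFORT_PENALTY = 10.0          # assumed penalty if the appliance never runs
ALPHA, GAMMA, EPS = 0.5, 0.95, 0.1

def step(t, done, action):
    """Apply one action; return (next_t, next_done, reward, terminal)."""
    if action == 1 and not done:
        reward, done = -PRICES[t], True   # pay the current price to run now
    else:
        reward = 0.0
    t += 1
    terminal = t == len(PRICES)
    if terminal and not done:
        reward -= COMFORT_PENALTY         # user comfort violated: never ran
    return t, done, reward, terminal

def train(episodes=5000, seed=0):
    rng = random.Random(seed)
    Q = {(t, d): [0.0, 0.0] for t in range(len(PRICES)) for d in (False, True)}
    for _ in range(episodes):
        t, done, terminal = 0, False, False
        while not terminal:
            s = (t, done)
            # epsilon-greedy exploration over the two actions
            if rng.random() < EPS:
                a = rng.randrange(2)
            else:
                a = max((0, 1), key=lambda x: Q[s][x])
            t, done, r, terminal = step(t, done, a)
            target = r if terminal else r + GAMMA * max(Q[(t, done)])
            Q[s][a] += ALPHA * (target - Q[s][a])
    return Q

Q = train()
```

After training, the greedy policy waits in the expensive first slot and runs the appliance in the cheapest slot, meeting the comfort constraint at minimum cost; a deep Q-network replaces the table when the state space (many appliances, ESS charge level, forecasts) is too large to enumerate.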
