Abstract

In the near future, microgrids will become more prevalent as they play a critical role in integrating distributed renewable energy resources into the main grid. Nevertheless, renewable energy sources, such as solar and wind, can be extremely volatile as they are weather dependent. These resources, coupled with demand, can lead to random variations on both the generation and load sides, complicating optimal energy management. In this article, a reinforcement learning approach is proposed to deal with this non-stationary scenario, in which the energy management system (EMS) is modelled as a Markov decision process (MDP). A novel modification of the control problem is presented that improves the use of energy stored in the battery so that the dynamic demand is not subjected to future high grid tariffs. A comprehensive reward function is also developed, which reduces infeasible action exploration and thus improves the performance of the data-driven technique. A Q-learning algorithm is then proposed to minimize the operational cost of the microgrid under unknown future information. To assess the performance of the proposed EMS, a comparison study between a trading EMS model and a non-trading case is performed using a typical commercial load curve and PV profile over a 24-h horizon. Numerical simulation results indicate that the agent learns to select an optimized energy schedule that minimizes energy cost (the cost of power purchased from the utility plus battery wear cost) in all the studied cases. Comparing operational costs, the trading EMS model was found to decrease costs relative to the non-trading EMS by 4.033% in the summer season and 2.199% in the winter season.

Highlights

  • Increasing interest in renewable energy sources has led to massive deployment of microgrids, as they offer a scalable way of integrating renewable sources into the main grid while allowing maximum usage of the battery energy storage system.

  • The grid power exchange limits are set as 0 ≤ Pgp(t) ≤ Pgp,max and 0 ≤ Pgs(t) ≤ Pgs,max. The microgrid owner and the distribution system operator (DSO) have a contract that governs the maximum power that can be exchanged between the microgrid and the utility at the point of common coupling (PCC).

  • An analysis of how energy stored in the battery energy storage system (BESS) is used, as the energy management system (EMS) seeks to meet the net demand Pl,t, is carried out. With regard to system running cost, charging the battery when tariffs are low and discharging it when tariffs are high is important so as to reap some revenue.
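The charge-low/discharge-high reasoning in the highlights above can be sketched as a simple rule-based dispatcher. This is only an illustrative baseline, not the paper's learned policy; the threshold values and function name are assumptions introduced here.

```python
def dispatch(tariff, low_thr, high_thr, soc, soc_min, soc_max):
    """Rule-based battery dispatch: charge when the grid tariff is low,
    discharge when it is high, subject to state-of-charge (SoC) limits.
    Thresholds and SoC bounds are illustrative assumptions."""
    if tariff <= low_thr and soc < soc_max:
        return "charge"       # cheap energy: store it in the battery
    if tariff >= high_thr and soc > soc_min:
        return "discharge"    # expensive energy: serve demand from the battery
    return "idle"             # mid-range tariff or SoC limit reached

# Example with hypothetical tariffs ($/kWh) and a half-full battery:
print(dispatch(0.08, 0.10, 0.20, 0.5, 0.1, 0.9))  # charge
print(dispatch(0.25, 0.10, 0.20, 0.5, 0.1, 0.9))  # discharge
```

The Q-learning agent described in the article effectively learns thresholds like these from experience rather than having them fixed in advance.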


Summary

Introduction

Increasing interest in renewable energy sources has led to massive deployment of microgrids, as they offer a scalable way of integrating renewable sources into the main grid while allowing maximum usage of the battery energy storage system. To minimize operational costs, a Q-learning based algorithm is implemented to learn the control actions for the battery energy storage system (BESS) under a very complex environment (e.g., battery degradation, intermittent renewable energy supply, and grid tariff uncertainty). The novelty of the paper is the design of an energy storage strategy that focuses on energy consumption optimization by maximizing the use of available PV energy and energy stored in the battery, instead of focusing solely on direct storage control. In this architecture, excess microgrid energy can be sold back to the utility to increase revenue. A non-trading scheme has also been studied, in which constraining rules are embedded into the learning process to prevent excess energy from being sold back to the utility.
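The tabular Q-learning scheme described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the state encoding (hour, battery state of charge), the 24-h tariff profile, the reward (negative energy cost), and all hyperparameter values are assumptions introduced here.

```python
import random

random.seed(0)  # for reproducibility of this sketch

ACTIONS = ["charge", "idle", "discharge"]
# Hypothetical 24-h time-of-use tariff ($/kWh): cheap night, expensive midday.
TARIFF = [0.10] * 8 + [0.25] * 8 + [0.15] * 8
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1  # learning rate, discount, exploration

def step(hour, soc, action):
    """Advance one hour; return next state and reward (negative cost)."""
    if action == "charge" and soc < 4:
        soc += 1
        cost = TARIFF[hour]      # buy 1 kWh from the grid to charge
    elif action == "discharge" and soc > 0:
        soc -= 1
        cost = -TARIFF[hour]     # serve 1 kWh of demand from the battery
    else:
        cost = 0.0               # idle, or action infeasible at SoC limit
    return (hour + 1) % 24, soc, -cost

Q = {}
def q(hour, soc, a):
    return Q.get((hour, soc, a), 0.0)

for episode in range(2000):
    hour, soc = 0, 2
    for _ in range(24):
        # Epsilon-greedy action selection.
        if random.random() < EPS:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: q(hour, soc, x))
        nh, nsoc, r = step(hour, soc, a)
        # Standard Q-learning update.
        best_next = max(q(nh, nsoc, x) for x in ACTIONS)
        Q[(hour, soc, a)] = q(hour, soc, a) + ALPHA * (
            r + GAMMA * best_next - q(hour, soc, a)
        )
        hour, soc = nh, nsoc
```

Under this toy reward, the agent tends toward charging during the cheap night hours and discharging during the expensive midday block, which mirrors the qualitative behaviour the article reports for its learned EMS.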

Energy Management System Problem Formulation
Objective
Utility Grid Model
Markov Decision Framework as Applied to EMS Formulation
State and State Space Formulation
Action and Action Space Formulation
Reward Function Formulation
Q-Learning Algorithm for Energy Management Problem
Simulation Setup
Simulation
Summer Solar PV and Grid Tariff Profile
Reward Convergence during Summer
Results
Operational Cost during Summer
Winter Solar PV and Grid Tariff Profile
Training
Results
Operational Cost during Winter
Conclusions