Abstract

The hierarchical control of a DC microgrid regulates the terminal voltages of the interfacing converters to achieve proportional load sharing and good voltage regulation at the DC bus. In doing so, the voltage differences between nodes increase, which raises the circulating current and leads to higher losses. In this paper, a Reinforcement Learning Based Integrated Control (RLIC) is proposed to minimize the circulating current and the transmission power losses. The proposed RLIC consists of a primary and a secondary controller. The primary controller is a robust sliding mode controller that receives voltage references from the secondary controller and regulates the terminal voltage and source current accordingly. The secondary controller consists of a proportional-integral (PI) controller and a Deep Neural Network (DNN) surrogate model, with Q-Learning as the reinforcement learning method. The novel DNN-based surrogate model takes the droop value from the PI controller and estimates the power loss and the difference between local and global loading for a particular node under a given set of operating conditions. The Q-Learning based reinforcement technique uses this surrogate model to adjust the droop constants and provides the voltage references to the primary controller, maintaining load sharing while reducing power losses and thereby improving overall efficiency. The proposed control structure is verified to improve efficiency while maintaining load sharing and bus voltage regulation.
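The sketch below illustrates the secondary-layer idea described in the abstract: a Q-Learning agent nudges a node's droop constant up or down, scoring each candidate with a surrogate model that estimates power loss and the local/global loading difference. It is a minimal illustration, not the authors' implementation: the `SurrogateDNN` class is an untrained placeholder standing in for the paper's DNN, and the droop grid, reward weights `w_loss`/`w_share`, and learning hyperparameters are illustrative assumptions.

```python
# Hedged sketch of Q-Learning over droop constants with a DNN surrogate.
# All network weights, grids, and reward weights are illustrative assumptions.
import numpy as np

class SurrogateDNN:
    """Placeholder surrogate: maps (droop, operating condition) to estimated
    (power loss, local-vs-global loading difference). Untrained random MLP."""
    def __init__(self, rng):
        self.W1 = rng.normal(size=(2, 16)); self.b1 = np.zeros(16)
        self.W2 = rng.normal(size=(16, 2)); self.b2 = np.zeros(2)

    def predict(self, droop, load_level):
        h = np.tanh(np.array([droop, load_level]) @ self.W1 + self.b1)
        out = h @ self.W2 + self.b2
        # Interpret outputs as non-negative loss and load-sharing error.
        return abs(out[0]), abs(out[1])

def q_learning_droop(surrogate, load_level, episodes=200, alpha=0.1,
                     gamma=0.9, eps=0.2, w_loss=1.0, w_share=5.0, rng=None):
    rng = rng if rng is not None else np.random.default_rng(0)
    droop_grid = np.linspace(0.1, 2.0, 20)   # discretised droop constants (states)
    actions = [-1, 0, +1]                    # decrease / keep / increase droop index
    Q = np.zeros((len(droop_grid), len(actions)))
    s = len(droop_grid) // 2
    for _ in range(episodes):
        # Epsilon-greedy action selection.
        a = rng.integers(len(actions)) if rng.random() < eps else int(np.argmax(Q[s]))
        s_next = int(np.clip(s + actions[a], 0, len(droop_grid) - 1))
        loss, share_err = surrogate.predict(droop_grid[s_next], load_level)
        # Reward penalises estimated power loss and load-sharing error.
        reward = -(w_loss * loss + w_share * share_err)
        Q[s, a] += alpha * (reward + gamma * np.max(Q[s_next]) - Q[s, a])
        s = s_next
    # Return the droop constant whose state has the best learned value.
    return droop_grid[int(np.argmax(np.max(Q, axis=1)))]

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    best_droop = q_learning_droop(SurrogateDNN(rng), load_level=0.7, rng=rng)
    print(f"selected droop constant: {best_droop:.2f}")
```

In the paper's scheme the selected droop value would set the voltage reference passed to the primary sliding mode controller; here that step is omitted and only the droop selection loop is shown.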
