Abstract

This paper presents a power distribution system that prioritizes the reliability of power delivered to critical loads within a community. The proposed system uses a reinforcement learning method (Q-learning) to train multi-port power electronic interface (MPEI) systems within a community of microgrids. The primary contributions of this article are to present a system in which Q-learning is integrated with MPEIs to reduce the impact of power contingencies on critical loads and to evaluate the effectiveness of the resulting system. The feasibility of the proposed method is verified through simulation and experiments, which show that it can effectively improve the reliability of the local power system: in a case study where 20% of the total load is classified as critical, the system average interruption duration index (SAIDI) improves by 75% compared with a traditional microgrid with no load scheduling.
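For context, the reliability index quoted above and the learning rule named in the abstract follow their standard forms; the paper's specific state, action, and reward design is not given here, so the expressions below are generic textbook definitions rather than the authors' exact formulation.

Q(s_t, a_t) \leftarrow Q(s_t, a_t) + \alpha \left[ r_{t+1} + \gamma \max_{a'} Q(s_{t+1}, a') - Q(s_t, a_t) \right]

\mathrm{SAIDI} = \frac{\sum_i r_i N_i}{N_T}

Here \alpha is the learning rate, \gamma the discount factor, r_i the restoration duration of interruption event i, N_i the number of customers affected by event i, and N_T the total number of customers served (per the IEEE 1366 definition of SAIDI).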
