Abstract

The integration of electric vehicles (EVs) into vehicle-to-grid (V2G) scheduling offers a promising opportunity to enhance the profitability of multi-energy microgrid operators (MMOs). MMOs aim to maximize their total profit by coordinating V2G scheduling and the multi-energy flexible loads of end-users while adhering to operational constraints. However, scheduling V2G strategies online is challenging due to uncertainties such as electricity prices and EV arrival/departure patterns. To address this, we propose an online V2G scheduling framework based on deep reinforcement learning (DRL) to optimize EV battery utilization in microgrids with multiple energy sources. First, we formulate an online scheduling model that jointly manages V2G and multi-energy flexible demands, cast as a Markov decision process (MDP) with unknown transition probabilities. Second, the Soft Actor-Critic (SAC) algorithm is used to efficiently train the neural networks and dynamically schedule EV charging and discharging in response to real-time grid conditions and energy demand patterns. Extensive case-study simulations verify the effectiveness of the proposed approach. The results validate the efficacy of the DRL-based online V2G scheduling framework, highlighting its potential to improve profitability and sustainability in multi-energy microgrid operations.
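The following is a minimal sketch, not the authors' implementation, of how such an MDP could be wrapped as a reinforcement-learning environment and trained with an off-the-shelf SAC agent. The environment name (V2GMicrogridEnv), the state and action definitions, the reward terms, and all numeric parameters are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: V2G + flexible-demand scheduling as an MDP, trained with SAC.
# All modeling choices and constants below are assumptions for illustration only.
import numpy as np
import gymnasium as gym
from gymnasium import spaces
from stable_baselines3 import SAC


class V2GMicrogridEnv(gym.Env):
    """Assumed MDP: observation = (hour, EV state of charge, electricity price,
    flexible multi-energy demand); action = (EV charge/discharge power, share of
    flexible demand served now). Transition dynamics are unknown to the agent."""

    def __init__(self, horizon=24):
        super().__init__()
        self.horizon = horizon
        self.observation_space = spaces.Box(low=0.0, high=1.0, shape=(4,), dtype=np.float32)
        # action[0] in [-1, 1]: discharge (V2G) vs. charge; action[1] in [0, 1]: flexible load served
        self.action_space = spaces.Box(low=np.array([-1.0, 0.0], dtype=np.float32),
                                       high=np.array([1.0, 1.0], dtype=np.float32),
                                       dtype=np.float32)

    def reset(self, *, seed=None, options=None):
        super().reset(seed=seed)
        self.t, self.soc = 0, 0.5
        return self._obs(), {}

    def _obs(self):
        price = 0.5 + 0.4 * np.sin(2 * np.pi * self.t / self.horizon)   # assumed price profile
        demand = float(self.np_random.uniform(0.2, 0.8))                # assumed flexible demand
        self._price, self._demand = price, demand
        return np.array([self.t / self.horizon, self.soc, price, demand], dtype=np.float32)

    def step(self, action):
        charge, served = float(action[0]), float(action[1])
        self.soc = float(np.clip(self.soc + 0.1 * charge, 0.0, 1.0))    # simple battery model
        # MMO profit proxy: revenue from discharging at high prices and from serving
        # flexible demand, minus a penalty for a low EV battery at departure.
        reward = -self._price * 0.1 * charge + 0.5 * served * self._demand
        self.t += 1
        terminated = self.t >= self.horizon
        if terminated and self.soc < 0.5:
            reward -= (0.5 - self.soc)                                   # departure SoC penalty
        return self._obs(), reward, terminated, False, {}


if __name__ == "__main__":
    env = V2GMicrogridEnv()
    model = SAC("MlpPolicy", env, verbose=0)
    model.learn(total_timesteps=10_000)                   # train actor-critic networks
    obs, _ = env.reset()
    action, _ = model.predict(obs, deterministic=True)    # online scheduling decision
```

Under these assumptions, the trained policy maps each real-time observation directly to a charging/discharging and load-serving decision, which is what makes the scheduling usable online despite the unknown transition dynamics.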
