Abstract

The integration of self-driving functions into electric vehicles is radically changing transportation systems and presents an opportunity for power utilities to develop innovative solutions that harness the spatio-temporal charging flexibility of autonomous electric vehicles (AEVs). This paper develops a multi-agent reinforcement learning model for intelligent, real-time coordinated operation of the interdependent autonomous electric ride-hailing system (AERS) and power distribution system (PDS). The model coordinates the routing and charging of AEVs while ensuring quality of service by meeting spatio-temporal passenger demand and regularly recharging vehicle batteries. It adopts a modified deep Q-network method with multi-agent rollout to obtain a near-optimal solution to the coordinated operation problem, determining the routing and charging of numerous AEVs in real time. The proposed model is tested on a 13-node transportation network adapted from the Salt Lake City transportation system and the IEEE 33-bus test power distribution system, demonstrating its efficiency in determining real-time routing and charging decisions for AEVs while satisfying the operational constraints of both the AERS and the PDS.
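To illustrate the kind of decision step the abstract describes, the sketch below shows a minimal multi-agent rollout with a shared Q-network: each AEV selects a routing/charging action greedily, conditioning on the actions already fixed by previously decided AEVs. All names, feature dimensions, the action set, and the network architecture are illustrative assumptions for exposition, not the paper's actual implementation.

```python
# Hypothetical sketch of an agent-by-agent (multi-agent rollout) decision step.
# Dimensions, features, and actions are assumed for illustration only.
import torch
import torch.nn as nn

STATE_DIM = 16   # assumed encoding of the joint AERS/PDS system state
AGENT_DIM = 4    # assumed per-AEV features (location, state of charge, occupancy, ...)
N_ACTIONS = 5    # assumed discrete actions: serve a request, reposition, charge, ...

class QNetwork(nn.Module):
    """Shared Q-network scoring actions for one AEV given the system state
    and a summary of actions already committed by previously decided AEVs."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM + AGENT_DIM + N_ACTIONS, 64),
            nn.ReLU(),
            nn.Linear(64, N_ACTIONS),
        )

    def forward(self, state, agent_feat, prior_action_counts):
        x = torch.cat([state, agent_feat, prior_action_counts], dim=-1)
        return self.net(x)

def multi_agent_rollout_step(q_net, system_state, agent_features):
    """Decide actions for all AEVs one at a time (agent-by-agent rollout)."""
    prior_counts = torch.zeros(N_ACTIONS)  # summary of actions fixed so far
    actions = []
    for feat in agent_features:
        with torch.no_grad():
            q_values = q_net(system_state, feat, prior_counts)
        a = int(torch.argmax(q_values))    # greedy with respect to the learned Q-function
        actions.append(a)
        prior_counts[a] += 1.0             # later agents condition on this choice
    return actions

if __name__ == "__main__":
    q_net = QNetwork()
    state = torch.randn(STATE_DIM)
    fleet = [torch.randn(AGENT_DIM) for _ in range(3)]  # three illustrative AEVs
    print(multi_agent_rollout_step(q_net, state, fleet))
```

Deciding agents sequentially in this way keeps the per-step action space linear in the number of AEVs rather than exponential, which is the usual motivation for rollout-style multi-agent decision making.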
