Electric vehicles have gained widespread recognition as an environmentally conscious mode of transportation, owing to significant technological advancements and reduced emissions. When integrated into smart grid infrastructure, these vehicles can serve as flexible energy consumers and potential energy storage units. However, efficiently managing charging stations under dynamic load demand and pricing poses a significant challenge: charging must be scheduled to reduce wait times for electric vehicle owners while lowering electricity costs. To address this challenge, this study proposes a solution based on a dynamic pricing policy and an adaptive Markov decision process, using a reinforcement learning-enabled enhanced multi-agent neural network (EMANN) to optimize charging schedules. The adaptive Markov decision process achieves optimal scheduling by shifting charging to low-cost off-peak hours while completing it in the shortest possible time. Furthermore, a constrained policy is employed to train the enhanced neural network via reinforcement learning, allowing the network to learn and solve the adaptive Markov decision process directly. To evaluate the effectiveness of this solution, the study conducted a numerical experiment using real-time data from Tesla electric vehicles. Combined with queuing models, the EMANN yielded strong results, including peak savings of 19.05% and an average profit of $376.1 per kilowatt-hour, achieved within a network convergence time of 512 s. The proposed scheduling algorithm is thus shown to be efficient in addressing the complexities of managing electric vehicle charging stations.
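The abstract does not specify the EMANN architecture or the exact Markov decision process formulation. As a rough illustration only, the minimal sketch below frames off-peak charging as a small tabular MDP solved with standard Q-learning: the state is the (hour, state-of-charge) pair, actions are discrete charge rates, and the reward trades off the time-varying energy price against a waiting penalty. All names and values here (PRICES, WAIT_PENALTY, the hourly tariff, the battery size) are hypothetical placeholders, not the paper's data or method.

```python
import numpy as np

# Hypothetical time-of-use tariff ($/kWh) over 24 hourly slots:
# cheap off-peak at night, an expensive evening peak.
PRICES = np.array([0.08] * 6 + [0.15] * 10 + [0.30] * 4 + [0.12] * 4)

N_HOURS = 24          # discrete decision epochs (one per hour)
MAX_SOC = 10          # battery state of charge, in discrete energy units
ACTIONS = [0, 1, 2]   # kWh charged this hour (0 = idle)

ALPHA, GAMMA, EPS = 0.1, 0.95, 0.1
WAIT_PENALTY = 0.05   # per-hour penalty while the EV is not fully charged

# Q-table indexed by (hour, soc, action)
Q = np.zeros((N_HOURS, MAX_SOC + 1, len(ACTIONS)))

def step(hour, soc, action):
    """Simplified transition: pay for the energy drawn, penalize waiting."""
    energy = min(ACTIONS[action], MAX_SOC - soc)
    cost = PRICES[hour] * energy
    soc_next = soc + energy
    reward = -cost - (WAIT_PENALTY if soc_next < MAX_SOC else 0.0)
    return soc_next, reward

rng = np.random.default_rng(0)
for episode in range(5000):
    soc = int(rng.integers(0, MAX_SOC))  # random state of charge on arrival
    for hour in range(N_HOURS):
        # epsilon-greedy action selection
        if rng.random() < EPS:
            a = int(rng.integers(len(ACTIONS)))
        else:
            a = int(np.argmax(Q[hour, soc]))
        soc_next, r = step(hour, soc, a)
        # standard Q-learning update; the horizon ends at the last hour
        target = r if hour == N_HOURS - 1 else r + GAMMA * Q[hour + 1, soc_next].max()
        Q[hour, soc, a] += ALPHA * (target - Q[hour, soc, a])
        soc = soc_next

# Greedy rollout: schedule for an EV arriving empty at midnight
soc, plan = 0, []
for hour in range(N_HOURS):
    a = int(np.argmax(Q[hour, soc]))
    plan.append(ACTIONS[a])
    soc, _ = step(hour, soc, a)
print("hourly charge plan (kWh):", plan)
```

Under this toy tariff the learned policy concentrates charging in the cheap night-time slots, which is the qualitative behavior the abstract describes; the paper's constrained multi-agent formulation and queuing model are beyond this single-agent sketch.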