Abstract

Profit maximization of electric vehicle charging station (EVCS) operation encourages increased investment in the deployment of EVCSs, thereby increasing the penetration of electric vehicles (EVs) and supporting high-quality charging service for EV users. However, existing model-based approaches for profit maximization of EVCSs may exhibit poor performance owing to the underutilization of massive data and inaccurate modeling of EVCS operation in a dynamic environment. Furthermore, the existing approaches can be vulnerable to adversaries that abuse private EVCS operation data for malicious purposes. To resolve these limitations, we propose a privacy-preserving distributed deep reinforcement learning (DRL) framework that maximizes the profits of multiple smart EVCSs integrated with photovoltaic and energy storage systems under a dynamic pricing strategy. In the proposed framework, DRL agents using the soft actor–critic method determine the schedules of the profitable selling price and charging/discharging energy for EVCSs. To preserve the privacy of EVCS operation data, a federated reinforcement learning method is adopted in which only the local and global neural network models of the DRL agents are exchanged between the DRL agents at the EVCSs and the global agent at the central server, without sharing EVCS data. Numerical examples demonstrate the effectiveness of the proposed approach in terms of convergence of the training curve for the DRL agent, adaptive profitable selling price, energy charging and discharging, sensitivity to the selling price factor, and varying weather conditions.
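The federated exchange described above, in which only neural network parameters travel between the EVCS agents and the central server, can be sketched as a simple layer-wise parameter-averaging step. This is an illustrative FedAvg-style aggregation under assumed shapes and names, not the paper's actual implementation:

```python
import numpy as np

def federated_average(local_weights):
    """Aggregate per-agent model parameters by layer-wise averaging.

    local_weights: one parameter list (one array per layer) per EVCS
    agent. Only these parameters are exchanged; raw EVCS operation
    data (prices, charging schedules) never leaves the local agent.
    """
    n_agents = len(local_weights)
    n_layers = len(local_weights[0])
    return [sum(w[layer] for w in local_weights) / n_agents
            for layer in range(n_layers)]

# Three hypothetical EVCS agents, each holding two parameter arrays
agents = [[np.full((2, 2), k), np.full(2, k)] for k in (1.0, 2.0, 3.0)]
global_model = federated_average(agents)
# Every entry of the aggregated model equals the mean (2.0) here
```

In a full federated reinforcement learning loop, the server would broadcast `global_model` back to each EVCS agent, which resumes local soft actor–critic training from the averaged parameters.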
