Abstract

Fast charging stations (FCSs) can reduce the charging time of electric vehicles (EVs) and thus help drive their widespread adoption. However, FCSs may overload the power system, so deploying a battery energy storage system (BESS) at the FCS is a potential solution to avoid such overloads. The optimal operation of a BESS-equipped FCS is nevertheless challenging due to several uncertainties, such as EV arrival/departure times and electricity prices. Therefore, in this study, a deep reinforcement learning-based method is proposed to operate FCSs with BESS under these uncertainties. The state-of-the-art soft actor-critic (SAC) method is adopted, and the model is trained on one year of data to cover seasonality and different day types (working days and holidays). The performance of SAC is compared with two other deep reinforcement learning methods, namely deep deterministic policy gradient (DDPG) and twin delayed deep deterministic policy gradient (TD3). A comprehensive reward function is devised to train the model offline; the trained model can then be used for the real-time operation of the FCS with BESS under different uncertainties. The trained model successfully reduces the peak load of the FCS on both weekdays and holidays by optimizing the operation of the BESS. In addition, the robustness of the proposed model is evaluated against different EV arrival scenarios and extreme market price scenarios. Simulation results show that the proposed model reduces the peak load of the FCS as intended under diverse conditions.
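The abstract mentions a comprehensive reward function used to train the agent but does not give its form. As a minimal sketch of the general idea, the per-step reward below combines a peak-load penalty with the electricity cost of the step; the weights, peak limit, and one-hour time step are hypothetical assumptions, not the paper's actual formulation:

```python
# Illustrative sketch only: the paper's actual reward function is not stated in
# the abstract, so the terms, weights, and time step here are assumptions.
# Intuition: the agent is rewarded for keeping the FCS's grid draw below a
# soft peak limit and for buying energy when prices are low.

def step_reward(grid_load_kw, peak_limit_kw, price_per_kwh,
                w_peak=10.0, w_cost=1.0, dt_h=1.0):
    """Hypothetical per-step reward for a BESS-controlling RL agent.

    grid_load_kw  -- net power drawn from the grid (EV demand plus BESS
                     charging, minus BESS discharging)
    peak_limit_kw -- soft cap on grid draw the operator wants to respect
    price_per_kwh -- current electricity market price
    """
    # Penalize only the portion of grid load that exceeds the peak limit.
    overload_kw = max(0.0, grid_load_kw - peak_limit_kw)

    # Energy cost of this control step (dt_h hours long).
    energy_cost = price_per_kwh * grid_load_kw * dt_h

    # RL convention: reward is the negative of the total penalty.
    return -(w_peak * overload_kw + w_cost * energy_cost)
```

With a reward of this shape, discharging the BESS during an EV arrival peak lowers `grid_load_kw`, removing the overload penalty, while charging during cheap off-peak hours keeps the cost term small; this is consistent with the peak-shaving behaviour the abstract reports.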
