Abstract

With the increasing popularity of plug-in electric vehicles (PEVs), charging infrastructure is becoming widely available and offers multiple services to PEV users. Each charging service has a distinct quality of service (QoS) level that matches user expectations. Charging service demand is interdependent, i.e., the demand for one service is often affected by the prices of the others. Dynamic pricing of charging services is a coordination mechanism for QoS satisfaction across service classes. In this article, we propose a differentiated pricing mechanism for a multiservice PEV charging infrastructure (EVCI). The proposed framework motivates PEV users to avoid over-utilization of particular service classes. Currently, most dynamic pricing schemes require full knowledge of customer-side information; however, such information is stochastic, non-stationary, and expensive to collect at scale. Our proposed pricing mechanism utilizes model-free deep reinforcement learning (RL) to learn and improve automatically without an explicit model of the environment. We formulate our framework to adopt the twin delayed deep deterministic policy gradient (TD3) algorithm. The simulation results demonstrate that the proposed RL-based differentiated pricing scheme can adaptively adjust service pricing for a multiservice EVCI to maximize charging facility utilization while ensuring service quality satisfaction.
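To make the pricing problem concrete, the following is a minimal sketch of the kind of environment such an RL agent would interact with. Everything here is a hypothetical illustration, not the paper's model: the number of service classes, the linear interdependent-demand function, the capacities, the price bounds, and the QoS penalty weight are all invented for the example, and the TD3 actor-critic networks are replaced by a simple noisy random search (only the clipped-Gaussian exploration step mirrors TD3's target-smoothing idea) so the snippet stays self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CLASSES = 3             # hypothetical number of service classes
P_MIN, P_MAX = 0.1, 1.0   # illustrative price bounds ($/kWh)

def demand(prices):
    """Toy interdependent demand: each class's demand falls with its own
    price and rises with the other classes' prices (substitution)."""
    base = np.array([40.0, 30.0, 20.0])
    own = -25.0 * prices
    cross = 5.0 * (prices.sum() - prices)
    return np.clip(base + own + cross, 0.0, None)

def reward(prices, capacity=np.array([35.0, 25.0, 15.0])):
    """Revenue from served demand minus a QoS penalty for overload."""
    d = demand(prices)
    served = np.minimum(d, capacity)
    revenue = float(np.dot(prices, served))
    overload = np.clip(d - capacity, 0.0, None).sum()  # QoS violation
    return revenue - 2.0 * overload

def explore(action, sigma=0.05, clip=0.1):
    """TD3-style clipped Gaussian noise around a deterministic action."""
    noise = np.clip(rng.normal(0.0, sigma, size=action.shape), -clip, clip)
    return np.clip(action + noise, P_MIN, P_MAX)

# Random-search stand-in for the actor update (no neural networks here):
best_prices = np.full(N_CLASSES, 0.5)
best_r = reward(best_prices)
for _ in range(2000):
    cand = explore(best_prices)
    r = reward(cand)
    if r > best_r:
        best_prices, best_r = cand, r
```

In the full TD3 formulation, `explore` would perturb the actor network's output, twin critics would estimate the return of each price vector, and the actor would be updated less frequently than the critics; the objective shape (utilization revenue traded off against a QoS-violation penalty) is the part this toy loop is meant to convey.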
