Abstract

With the development of mobility-on-demand and transportation electrification technologies, electric vehicle (EV)-based ride-hailing fleets are playing an increasingly important role in urban ground transportation systems. Due to the stochastic nature of order request arrivals and electricity prices, ride-hailing EVs operating in order-grabbing mode face decision-making risks. It is therefore important to investigate their risk-aware operation and to model its impact on fleet charging demand and trajectories. In this paper, we propose a distributional reinforcement learning framework to model the risk-aware operation of ride-hailing EVs in order-grabbing mode. First, we develop a risk quantification scheme based on the dual theory of choice under risk. Then, we combine the Implicit Quantile Network, distorted quantile sampling, and distributional temporal difference learning to capture the intrinsic uncertainties and characterize risk-aware EV operation decisions. The proposed framework provides a more accurate spatial-temporal portrayal of charging demand and fleet management results. Real-world data from Haikou city are used to illustrate and verify the effectiveness of the proposed scheme.
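To make the distorted quantile sampling idea concrete, the following is a minimal sketch of risk-aware action selection on top of an implicit-quantile-style critic. It is not the paper's implementation: the Wang transform below is only one example distortion (the paper derives its distortion from the dual theory of choice under risk), and `quantile_fn` is a hypothetical stand-in for a trained Implicit Quantile Network head.

```python
import numpy as np
from statistics import NormalDist

_std_normal = NormalDist()

def wang_distortion(tau, eta=-0.75):
    """Example Wang transform g(tau) = Phi(Phi^{-1}(tau) + eta).
    eta < 0 overweights low-return quantiles (risk-averse);
    eta > 0 overweights high-return quantiles (risk-seeking)."""
    return _std_normal.cdf(_std_normal.inv_cdf(tau) + eta)

def risk_aware_action(quantile_fn, state, n_actions, n_samples=32, eta=-0.75):
    """Pick the action maximizing a distorted expectation of the return
    distribution, approximated by sampled quantile fractions.

    `quantile_fn(state, tau, action)` is assumed to return the estimated
    tau-quantile of the return Z(s, a) from a trained quantile network;
    the name and signature are illustrative, not from the paper."""
    # Sample base quantile fractions away from 0 and 1 so inv_cdf is defined.
    taus = np.random.uniform(1e-6, 1.0 - 1e-6, size=n_samples)
    distorted = [wang_distortion(t, eta) for t in taus]
    scores = [
        np.mean([quantile_fn(state, t, a) for t in distorted])
        for a in range(n_actions)
    ]
    return int(np.argmax(scores))
```

Under this kind of scheme, setting the distortion parameter toward risk aversion biases the agent's order-grabbing and charging choices toward actions whose worst-case returns are acceptable, rather than maximizing expected return alone.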
