Abstract

There is growing interest in optimizing vehicle fleet management in urban environments. However, limited attention has been paid to the integrated optimization of electric taxi fleets that accounts for multiple operations as well as complex spatiotemporal demand dynamics. To this end, this study develops a real-time recommendation framework based on deep reinforcement learning (DRL) for electric taxis (E-taxis) to improve their system performance, with explicit modeling of multiple vehicle actions and travel demand that varies across space and over time. Spatiotemporal patterns of urban taxi trips are extracted from large-scale taxi trajectories. Spatiotemporal strategies are proposed to coordinate E-taxis’ repositioning and recharging through optimized recommendations of next destinations and charging stations. A spatiotemporal double deep Q-network (ST-DDQN) is embedded in the DRL framework to maximize daily profit. A prototype real-time recommendation system is implemented to support the decision-making of E-taxi drivers, and sensitivity analyses are carried out. Experimental results from Shenzhen, China suggest that the proposed framework improves overall fleet performance. This study will benefit the promotion of connected E-taxis and the development of clean and smart transportation.
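The ST-DDQN component builds on the standard double DQN update, in which the online network selects the next action and a separate target network evaluates it, reducing the value overestimation of plain Q-learning. The sketch below illustrates only that generic target computation; the function and variable names are illustrative and not taken from the paper, which additionally conditions on spatiotemporal state features.

```python
import numpy as np

def double_dqn_targets(rewards, next_q_online, next_q_target, dones, gamma=0.99):
    """Double-DQN bootstrap target for a batch of transitions.

    Action selection uses the online network's Q-values; action
    evaluation uses the target network's Q-values.
    """
    # Pick the greedy next action under the online network.
    best_actions = np.argmax(next_q_online, axis=1)
    # Evaluate that action with the (slower-moving) target network.
    next_values = next_q_target[np.arange(len(rewards)), best_actions]
    # Terminal transitions (dones == 1) contribute no bootstrap value.
    return rewards + gamma * next_values * (1.0 - dones)

# Toy batch of two transitions with two actions each (hypothetical values).
rewards = np.array([1.0, 0.5])
next_q_online = np.array([[0.2, 0.8], [0.6, 0.1]])
next_q_target = np.array([[0.3, 0.5], [0.4, 0.2]])
dones = np.array([0.0, 1.0])
targets = double_dqn_targets(rewards, next_q_online, next_q_target, dones)
```

In the first transition the online network prefers action 1, so the target network's value for action 1 (0.5) is bootstrapped; the second transition is terminal, so its target is just the reward.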
