Abstract

Rough pavements cause ride discomfort and energy inefficiency for road vehicles. Existing methods to address these problems are time-consuming and do not adapt to changing driving conditions on rough pavements. With the development of sensor and communication technologies, crowdsourced road and dynamic traffic information has become available for enhancing driving performance, in particular for addressing discomfort and inefficiency through speed control. This study proposes a speed control framework for rough pavements, envisioned for the operation of autonomous vehicles based on crowdsourced data. We introduce the concept of a ‘maximum comfortable speed’ to represent the vertical ride comfort of the oncoming road. A deep reinforcement learning (DRL) algorithm is designed to learn comfortable and energy-efficient speed control strategies. The DRL-based speed control model is trained using real-world rough pavement data from Shanghai, China. The experimental results show that, compared to an optimization-based speed control model, vertical ride comfort, energy efficiency, and computational efficiency improve by 8.22%, 24.37%, and 94.38%, respectively. The results indicate that the proposed framework is effective for real-time speed control of autonomous vehicles on rough pavements.
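The abstract describes the approach only at a high level. As a rough illustration of how a comfort-plus-energy reward can drive speed control against a 'maximum comfortable speed' profile, the sketch below uses tabular Q-learning as a simplified stand-in for the paper's DRL agent; the state discretization, action set, reward weights, and pavement profile are all illustrative assumptions, not the paper's actual design.

```python
import numpy as np

# Illustrative sketch only: a tabular Q-learning stand-in for the paper's DRL agent.
# State: the vehicle speed (binned). Action: a discrete acceleration command.
# Reward: penalize exceeding the segment's 'maximum comfortable speed' (ride comfort)
# and penalize acceleration effort (a crude proxy for energy use). Weights are assumed.

COMFORT_WEIGHT = 1.0                   # assumed weight on the comfort penalty
ENERGY_WEIGHT = 0.1                    # assumed weight on the energy penalty
ACTIONS = np.array([-1.0, 0.0, 1.0])   # decelerate / hold / accelerate (m/s^2)

def step_reward(speed, max_comfortable_speed, accel):
    comfort_penalty = max(0.0, speed - max_comfortable_speed) ** 2
    energy_penalty = accel ** 2
    return -(COMFORT_WEIGHT * comfort_penalty + ENERGY_WEIGHT * energy_penalty)

def run_episode(profile, q_table, dt=1.0, eps=0.1, alpha=0.1, gamma=0.95):
    """One pass over a pavement profile, updating a coarse speed-binned Q-table."""
    speed, total = 10.0, 0.0
    for max_comfortable_speed in profile:
        s = int(np.clip(speed, 0, 29))                     # state index by speed bin
        a = (np.random.randint(len(ACTIONS)) if np.random.rand() < eps
             else int(np.argmax(q_table[s])))
        accel = ACTIONS[a]
        r = step_reward(speed, max_comfortable_speed, accel)
        speed = max(0.0, speed + accel * dt)
        s_next = int(np.clip(speed, 0, 29))
        q_table[s, a] += alpha * (r + gamma * q_table[s_next].max() - q_table[s, a])
        total += r
    return total

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical 'maximum comfortable speed' values for consecutive rough segments.
    profile = rng.uniform(8.0, 20.0, size=200)
    q = np.zeros((30, len(ACTIONS)))
    for episode in range(50):
        ret = run_episode(profile, q)
    print(f"return after training: {ret:.1f}")
```

In the paper's setting, the tabular Q-table would be replaced by a deep network, and the state would include crowdsourced road and traffic information; the sketch only shows how the comfort and energy terms can be traded off in a single reward signal.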
