Abstract

Traffic estimation is essential for fundamental transportation engineering tasks such as transportation planning and traffic safety studies, and traffic prediction is vital for many data-driven intelligent transportation system applications. Most traffic estimation and prediction methods rely on infrastructure-based sensors to collect traffic parameters. However, infrastructure-based data collection can be costly and time-consuming to set up and maintain, and the spatial coverage of the collected data is limited by the locations of the deployed hardware sensors. Probe vehicle data can overcome these limitations. Modeling traffic for estimation and prediction is a complex task because traffic parameters exhibit stochastic, nonlinear spatiotemporal dependencies. In this paper, a deep learning-based sequence-to-sequence architecture called Seq2seq GCN-LSTM is proposed to estimate and predict network-wide traffic volume and speed. The proposed method uses short-term historical traffic data collected from a low-penetration-rate probe vehicle fleet to estimate and predict traffic parameters up to 60 minutes ahead, employing Graph Convolutional Networks (GCNs) to extract spatial dependencies and Long Short-Term Memory (LSTM) networks to model temporal dependencies. The proposed method produced superior traffic estimation and prediction results compared to the baseline models and demonstrated robustness against perturbations caused by the low probe vehicle penetration rate. Furthermore, the penetration rate was varied to test its effect on the method's modeling capability: the model maintained traffic volume and speed estimation and prediction performance within a reasonable margin of error at penetration rates as low as 1.5% and 0.5%, respectively.
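
The abstract names the building blocks (a GCN spatial encoder feeding an LSTM encoder-decoder) but gives no implementation details. The sketch below is a minimal illustration in PyTorch of how such a Seq2seq GCN-LSTM can be wired together; all layer sizes, the history and horizon lengths, the normalized adjacency `a_hat`, and the names `GCNLayer` and `Seq2seqGCNLSTM` are assumptions for illustration, not the authors' implementation.

```python
# Minimal Seq2seq GCN-LSTM sketch (assumed structure, not the paper's code).
# The GCN uses a Kipf-and-Welling-style propagation H' = A_hat @ H @ W,
# which is one common choice; the paper may use a different variant.
import torch
import torch.nn as nn

class GCNLayer(nn.Module):
    """One graph convolution over the road network graph."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, a_hat):
        # x: (batch, nodes, in_dim); a_hat: (nodes, nodes) normalized adjacency
        return torch.relu(a_hat @ self.linear(x))

class Seq2seqGCNLSTM(nn.Module):
    """Encode a short history of per-segment probe observations,
    then decode a multi-step horizon of network-wide estimates."""
    def __init__(self, n_nodes, in_dim, hid_dim, horizon):
        super().__init__()
        self.gcn = GCNLayer(in_dim, hid_dim)
        self.encoder = nn.LSTM(n_nodes * hid_dim, hid_dim, batch_first=True)
        self.decoder = nn.LSTM(hid_dim, hid_dim, batch_first=True)
        self.head = nn.Linear(hid_dim, n_nodes)  # one value per segment per step
        self.horizon = horizon

    def forward(self, x_seq, a_hat):
        # x_seq: (batch, time, nodes, features)
        b, t, n, f = x_seq.shape
        # Spatial extraction per time step, then flatten node embeddings.
        spatial = [self.gcn(x_seq[:, i], a_hat).reshape(b, -1) for i in range(t)]
        _, (h, c) = self.encoder(torch.stack(spatial, dim=1))
        # Repeat the encoder's final hidden state as decoder input for each step.
        dec_in = h[-1].unsqueeze(1).repeat(1, self.horizon, 1)
        out, _ = self.decoder(dec_in, (h, c))
        return self.head(out)  # (batch, horizon, nodes)

# Toy usage: 12 history steps of 2 features per segment -> 12-step horizon
# (e.g., 5-minute bins covering a 60-minute prediction window, an assumption).
n_nodes, horizon = 8, 12
a_hat = torch.eye(n_nodes)  # placeholder normalized adjacency of the road graph
model = Seq2seqGCNLSTM(n_nodes, in_dim=2, hid_dim=32, horizon=horizon)
pred = model(torch.randn(4, 12, n_nodes, 2), a_hat)
print(pred.shape)  # torch.Size([4, 12, 8])
```

A design note on the assumed wiring: feeding the encoder's final state to every decoder step is one simple seq2seq decoding scheme; autoregressive decoding, where each predicted step is fed back as the next decoder input, is an equally common alternative and may be closer to what the paper uses.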
