Abstract
Accurate spatial-temporal traffic modeling and prediction play an important role in intelligent transportation systems (ITS). Recently, deep learning methods such as graph convolutional networks (GCNs) and recurrent neural networks (RNNs) have been widely adopted in traffic prediction tasks to extract spatial-temporal dependencies from large volumes of high-quality training data. However, some transportation networks suffer from data scarcity, and in these cases the performance of traditional GCN- and RNN-based approaches degrades sharply. To address this problem, this paper proposes an adversarial domain adaptation with spatial-temporal graph convolutional network (Ada-STGCN) model that predicts traffic indicators for a data-scarce target road network by transferring knowledge from a data-sufficient source road network. Specifically, Ada-STGCN first develops a spatial-temporal graph convolutional network that combines a GCN with a gated recurrent unit (GRU) to extract spatial-temporal dependencies from the source and target road networks. Adversarial domain adaptation is then integrated with this network to learn discriminative, domain-invariant features that facilitate knowledge transfer. Experimental results on real-world traffic datasets for the traffic flow prediction task demonstrate that our model yields the best prediction performance compared with state-of-the-art baselines.
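The spatial-temporal backbone described above (a graph convolution per time step, whose output feeds a GRU) can be sketched roughly as follows. This is a minimal illustrative example, not the authors' implementation: all sizes, weights, and function names are assumptions, and the adversarial domain-adaptation component (in DANN-style models, typically a domain classifier trained through a gradient-reversal layer) is omitted for brevity.

```python
import numpy as np

def gcn_layer(A, X, W):
    # Symmetrically normalized graph convolution: D^{-1/2} (A + I) D^{-1/2} X W.
    # Adding the identity gives each node a self-loop before normalization.
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.tanh(d_inv_sqrt @ A_hat @ d_inv_sqrt @ X @ W)

def gru_step(h, x, params):
    # Standard GRU cell update, applied to the GCN output for one time step.
    Wz, Uz, Wr, Ur, Wh, Uh = params
    sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))
    z = sigmoid(x @ Wz + h @ Uz)               # update gate
    r = sigmoid(x @ Wr + h @ Ur)               # reset gate
    h_tilde = np.tanh(x @ Wh + (r * h) @ Uh)   # candidate hidden state
    return (1 - z) * h + z * h_tilde

rng = np.random.default_rng(0)
N, F, H, T = 4, 3, 8, 5      # nodes, input features, hidden size, time steps (illustrative)
A = np.triu((rng.random((N, N)) < 0.5).astype(float), 1)
A = A + A.T                   # symmetric road-network adjacency, no self-loops
W = rng.standard_normal((F, H)) * 0.1
params = [rng.standard_normal((H, H)) * 0.1 for _ in range(6)]

# Unroll over time: spatial aggregation via the GCN, temporal memory via the GRU.
h = np.zeros((N, H))
for t in range(T):
    X_t = rng.standard_normal((N, F))   # stand-in for traffic features at time t
    h = gru_step(h, gcn_layer(A, X_t, W), params)

print(h.shape)  # final hidden state: one H-dimensional vector per road-network node
```

In the full model, the hidden state `h` from both source and target networks would feed a prediction head and, adversarially, a domain discriminator so that the learned features become domain-invariant.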
Published in: IEEE Transactions on Intelligent Transportation Systems