Abstract
The proliferation of trajectory data has facilitated various applications in urban spaces, such as travel time estimation, traffic monitoring, and flow prediction. These applications require a substantial volume of high-quality trajectories as a prerequisite for effective performance. Unfortunately, a large number of real-world trajectories are inevitably collected at unsatisfactory quality due to device constraints. To address this issue, previous studies have proposed numerous trajectory recovery methods to improve the quality of such trajectories, thereby ensuring the performance of related applications. However, these methods all assume that the recovery positions are known in advance, a condition not always available in practice. In this paper, we discard this strong assumption and focus on trajectory recovery with irregular time intervals, a more prevalent setting in downstream scenarios. We propose a novel framework, called TERI, to tackle trajectory recovery without prior information in a two-stage process, where recovery positions are first detected and the missing data points are then imputed. In each stage, TERI deploys a model named RETE, which is based on the Transformer encoder architecture and enhanced with novel designs to boost performance in the new problem setting. Specifically, RETE features a learnable Fourier encoding module to better model spatial and temporal correlations, and integrates collective transition pattern learning and trajectory contrastive learning to effectively capture sequential transition patterns. Extensive experiments on three real-world datasets demonstrate that TERI consistently outperforms all baselines by a significant margin.
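To make the learnable Fourier encoding idea concrete, below is a minimal illustrative sketch of how such a module for spatial-temporal points could look in PyTorch. All names, dimensions, and design choices here (the class name `LearnableFourierEncoding`, the use of sin/cos of learned linear projections followed by an MLP) are assumptions for illustration only and are not taken from the paper's implementation of RETE.

```python
import torch
import torch.nn as nn


class LearnableFourierEncoding(nn.Module):
    """Illustrative learnable Fourier feature encoding for a
    (latitude, longitude, timestamp) point. Frequencies are trained
    rather than fixed as in standard sinusoidal positional encoding.
    This is a hypothetical sketch, not the paper's actual module."""

    def __init__(self, in_dim: int = 3, num_freqs: int = 32, out_dim: int = 128):
        super().__init__()
        # Learnable frequency matrix: maps raw coordinates to num_freqs phases.
        self.freqs = nn.Linear(in_dim, num_freqs, bias=False)
        # Small MLP that mixes the sin/cos features into the model dimension.
        self.mlp = nn.Sequential(
            nn.Linear(2 * num_freqs, out_dim),
            nn.GELU(),
            nn.Linear(out_dim, out_dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, in_dim) raw spatial-temporal coordinates.
        phases = self.freqs(x)                                        # (B, L, num_freqs)
        feats = torch.cat([torch.sin(phases), torch.cos(phases)], dim=-1)
        return self.mlp(feats)                                        # (B, L, out_dim)


if __name__ == "__main__":
    enc = LearnableFourierEncoding()
    pts = torch.rand(4, 20, 3)   # toy batch: 4 trajectories, 20 points each
    print(enc(pts).shape)        # torch.Size([4, 20, 128])
```

In such a design, the resulting embeddings would typically be fed to a Transformer encoder, with the learned frequencies allowing the model to adapt to irregular time intervals instead of relying on fixed positional indices.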