Abstract
Pedestrian trajectory prediction in urban environments has emerged as a critical research area with extensive applications across various domains. Accurate prediction of pedestrian trajectories is essential for the safe navigation of autonomous vehicles and robots in pedestrian-populated environments. Effective prediction models must capture both the spatial interactions among pedestrians and the temporal dependencies governing their movements. Existing research primarily focuses on forecasting a single trajectory per pedestrian, limiting its applicability in real-world scenarios characterised by diverse and unpredictable pedestrian behaviours. To address these challenges, this paper introduces the Graph Convolutional Network, Spatial–Temporal Attention, and Generative Model (GSTGM) for pedestrian trajectory prediction. GSTGM employs a spatial–temporal graph convolutional network to effectively capture complex interactions between pedestrians and their environment. Additionally, it integrates a spatial–temporal attention mechanism to prioritise relevant information during the prediction process. By incorporating a time-dependent prior within the latent space and utilising a computationally efficient generative model, GSTGM facilitates the generation of diverse and realistic future trajectories. The effectiveness of GSTGM is validated through experiments on real-world datasets. Compared with state-of-the-art models on benchmark datasets such as ETH/UCY, GSTGM demonstrates superior performance in accurately predicting multiple potential trajectories for individual pedestrians, as measured by Final Displacement Error (FDE) and Average Displacement Error (ADE). Moreover, GSTGM achieves these results with significantly faster processing speeds.
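The ADE and FDE metrics cited in the abstract have standard definitions in the trajectory-prediction literature: ADE averages the Euclidean error over every predicted timestep, while FDE measures the error at the final timestep only. A minimal NumPy sketch of both metrics follows; the function name and array layout are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def ade_fde(pred, gt):
    """Compute Average and Final Displacement Error for one trajectory.

    pred, gt: arrays of shape (T, 2) holding (x, y) positions per timestep.
    Returns (ADE, FDE): mean per-step Euclidean error, and the error at
    the last predicted timestep.
    """
    dists = np.linalg.norm(pred - gt, axis=1)  # per-step Euclidean error
    return dists.mean(), dists[-1]
```

For multi-modal prediction (multiple sampled trajectories per pedestrian, as GSTGM produces), benchmarks typically report the minimum ADE/FDE over the sampled set.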