Abstract

Diverse and plausible multi-modal pedestrian trajectory prediction is vital for the safety of intelligent autonomous systems. Most existing methods focus on modeling the extrinsic interactions among pedestrians but neglect the intrinsic properties of pedestrian motion, which can lead to unrealistic predicted movements. We investigate the natural motion of pedestrians and explore the future motion patterns of pedestrians with similar observed trajectories, providing a new perspective on multi-modal pedestrian trajectory prediction. On this basis, we propose a novel model that learns diverse trajectories from human motion patterns, composed mainly of a motion pattern selector and a multi-modal trajectory generator. Using a constructed pattern gallery, the motion pattern selector mines possible future motion patterns based on similar observed trajectories. The retrieved motion patterns are then fed into the multi-modal trajectory generator, where they are selected and refined by a scoring network and a regression network to directly generate multiple diverse trajectories. We compare our method with several baselines on the public ETH/UCY pedestrian trajectory datasets. Experimental results show that, by fully leveraging the intrinsic information of trajectories, our model outperforms most state-of-the-art approaches in both accuracy and diversity.
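The selector/generator pipeline described above might be sketched roughly as follows. This is a minimal illustration, not the paper's method: all function names are hypothetical, the selector is reduced to a nearest-neighbour retrieval over the pattern gallery, and the scoring and regression networks are replaced by simple placeholders (distance-based ranking and anchoring each retrieved pattern to the last observed position).

```python
import numpy as np

def build_gallery(trajectories, t_obs):
    """Split full trajectories into (observed, future) pattern pairs.
    Each trajectory is an array of shape (T, 2) of x/y positions."""
    return [(t[:t_obs], t[t_obs:]) for t in trajectories]

def select_patterns(gallery, observed, k=3):
    """Motion pattern selector (illustrative): retrieve the k future
    patterns whose observed segments are closest to the query in L2
    distance. The paper uses a learned selector; this is a stand-in."""
    dists = [np.linalg.norm(obs - observed) for obs, _ in gallery]
    idx = np.argsort(dists)[:k]
    return [gallery[i][1] for i in idx]

def refine(patterns, observed):
    """Placeholder for the scoring/regression networks: anchor each
    retrieved future pattern so it starts at the pedestrian's last
    observed position, yielding multiple candidate trajectories."""
    last = observed[-1]
    return [p - p[0] + last for p in patterns]
```

For example, a query pedestrian walking in the +x direction would retrieve the future pattern of the gallery trajectory that also moves in +x, and `refine` would translate that pattern to continue from the query's last observed position.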
