Abstract

Pedestrians are among the most vulnerable users of urban roads, and ensuring their safety is a pressing challenge in the field of intelligent transportation. Multiple pedestrian tracking is a key technology for applications such as traffic statistics and abnormal behavior analysis. Detection-based tracking methods have achieved remarkable results and have become the mainstream processing scheme; however, target association remains immature and less effective in complex scenarios. In the proposed tracking system, several candidates surrounding each detected pedestrian are sparsely selected, and the association between the target and these candidates is determined from a graph attention map. These graph attention maps encode the positional correlations of matching pairs and remain applicable under pedestrians' posture variations. A weighted correlating value is estimated from the positional weighting matrix and the merged attention map, and the association is confirmed using this value together with a distance matching loss. To improve the computational efficiency of the graph attention maps for the tracked targets and candidates, feature extraction is performed separately: convolutional features extracted from one specific middle layer of the backbone network represent each target or candidate. Experimental results show that the proposed tracker outperforms five state-of-the-art trackers on three publicly available databases.
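The association step described in the abstract can be sketched schematically. The abstract gives no implementation details, so the function below is a hypothetical illustration of position-aware association between tracked targets and sparsely selected candidates, combining an appearance-similarity term (from mid-layer convolutional features) with a positional weighting and a distance penalty. All names (`associate`, `sigma`, `dist_weight`) and the exact scoring formula are assumptions for illustration, not the authors' method.

```python
import numpy as np

def associate(target_feats, cand_feats, target_pos, cand_pos,
              sigma=1.0, dist_weight=0.5):
    """Hypothetical sketch: score each (target, candidate) pair and
    pick the best candidate per target.

    target_feats, cand_feats: (T, D) and (C, D) feature vectors,
        standing in for mid-layer convolutional features.
    target_pos, cand_pos: (T, 2) and (C, 2) box centers.
    """
    # Appearance similarity: cosine similarity between feature vectors.
    t = target_feats / np.linalg.norm(target_feats, axis=1, keepdims=True)
    c = cand_feats / np.linalg.norm(cand_feats, axis=1, keepdims=True)
    appearance = t @ c.T                      # (T, C)

    # Positional weighting: Gaussian falloff with center distance,
    # standing in for the positional weighting matrix.
    d = np.linalg.norm(target_pos[:, None, :] - cand_pos[None, :, :], axis=2)
    pos_weight = np.exp(-(d ** 2) / (2.0 * sigma ** 2))

    # Weighted correlating score minus a distance matching penalty.
    score = appearance * pos_weight - dist_weight * d
    return score.argmax(axis=1), score
```

In a full tracker this greedy per-target argmax would typically be replaced by a global assignment (e.g. Hungarian matching) over the score matrix; the sketch only shows how appearance and position can be fused into one correlating value.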

