Abstract

This paper proposes a multi-pedestrian tracking method that effectively tracks multiple pedestrians in complex and occluded environments. Pedestrian similarity metrics are calculated using a hierarchical Siamese Neural Network (SNN), which provides similarity scores for candidate pedestrian image regions across a time series of images. Inter-frame pedestrian motion affinity is calculated using a difference method and merged with the similarity metrics to form a Pedestrian Correlation (PeC) term. The PeC results are processed by the principle of maximum correlation to achieve pedestrian classification, assigning an identity category to each pedestrian. This categorization enables each pedestrian to be associated with different tracklets. A second SNN is then used to calculate similarity metrics for sets of small tracklets within a sliding temporal window. Tracklet motion affinities, based on mean tracklet velocity, are merged with the tracklet similarity metrics to form a Tracklet Correlation (TrC) term. Finally, the TrC results are likewise processed by a pedestrian classification method, which categorizes each tracklet by assigning it to a particular pedestrian. All tracklets in each pedestrian category are then merged to create an overall trajectory for that pedestrian. We evaluate our method on two publicly available multi-pedestrian tracking datasets, PETS2009-S2L1 and Town Center, which contain complex environments and pedestrian occlusions. On these datasets, our method significantly outperforms eight other state-of-the-art trackers from the literature in terms of tracking accuracy.
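The correlation-and-classification step described above can be sketched as follows. This is an illustrative, hypothetical reconstruction only: the abstract does not specify the exact fusion rule, so the linear combination with weight `alpha`, and all function and variable names, are assumptions rather than the paper's actual formulation.

```python
import numpy as np

def pedestrian_correlation(snn_similarity, motion_affinity, alpha=0.5):
    """Merge appearance similarity (from the SNN) with motion affinity
    into a Pedestrian Correlation (PeC) matrix.

    Both inputs have shape [num_candidates, num_known_identities].
    The weighted sum with `alpha` is an illustrative assumption.
    """
    return alpha * snn_similarity + (1.0 - alpha) * motion_affinity

def classify_by_max_correlation(pec):
    """Assign each candidate region the identity with the maximal
    correlation score (the principle of maximum correlation)."""
    return np.argmax(pec, axis=1)

# Toy example: two candidate regions scored against two known identities.
sim = np.array([[0.9, 0.2],
                [0.1, 0.8]])   # SNN appearance similarity
mot = np.array([[0.7, 0.3],
                [0.4, 0.6]])   # inter-frame motion affinity
pec = pedestrian_correlation(sim, mot)
ids = classify_by_max_correlation(pec)  # identity index per candidate
```

The same pattern would apply at the tracklet level, with the TrC term replacing PeC and mean-velocity affinities replacing the inter-frame motion term.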

