Abstract

The presence of crowds, crossing people, occlusions, and individuals entering and leaving the monitored scene makes the automation of Multi-Object Tracking a demanding task. Because of the difficulties in handling those situations, the data association between incoming observations and their corresponding identities can produce split, merged, and even missed tracks. This article proposes a Hierarchical Generator of Tracking Global Hypotheses (HGTGH) to prevent those errors. In this method, the data association process is divided into hierarchical levels according to multiple factors, such as how long each individual has been tracked or the number of consecutive frames in which it has been missed. A dedicated formulation of the association cost at each level properly combines several affinity metrics. Instead of generating hypotheses for each individual and analyzing them over a batch of future frames, the proposed method immediately generates a global hypothesis that describes the assignment of the whole set of identities on every incoming frame. The generated hypothesis can also account for new people entering the scene. Thanks to this property, the proposed method simultaneously addresses the reduction of identity switches and the problem of starting new tracks. This novel data association method constitutes the core of an online tracking algorithm, which has been evaluated on the MOT17 dataset to demonstrate its effectiveness.

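To make the hierarchical, per-frame global association described above more concrete, the following is a minimal sketch, not the paper's actual formulation: it assumes each hierarchy level (e.g., long-lived tracks before recently missed ones) is solved with a Hungarian assignment over an illustrative affinity, and that detections left unmatched after all levels start new tracks. The function names (`associate_level`, `global_hypothesis`), the gating threshold, and the use of SciPy's solver are assumptions for illustration only.

```python
# Hedged sketch of hierarchical, per-frame data association.
# Level ordering, affinity functions, and the 0.7 gate are illustrative choices.
import numpy as np
from scipy.optimize import linear_sum_assignment


def associate_level(tracks, detections, affinity, gate=0.7):
    """Assign detections to the tracks of one hierarchy level via Hungarian matching."""
    if not tracks or not detections:
        return [], list(range(len(detections)))
    # Cost = 1 - affinity, so higher affinity means cheaper assignment.
    cost = np.array([[1.0 - affinity(t, d) for d in detections] for t in tracks])
    rows, cols = linear_sum_assignment(cost)
    matches, used = [], set()
    for r, c in zip(rows, cols):
        if cost[r, c] <= 1.0 - gate:  # reject weak associations
            matches.append((tracks[r], detections[c]))
            used.add(c)
    unmatched = [i for i in range(len(detections)) if i not in used]
    return matches, unmatched


def global_hypothesis(levels, detections):
    """Build one global assignment for the current frame.

    `levels` is an ordered list of (tracks, affinity) pairs, e.g. confirmed
    long-lived tracks first, recently missed tracks later. Detections left
    unmatched by a level are passed to the next one; whatever remains at the
    end is treated as people entering the scene and starts new tracks.
    """
    hypothesis, remaining = [], list(detections)
    for tracks, affinity in levels:
        matches, unmatched_idx = associate_level(tracks, remaining, affinity)
        hypothesis.extend(matches)
        remaining = [remaining[i] for i in unmatched_idx]
    new_tracks = remaining
    return hypothesis, new_tracks
```

In this reading, splitting the association into levels lets each level use a cost tuned to its tracks (e.g., appearance-heavy affinities for long-lived identities, motion-based ones for recently missed ones), while the single pass over the frame still yields one global hypothesis that covers both identity continuation and track initiation.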