Abstract

Multi-target multi-camera tracking (MTMCT) aims to automatically generate trajectories of objects that appear across multiple cameras. MTMCT can be treated as a combination of intra-camera tracking and cross-camera tracking. Existing work relies only on global appearance descriptions to generate tracklets. However, global descriptions cannot model the local similarity between targets, so existing methods are not robust to occlusion and fast motion. To address this problem, we propose online Optical-based Pose Association (OPA) for multi-target multi-camera tracking. The proposed method uses local pose matching to handle occlusion and applies optical flow to compensate for the displacement caused by fast motion. For optical-based pose association, we first employ OpenPose to estimate a human pose for each proposal. Then, we use the optical flow generated by PWC-Net to adjust the pose estimated in the previous frame. Finally, a modified Object Keypoint Similarity is used to compute the similarity between the current-frame pose and the adjusted previous-frame pose. Given the optical-based pose similarity, we combine it with visual and bounding-box spatial similarities to form the final similarity matrix, and apply the Kuhn-Munkres algorithm for data association. Experiments on MTMCT and MOT datasets verify the value of human pose information and demonstrate the superiority of the proposed method.
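The association step described above maps naturally onto a per-frame matching routine. The Python sketch below illustrates the idea under explicit assumptions: 17 COCO-style keypoints, a dense (H, W, 2) flow field, bounding-box area as the OKS scale, and illustrative fusion weights and threshold. The abstract does not specify the exact "modified" OKS or the fusion weights, so standard COCO OKS and a plain weighted sum stand in for them; the track/detection dictionary keys and helper names (`warp_pose`, `associate`, etc.) are hypothetical.

```python
# Minimal sketch of optical-based pose association, NOT the paper's code.
# Assumes keypoints come from a pose estimator such as OpenPose and dense
# flow from a network such as PWC-Net.
import numpy as np
from scipy.optimize import linear_sum_assignment

# Per-keypoint falloff constants (COCO convention, 17 keypoints).
COCO_SIGMAS = np.array([
    .26, .25, .25, .35, .35, .79, .79, .72, .72,
    .62, .62, 1.07, 1.07, .87, .87, .89, .89]) / 10.0

def warp_pose(prev_kpts, flow):
    """Shift last-frame keypoints by the flow sampled at each joint.

    prev_kpts: (K, 3) float array of (x, y, visibility).
    flow:      (H, W, 2) dense flow from frame t-1 to frame t.
    """
    warped = prev_kpts.copy()
    h, w = flow.shape[:2]
    xs = np.clip(prev_kpts[:, 0].astype(int), 0, w - 1)
    ys = np.clip(prev_kpts[:, 1].astype(int), 0, h - 1)
    warped[:, :2] += flow[ys, xs]
    return warped

def oks(kpts_a, kpts_b, area):
    """Standard Object Keypoint Similarity between two (K, 3) poses."""
    vis = (kpts_a[:, 2] > 0) & (kpts_b[:, 2] > 0)
    if not vis.any():
        return 0.0
    d2 = np.sum((kpts_a[vis, :2] - kpts_b[vis, :2]) ** 2, axis=1)
    denom = 2.0 * area * (COCO_SIGMAS[vis] ** 2) + 1e-9
    return float(np.mean(np.exp(-d2 / denom)))

def iou(a, b):
    """IoU between two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def cosine_sim(u, v):
    """Appearance similarity between two re-ID feature vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9))

def associate(tracks, detections, flow, w_pose=0.4, w_app=0.4, w_box=0.2):
    """Fuse pose, appearance, and spatial similarities, then run
    Kuhn-Munkres assignment. Weights and the 0.3 gate are illustrative.
    """
    S = np.zeros((len(tracks), len(detections)))
    for i, trk in enumerate(tracks):
        warped = warp_pose(trk["pose"], flow)  # flow-adjusted previous pose
        for j, det in enumerate(detections):
            S[i, j] = (w_pose * oks(warped, det["pose"], det["area"])
                       + w_app * cosine_sim(trk["feat"], det["feat"])
                       + w_box * iou(trk["box"], det["box"]))
    rows, cols = linear_sum_assignment(S, maximize=True)
    return [(i, j) for i, j in zip(rows, cols) if S[i, j] > 0.3]
```

Warping the previous pose before scoring is what lets OKS stay informative under fast motion: without the flow correction, large inter-frame displacement would drive every keypoint distance up and the pose similarity toward zero.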
