Abstract
Object tracking has gained importance in various applications, especially traffic monitoring, surveillance and security, and people tracking. Previous multiobject tracking (MOT) methods first detect objects and then perform tracking, treating object detection, feature extraction, and association as separate steps, which is not optimal. In this article, we propose a Super Chained Tracker (SCT) model, a simple online tracker that provides better results than existing MOT methods. The proposed model integrates the subtasks of object detection, feature extraction, and representation learning into one end-to-end solution. It takes adjacent frames as input, converts them into pairs of bounding boxes, and chains the pairs using Intersection over Union (IoU) matching, Kalman filtering, and bipartite matching. Object attention is introduced in the paired box regression branch by the object detection module, and identity attention is provided by the ID verification module. The detections from these branches are then linked across frames by IoU matching, Kalman filtering, and bipartite matching. This makes SCT fast, simple, and effective, achieving a Multiobject Tracking Accuracy (MOTA) of 68.4% and an Identity F1 (IDF1) score of 64.3% on the MOT16 dataset. We also review existing tracking techniques and analyze their performance, and our method achieves better qualitative and quantitative tracking results than existing techniques by notable margins.
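The chaining step described above associates box pairs from adjacent frames by data association. The sketch below illustrates one plausible form of that step, assuming an IoU cost matrix solved with the Hungarian algorithm; it is an illustration rather than the authors' implementation, the helper names iou and associate are hypothetical, and the Kalman prediction step is omitted for brevity.

# Illustrative sketch of chaining detections across adjacent frames:
# build an IoU-based cost matrix and solve it with bipartite matching.
import numpy as np
from scipy.optimize import linear_sum_assignment

def iou(box_a, box_b):
    # IoU of two boxes given as [x1, y1, x2, y2].
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def associate(prev_boxes, curr_boxes, iou_threshold=0.3):
    # Chain detections between adjacent frames with bipartite matching on IoU.
    cost = np.zeros((len(prev_boxes), len(curr_boxes)))
    for i, pb in enumerate(prev_boxes):
        for j, cb in enumerate(curr_boxes):
            cost[i, j] = 1.0 - iou(pb, cb)   # lower cost means higher overlap
    rows, cols = linear_sum_assignment(cost)  # Hungarian algorithm
    # Keep only matches above the IoU threshold; unmatched boxes start new tracks.
    return [(i, j) for i, j in zip(rows, cols) if 1.0 - cost[i, j] >= iou_threshold]

In a full tracker, prev_boxes would typically be the Kalman-predicted positions of existing tracks rather than the raw boxes from the previous frame, so that motion is accounted for before the IoU matching.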