Abstract

Despite considerable progress in the field of automatic multi-target tracking, several problems, such as data association, remain challenging. On the other hand, cognitive studies have reported that humans can robustly track several objects simultaneously. Such situations arise regularly in daily life, and humans have evolved to handle the associated problems. Accordingly, applying brain-inspired processing principles may help significantly increase the performance of automatic systems that follow the trajectories of multiple objects. In this paper, we propose a multiple-object tracking algorithm based on dynamic neural field theory, which has been shown to provide neuro-plausible processing mechanisms for cognitive functions of the brain. We define several input neural fields responsible for representing previous location and orientation information, as well as the instantaneous linear and angular speed of the objects, across successive video frames. Image processing techniques are applied to extract the critical object features, including target location and orientation. Two prediction fields anticipate the objects' locations and orientations in the upcoming frame after receiving excitatory and inhibitory inputs from the input fields in a feed-forward architecture. This information is used in the data association and labeling process. We tested the proposed algorithm on a zebrafish-larvae segmentation and tracking dataset and an ant-tracking dataset containing non-rigid objects with abrupt movements and frequently occurring occlusions. The results show a significant improvement in tracking metrics compared to state-of-the-art algorithms.
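To make the field dynamics concrete, the sketch below implements a minimal one-dimensional Amari-type dynamic neural field update with a Mexican-hat (difference-of-Gaussians) interaction kernel. The field size, time constant, kernel gains, and the Gaussian input bump are illustrative assumptions for demonstration only; they do not reproduce the paper's prediction fields or parameters.

```python
import numpy as np

# Minimal 1-D Amari-type dynamic neural field update (illustrative sketch).
# All parameter values below are assumed for demonstration, not taken from the paper.

N = 200        # number of discretized field positions (e.g., a location or orientation axis)
tau = 10.0     # field time constant
h = -2.0       # resting level, keeping the field subthreshold in the absence of input
x = np.arange(N)

def mexican_hat(size, sigma_exc=3.0, sigma_inh=8.0, a_exc=2.0, a_inh=1.0):
    """Difference-of-Gaussians lateral kernel: local excitation, broader inhibition."""
    d = np.arange(size) - size // 2
    return (a_exc * np.exp(-d**2 / (2 * sigma_exc**2))
            - a_inh * np.exp(-d**2 / (2 * sigma_inh**2)))

def sigmoid(u, beta=4.0):
    """Output nonlinearity applied to the field activation."""
    return 1.0 / (1.0 + np.exp(-beta * u))

def step(u, s, w, dt=1.0):
    """One Euler step of tau * du/dt = -u + h + s(x) + (w * f(u))(x)."""
    lateral = np.convolve(sigmoid(u), w, mode="same")
    return u + dt * (-u + h + s + lateral) / tau

# Example: a Gaussian input bump at position 80 (e.g., an object's observed location).
u = np.full(N, h)
s = 6.0 * np.exp(-(x - 80) ** 2 / (2 * 4.0 ** 2))

for _ in range(200):
    u = step(u, s, mexican_hat(N))

print("Peak of the self-stabilized activation bump:", int(np.argmax(u)))
```

In an architecture of the kind described above, analogous prediction fields would receive feed-forward excitatory and inhibitory inputs derived from the extracted location, orientation, and speed features, and the resulting activation peaks would serve as the next-frame predictions used for data association and labeling.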
