Abstract

Tracking algorithms based on Siamese networks cannot update their templates in response to appearance changes of the target. As a result, using convolution as the similarity measure makes it difficult to exploit background information and to discriminate background distractors that resemble the template, leading to poor tracking robustness. To address this problem, a two-stage tracking algorithm based on a similarity measure over fused features of positive and negative samples is proposed. Using positive and negative sample libraries established online, a discriminator based on this fused-feature similarity measure is learned to perform a secondary discrimination of the candidate boxes in hard sample frames. The tracking accuracy and success rate of the algorithm on the OTB2015 benchmark reach 92.4% and 70.7%, respectively. On the VOT2018 dataset, the algorithm improves accuracy by nearly 0.2%, robustness by 4.0% and expected average overlap (EAO) by 2.0% compared with the baseline network SiamRPN++. On the LaSOT dataset, the algorithm outperforms all compared algorithms: relative to the baseline network, its success rate increases by nearly 3.0% and its accuracy rises by more than 1.0%. The experimental results on OTB2015, VOT2018 and LaSOT show that the proposed method substantially improves tracking success rate and robustness compared with Siamese-network-based algorithms, and it performs particularly well on LaSOT, which contains long sequences, occlusion and large appearance changes.

Keywords: Object tracking; Siamese network; Quality evaluation; Online update of sample library; Similarity measure
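
The abstract describes a second-stage discriminator that re-scores candidate boxes against positive and negative sample libraries maintained online. The sketch below illustrates that general idea only; it is not the authors' implementation, and all specifics (cosine similarity as the measure, library size, the fusion weight alpha, and the function/class names) are illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's code): a second-stage discriminator
# that compares a candidate's features with online positive/negative sample libraries
# and fuses the result with the first-stage Siamese score.
import numpy as np

class SampleLibraryDiscriminator:
    def __init__(self, max_size=50):          # library size is an assumed value
        self.pos_library = []                  # features of confirmed target appearances
        self.neg_library = []                  # features of background distractors
        self.max_size = max_size

    def update(self, feature, is_positive):
        """Add a high-quality sample to the corresponding library (FIFO eviction)."""
        library = self.pos_library if is_positive else self.neg_library
        library.append(feature / (np.linalg.norm(feature) + 1e-12))
        if len(library) > self.max_size:
            library.pop(0)

    def score(self, candidate_feature):
        """Similarity to positive samples minus similarity to negative samples."""
        f = candidate_feature / (np.linalg.norm(candidate_feature) + 1e-12)
        pos_sim = max((float(f @ p) for p in self.pos_library), default=0.0)
        neg_sim = max((float(f @ n) for n in self.neg_library), default=0.0)
        return pos_sim - neg_sim

def rescore_candidates(siamese_scores, candidate_features, discriminator, alpha=0.5):
    """Fuse first-stage Siamese scores with second-stage discriminator scores."""
    return [
        (1 - alpha) * s + alpha * discriminator.score(f)
        for s, f in zip(siamese_scores, candidate_features)
    ]
```

In this reading, the second stage is only invoked on hard frames (e.g., when the first-stage response is ambiguous), and the libraries are updated only with samples that pass a quality check, which is consistent with the "quality evaluation" and "online update of sample library" keywords.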
