Abstract
Localizing a target object for data retrieval is a key issue in Intelligent and Connected Transportation Systems (ICTS). However, because traditional transportation systems lack intelligence, manually retrieving and locating queried objects from large collections of images consumes substantial resources. To address this issue, we propose an effective method for query-based object localization that uses artificial intelligence techniques to automatically locate the queried object against complex backgrounds. The proposed method, termed the Fine-grained and Progressive Attention Localization Network (FPAN), takes an image and a queried object as input and accurately locates the target object in the image. Specifically, a fine-grained attention module is naturally embedded into each layer of a convolutional neural network (CNN), gradually suppressing regions irrelevant to the queried object and eventually focusing attention on the target area. We further employ a top-down attention fusion algorithm, implemented by a learnable cascaded upsampling structure, to establish the connection between the attention map and the exact location of the queried object in the original image. Furthermore, FPAN is trained by multi-task learning with a box segmentation loss and a cosine loss. Finally, we conduct comprehensive experiments on both query-based digit localization and object tracking with synthetic and benchmark datasets. The experimental results show that our algorithm is far superior to other algorithms on the synthetic datasets and outperforms most existing trackers on the OTB and VOT datasets.
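The abstract does not give implementation details, but the core idea of query-conditioned attention (scoring each spatial position of a CNN feature map against a query embedding, then suppressing low-scoring regions) can be sketched as follows. This is a minimal illustration in NumPy, not the paper's actual implementation; the function name, shapes, and the sigmoid gating are all assumptions.

```python
import numpy as np

def query_attention(feat, query):
    """Minimal sketch of query-conditioned spatial attention (hypothetical).

    feat:  (C, H, W) feature map from one CNN layer
    query: (C,) embedding of the queried object
    Returns the attended feature map and the (H, W) attention map.
    """
    C, H, W = feat.shape
    f = feat.reshape(C, H * W)  # flatten spatial dimensions

    # cosine similarity between the query and each spatial position
    sim = (query @ f) / (np.linalg.norm(query) * np.linalg.norm(f, axis=0) + 1e-8)

    # squash similarities to (0, 1) so they act as soft gates
    attn = 1.0 / (1.0 + np.exp(-sim))
    attn = attn.reshape(H, W)

    # suppress regions irrelevant to the query
    return feat * attn[None, :, :], attn
```

In FPAN this kind of gating is applied at every CNN layer, so attention is refined progressively from coarse to fine; the sketch above shows only a single layer's step.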