Abstract
Multi-Camera Vehicle Re-identification and Tracking (MCVRT) is a challenging task that involves identifying and tracking vehicles across multiple camera views in a surveillance network. Multi-Target Multi-Camera Tracking (MTMCT) and vehicle Re-Identification (Re-ID) are the two major technologies applied to MCVRT tasks. Variations in aspect ratio, occlusion, orientation, and lighting conditions make vehicle re-identification and multi-camera tracking difficult. While some existing methods address these problems, they remain a significant challenge in the field. Additionally, most Re-ID datasets include only images captured in well-lit environments, and the impact of dark images on the performance of existing models needs further exploration. This paper presents a new approach to address the challenge of low-light images in vehicle re-identification and achieves state-of-the-art results on public datasets. Our approach is based on two key components: (i) an Adaptive Low-light correction and Self-Attention module (ALSA) for image pre-processing in vehicle Re-ID networks, and (ii) a new loss function called Log Triplet Loss (LT-Loss). We evaluated the proposed approach through experiments on the VeRi-776 dataset; our model achieved a Rank@1 accuracy of 98.99% and outperformed commonly used models on dark images. Our study highlights the importance of considering lighting conditions in vehicle re-identification and provides a new approach to address this challenge.
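The abstract names the LT-Loss but does not give its formulation. As a rough, non-authoritative illustration of the kind of triplet-style embedding objective typically used in Re-ID training, the sketch below shows a softplus ("log") smoothed triplet loss. The function name log_triplet_loss_sketch and the exact form log(1 + exp(d_ap - d_an)) are assumptions for illustration, not the formulation from the paper.

```python
# Hypothetical sketch of a log-smoothed triplet objective, for illustration only.
# The paper's actual LT-Loss is not specified in the abstract; the softplus form
# below is an assumption, not the authors' method.
import torch
import torch.nn.functional as F


def log_triplet_loss_sketch(anchor, positive, negative):
    """Softplus-smoothed triplet objective on L2-normalized embeddings.

    anchor, positive, negative: (batch, dim) embedding tensors.
    """
    anchor = F.normalize(anchor, dim=1)
    positive = F.normalize(positive, dim=1)
    negative = F.normalize(negative, dim=1)

    d_ap = (anchor - positive).pow(2).sum(dim=1)  # anchor-positive distance
    d_an = (anchor - negative).pow(2).sum(dim=1)  # anchor-negative distance

    # log(1 + exp(d_ap - d_an)): a smooth, margin-free surrogate for the
    # hinge-based triplet loss, commonly used in Re-ID pipelines.
    return F.softplus(d_ap - d_an).mean()


# Usage with random embeddings standing in for Re-ID features.
if __name__ == "__main__":
    a, p, n = (torch.randn(32, 256) for _ in range(3))
    print(log_triplet_loss_sketch(a, p, n).item())
```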