Abstract

Vehicle re-identification (ReID) plays an increasingly important role in Intelligent Transport Systems (ITS); it aims to retrieve images of a query vehicle identity from a large gallery captured under non-overlapping camera views. Current research mainly focuses on mining discriminative features from vehicle images and trains models in a fully supervised manner, which relies heavily on manual annotation of the training data. However, annotating every sample image is labor-intensive and impractical in real-world applications, especially for large-scale transport systems with massive amounts of surveillance data. To this end, we propose a multi-level progressive learning (MLPL) method for unsupervised vehicle ReID, which achieves strong performance using only unlabeled target-domain images. We first introduce a multi-branch architecture to learn vehicle representations at different levels, consisting of one branch for global features and two branches for local feature learning. A density-based clustering method is employed to generate pseudo labels, and, exploiting the multi-branch model, we propose a novel re-clustering method to better mine highly reliable labels. A dynamic progressive contrastive learning (DPCL) strategy is then carefully designed to train the network on these clustered labels; DPCL dynamically adjusts the training process to maximally strengthen multi-level feature learning. Moreover, we propose a self-adaptive loss balancing method that automatically computes the weights of the different losses at each training iteration. Comprehensive experiments are conducted on several mainstream evaluation datasets, including VeRi776, VehicleID and CityFlowV2-ReID. Compared with existing unsupervised methods, our approach achieves new state-of-the-art performance.
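The abstract mentions two concrete mechanisms: density-based clustering to produce pseudo labels (with low-density outliers treated as unreliable and excluded), and a self-adaptive balance over multiple losses. The sketch below illustrates both ideas under stated assumptions: a minimal DBSCAN-style clusterer over feature vectors stands in for the paper's unspecified density-based method, and Kendall-style uncertainty weighting (`exp(-s) * L + s`) stands in for the unspecified self-adaptive loss balance. The function names, parameters, and the specific weighting formula are illustrative assumptions, not the authors' implementation.

```python
import math

def density_cluster(points, eps=0.5, min_pts=2):
    """Minimal DBSCAN-style pseudo-labeling sketch (an assumption; the
    paper's exact clustering method is not specified in the abstract).
    Returns one integer label per point; -1 marks low-density outliers,
    i.e. samples with unreliable pseudo labels to be excluded."""
    labels = [None] * len(points)
    cid = 0

    def neighbors(i):
        # All points within eps of point i (including i itself).
        return [j for j in range(len(points))
                if math.dist(points[i], points[j]) <= eps]

    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbors(i)
        if len(nbrs) < min_pts:
            labels[i] = -1  # provisional noise; may later join a cluster as border
            continue
        labels[i] = cid
        seeds = [j for j in nbrs if j != i]
        while seeds:  # expand the cluster from core points
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cid  # former noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cid
            jn = neighbors(j)
            if len(jn) >= min_pts:  # j is a core point: keep expanding
                seeds.extend(jn)
        cid += 1
    return labels

def adaptive_total_loss(losses, log_vars):
    """Self-adaptive loss balance sketch using Kendall-style uncertainty
    weighting (an assumption): total = sum_i exp(-s_i) * L_i + s_i,
    where each s_i is a learnable log-variance per loss term."""
    return sum(math.exp(-s) * L + s for L, s in zip(losses, log_vars))
```

With learnable `s_i`, terms whose loss stays large drive `s_i` up, automatically down-weighting them, which matches the abstract's goal of recomputing loss weights at each training iteration without manual tuning.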

