Abstract

Research on person re-identification (Re-ID) has important value in pedestrian detection, target tracking, criminal investigation, and other related fields. In unsupervised person Re-ID algorithms, the accuracy of pseudo-labels is crucial to the recognition results. However, in practical scenarios, low-quality images caused by factors such as differences in camera resolution and shooting angle degrade the pedestrian features these algorithms extract, harming both the accuracy of the pseudo-labels and the learning process of the model. To address this problem, we propose an image quality enhancement algorithm for unsupervised person Re-ID (IQE). To the best of our knowledge, this study is the first to introduce detail enhancement and low-light enhancement algorithms into unsupervised person Re-ID. By improving feature quality in these two respects, the method yields more accurate feature extraction and clustering, which in turn produces higher-quality pseudo-labels and reduces the interference of noisy ones. Experimental results show that IQE outperforms state-of-the-art person Re-ID methods in terms of Rank-1 accuracy and mAP. Specifically, IQE achieves 87.9% Rank-1 accuracy and 71.2% mAP on the Market-1501 dataset, 78.1% Rank-1 accuracy and 61.7% mAP on the DukeMTMC-reID dataset, and 51.1% Rank-1 accuracy and 24.2% mAP on the MSMT17 dataset.
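To make the pseudo-labeling idea concrete, the following is a minimal, self-contained sketch of the clustering step common to unsupervised Re-ID pipelines: extracted feature vectors are grouped by similarity, and the resulting cluster indices serve as pseudo-identity labels for the next training round. This is an illustrative simplification under assumed parameters (the similarity threshold, the toy features, and the greedy clustering rule are all hypothetical), not the authors' exact IQE implementation.

```python
import numpy as np

def pseudo_labels(features: np.ndarray, sim_threshold: float = 0.8) -> np.ndarray:
    """Greedy single-pass clustering sketch: assign each L2-normalized
    feature to the first cluster whose representative (its first member)
    is similar enough under cosine similarity, else start a new cluster."""
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    reps: list[np.ndarray] = []          # one representative vector per cluster
    labels = np.empty(len(feats), dtype=int)
    for i, f in enumerate(feats):
        sims = [float(f @ r) for r in reps]
        if sims and max(sims) >= sim_threshold:
            labels[i] = int(np.argmax(sims))  # join the most similar cluster
        else:
            labels[i] = len(reps)             # open a new pseudo-identity
            reps.append(f)
    return labels

# Toy stand-ins for CNN pedestrian features: two tight groups plus one outlier.
rng = np.random.default_rng(0)
group_a = rng.normal(loc=[5, 0, 0], scale=0.1, size=(4, 3))
group_b = rng.normal(loc=[0, 5, 0], scale=0.1, size=(4, 3))
outlier = np.array([[0.0, 0.0, 5.0]])
labels = pseudo_labels(np.vstack([group_a, group_b, outlier]))
print(labels)  # → [0 0 0 0 1 1 1 1 2]
```

The point the abstract makes follows directly from this structure: if low-quality images corrupt the feature vectors, similar identities drift apart (or distinct ones collapse together) in feature space, so the cluster assignments, and hence the pseudo-labels the model trains on, become noisy.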

