Abstract
Although unsupervised person re-identification (Re-ID) has drawn increasing research attention, it still faces the challenge of learning discriminative features in the absence of pairwise labels across disjoint camera views. To tackle the issue of label scarcity, researchers have delved into clustering and multilabel learning using memory dictionaries. Although effective in improving unsupervised Re-ID performance, these methods require substantial computational resources and introduce additional training complexity. To address this issue, we propose a conceptually simple yet effective learnable module, named the meta feature transformer (MFT). MFT is a streamlined, lightweight network architecture that operates without complex networks or feature memory bank storage. It primarily focuses on learning interactions between sample features within small groups, using a transformer mechanism in each mini-batch, and then generates a new sample feature for each group through a weighted sum. The main benefits of MFT arise from two aspects: (1) it allows numerous new samples to be used for training, which significantly expands the feature space and enhances the network's generalization capability; (2) the trainable attention weights highlight the importance of individual samples, enabling the network to focus on more informative or distinguishable ones. We validate our method on two popular large-scale Re-ID benchmarks, where extensive evaluations show that our MFT outperforms previous methods and significantly improves Re-ID performance.
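The group-level interaction described above can be illustrated with a minimal numerical sketch: attention weights are computed among the features of one small group, and a new feature is produced as their weighted sum. This is an assumption-laden illustration of the general idea, not the paper's implementation; the function name, the random projection matrices, and the final mean-pooling step are all hypothetical stand-ins for learned components.

```python
import numpy as np

def mft_group_feature(features, wq, wk, wv):
    """Illustrative sketch (not the authors' code) of attending over a
    small group of sample features and returning one new feature as an
    attention-weighted sum.

    features: (G, D) array, one row per sample in the group.
    wq, wk, wv: (D, D) projection matrices standing in for learned weights.
    """
    q = features @ wq                        # queries, (G, D)
    k = features @ wk                        # keys,    (G, D)
    v = features @ wv                        # values,  (G, D)
    d = features.shape[1]
    scores = q @ k.T / np.sqrt(d)            # pairwise interactions, (G, G)
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)  # row-wise softmax weights
    mixed = attn @ v                         # attended features, (G, D)
    return mixed.mean(axis=0)                # one new feature for the group

rng = np.random.default_rng(0)
G, D = 4, 8                                  # group size, feature dimension
feats = rng.standard_normal((G, D))
wq, wk, wv = (rng.standard_normal((D, D)) for _ in range(3))
new_feat = mft_group_feature(feats, wq, wk, wv)
print(new_feat.shape)  # (8,)
```

In training, such generated features would act as additional samples per mini-batch, which is how the method expands the feature space without a memory bank.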