Abstract
Retrieving an occluded pedestrian remains a challenging problem in person re-identification (re-id). Most existing methods utilize external detectors to disentangle the visible body parts. However, these methods are unstable due to domain bias and consume considerable computing resources. In this paper, we propose a novel and lightweight Part-based Representation Enhancement (PRE) network for occluded re-id that takes full advantage of local correlations to aggregate distinctive information into local features without relying on auxiliary detectors. First, according to the information quality of different body parts, we design a reasonable partition strategy to obtain the local features. Next, a Partial Relationship Aggregation (PRA) module is developed to self-mine the visibility of the body and construct a correlation matrix for collecting the information related to pre-defined classes. Following this, we propose an Inter-part Omnibearing Fusion (IOF) module that leverages the occlusion-suppressed class features to enhance the distinctiveness of the local features via feature completion and reverse fusion strategies. During the testing phase, the global and reconstructed local features are concatenated for re-id without a complex visible-region matching algorithm. Extensive experiments on occluded, partial, and holistic re-id benchmarks demonstrate the superiority of PRE over state-of-the-art methods in terms of accuracy and model complexity.
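To make the described pipeline concrete, the following is a minimal sketch (not the authors' implementation) of the idea the abstract outlines: horizontally partition a backbone feature map into part features, relate the parts to a set of class prototypes through a correlation matrix (in the spirit of PRA), and fuse the class-aggregated information back into each part before concatenating with the global feature (in the spirit of IOF). All module names, dimensions, and the PyTorch formulation are illustrative assumptions.

```python
# Illustrative sketch only; hypothetical names and dimensions, not the PRE network itself.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PartCorrelationSketch(nn.Module):
    def __init__(self, dim=2048, num_parts=4, num_classes=751):
        super().__init__()
        self.num_parts = num_parts
        # Learnable class prototypes that the part features attend to.
        self.class_tokens = nn.Parameter(torch.randn(num_classes, dim) * 0.02)
        self.fuse = nn.Linear(2 * dim, dim)

    def forward(self, feat_map):                      # feat_map: (B, C, H, W)
        B, C, H, W = feat_map.shape
        # Horizontal partition: pool the feature map into P stripe-level part features.
        parts = F.adaptive_avg_pool2d(feat_map, (self.num_parts, 1))
        parts = parts.flatten(2).transpose(1, 2)      # (B, P, C)
        # Correlation matrix between parts and class prototypes (PRA-like step).
        corr = torch.softmax(parts @ self.class_tokens.t() / C ** 0.5, dim=-1)  # (B, P, K)
        class_feat = corr @ self.class_tokens          # (B, P, C) class-aggregated information
        # Fuse the aggregated information back into each part feature (IOF-like step).
        enhanced = self.fuse(torch.cat([parts, class_feat], dim=-1))
        # At test time, concatenate the global feature with the enhanced part features.
        global_feat = feat_map.mean(dim=(2, 3))        # (B, C)
        return torch.cat([global_feat, enhanced.flatten(1)], dim=1)

# Example usage with a dummy backbone output:
# desc = PartCorrelationSketch()(torch.randn(8, 2048, 24, 8))   # (8, 2048 * 5)
```

The sketch only mirrors the high-level data flow (partition, correlate, fuse, concatenate); the paper's actual visibility self-mining, feature completion, and reverse fusion strategies are not reproduced here.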