Abstract
In person re-identification, conventional approaches are often susceptible to interference from variations in background, occlusion, and attire, which degrade identification accuracy. To mitigate the effect of these variations, this paper introduces a person re-identification approach that combines cycle-consistent generative adversarial networks (CycleGANs) with multi-feature fusion. The method performs re-identification by measuring and comparing distances between pairs of person images. The network is split into two streams: one extracts global features and the other captures local features. The global and local features are then fused, and the fused features are fed into contrastive distance metric learning, where similarity scores are computed to rank candidate samples. Extensive experiments on large datasets such as CUHK03 and VIPeR demonstrate that the method effectively reduces the impact of background, occlusion, attire, and other variations on identification accuracy.
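The abstract outlines a two-stream architecture with feature fusion and distance-based ranking. The sketch below is a minimal, hypothetical illustration of that pipeline, not the authors' implementation: the ResNet-50 backbone, the number of local stripes, and the feature dimensions are assumptions, and the CycleGAN stage used to normalize appearance shifts is omitted.

```python
import torch
import torch.nn as nn
import torchvision.models as models


class TwoStreamReID(nn.Module):
    """Illustrative two-stream extractor: global + stripe-based local features."""

    def __init__(self, num_stripes=6, feat_dim=256):
        super().__init__()
        backbone = models.resnet50(weights=None)
        # Shared convolutional trunk (drop the classifier head and avgpool).
        self.trunk = nn.Sequential(*list(backbone.children())[:-2])
        # Global stream: pool the whole feature map into one vector.
        self.global_pool = nn.AdaptiveAvgPool2d(1)
        self.global_fc = nn.Linear(2048, feat_dim)
        # Local stream: pool horizontal stripes separately to retain
        # part-level information (stripe count is an assumption).
        self.local_pool = nn.AdaptiveAvgPool2d((num_stripes, 1))
        self.local_fc = nn.Linear(2048 * num_stripes, feat_dim)

    def forward(self, x):
        fmap = self.trunk(x)                           # (B, 2048, H, W)
        g = self.global_fc(self.global_pool(fmap).flatten(1))
        l = self.local_fc(self.local_pool(fmap).flatten(1))
        # Fuse global and local features by concatenation.
        fused = torch.cat([g, l], dim=1)               # (B, 2 * feat_dim)
        return nn.functional.normalize(fused, dim=1)


# Ranking: a smaller embedding distance means a more likely identity match.
model = TwoStreamReID().eval()
with torch.no_grad():
    query = model(torch.randn(1, 3, 256, 128))         # one query image
    gallery = model(torch.randn(8, 3, 256, 128))       # gallery images
dist = torch.cdist(query, gallery)                     # (1, 8) pairwise distances
ranking = dist.argsort(dim=1)                          # gallery indices, best first
```

In a metric-learning setup like the one the abstract describes, these fused embeddings would be trained with a contrastive objective so that same-identity pairs lie closer than different-identity pairs before ranking.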