Abstract

Visible-infrared person re-identification aims to match pedestrian images across the visible and infrared modalities; its two main challenges are intra-modality variations and the cross-modality discrepancy between visible and infrared images. To address these issues, many advanced methods design new network structures to extract modality-sharing features and mitigate modality differences, or learn part-level features to overcome background interference. However, they overlook sharing the parameters of the convolutional layers, which can yield more modality-sharing features. At the same time, using only part-level features misses discriminative pedestrian cues such as body structure and contours. To handle these problems, a parameter sharing and feature learning network is proposed in this paper to mitigate modality differences and further enhance feature discrimination. First, a new two-stream parameter-sharing network is proposed that shares the parameters of the convolutional layers to obtain more modality-sharing features. Second, a multi-granularity feature learning module is designed to reduce modality differences at both coarse-grained and fine-grained levels while further enhancing feature discriminability. In addition, a center alignment loss is proposed to learn relationships between identities and to reduce modality differences by clustering features toward their identity centers. For part-level feature learning, the hetero-center triplet loss is adopted to relax the strict constraints of the triplet loss. Finally, extensive experiments on two challenging datasets validate that our method outperforms state-of-the-art methods. On the SYSU-MM01 dataset, Rank-1 accuracy and mAP reach 74.0% and 70.51% in the all-search mode, improvements of 3.4% and 3.61% over the baseline, respectively.
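As a concrete illustration of the center alignment loss described above, the following is a minimal PyTorch sketch: it pulls the mixed visible and infrared features of each identity toward a shared center (reducing the cross-modality gap) and pushes centers of different identities apart (modeling relationships between identities). The function name, the margin term, and the exact pull/push formulation are assumptions for illustration only; the abstract does not give the precise equation.

```python
import torch
import torch.nn.functional as F

def center_alignment_loss(feats, labels, margin=0.3):
    """Illustrative sketch of a center alignment loss (not the paper's
    exact formulation).

    feats:  (N, D) embeddings from a mixed visible/infrared batch
    labels: (N,)   identity labels
    """
    centers = []
    loss_pull = 0.0
    for pid in labels.unique():
        mask = labels == pid
        center = feats[mask].mean(dim=0)  # center shared by both modalities
        # pull every feature of this identity toward the shared center
        loss_pull = loss_pull + ((feats[mask] - center) ** 2).sum(dim=1).mean()
        centers.append(center)
    centers = torch.stack(centers)        # (P, D), one center per identity

    # push centers of different identities at least `margin` apart
    dist = torch.cdist(centers, centers)  # (P, P) pairwise Euclidean distances
    off_diag = ~torch.eye(len(centers), dtype=torch.bool, device=feats.device)
    loss_push = F.relu(margin - dist[off_diag]).mean()

    return loss_pull / len(centers) + loss_push
```

In a typical cross-modality training batch (P identities, with images from both modalities per identity), a term like this would be added alongside the identity classification loss and the hetero-center triplet loss on part-level features.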
