Abstract

Visible-thermal person re-identification (VTReID) is a cross-modal retrieval problem in which matching pedestrians is highly challenging due to the large modality gap between visible and thermal images. Recent works have shown that combining local features describing body parts with global features of the whole person image yields robust feature representations, even when body parts are missing. However, using individual part-level features directly, without considering the relationships among body parts, weakens the ability to distinguish the identities of different people who share similar attributes in the corresponding parts. To address this issue, we propose a relation network for visible-thermal person re-identification (RN-VTReID). In addition, we use a combination of Global Average Pooling (GAP) and Global Max Pooling (GMP) to capture both the background and texture features of pedestrians. Experimental results show that the RN-VTReID model outperforms state-of-the-art methods on the SYSU-MM01 and RegDB datasets, demonstrating the effectiveness of the approach.
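
To illustrate the GAP/GMP combination mentioned above, the following is a minimal PyTorch sketch of one common way to pool a backbone feature map with both operators and fuse the results. The fusion by element-wise addition, the module name, and the tensor shapes are illustrative assumptions, not details taken from the paper.

```python
# A minimal sketch of combining Global Average Pooling (GAP) and
# Global Max Pooling (GMP) over a CNN feature map. The fusion by
# summation and all dimensions below are assumptions for illustration.
import torch
import torch.nn as nn

class GapGmpPooling(nn.Module):
    """Pools a (B, C, H, W) feature map with both GAP and GMP,
    then fuses the two pooled vectors by element-wise addition."""
    def __init__(self):
        super().__init__()
        self.gap = nn.AdaptiveAvgPool2d(1)  # averages over H x W
        self.gmp = nn.AdaptiveMaxPool2d(1)  # takes the max over H x W

    def forward(self, x):
        avg = self.gap(x).flatten(1)  # (B, C): smooth, context-like cues
        mx = self.gmp(x).flatten(1)   # (B, C): salient, texture-like cues
        return avg + mx               # fused (B, C) descriptor

# Usage with a hypothetical backbone output of shape (batch, 2048, 24, 8):
feat = torch.randn(4, 2048, 24, 8)
desc = GapGmpPooling()(feat)
print(desc.shape)  # torch.Size([4, 2048])
```

Intuitively, GAP summarizes the whole spatial extent (background and context), while GMP responds to the strongest local activations (fine texture); combining them lets one descriptor carry both kinds of cues.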
