Abstract

In recent years, person re-identification (ReID) has developed rapidly due to its broad practical applications. Most existing benchmarks assume that the same person wears the same clothes across captured images, whereas, in real-world scenarios, a person may change clothes frequently. Thus, the Clothes-Changing person ReID (CC-ReID) problem has been introduced and several related benchmarks have been established. CC-ReID is a very difficult task because the most salient visual characteristic of a person, the clothes, differs between the query and the gallery, while clothes-irrelevant features are relatively weak. To promote the research and application of person ReID in clothes-changing scenarios, in this paper we introduce a new task called Clothes Template based Clothes-Changing person ReID (CTCC-ReID), where the query image is augmented with a clothes template that shares similar visual patterns with the clothes of the target person image in the gallery. ReID methods are thus encouraged to jointly consider the original query image and the given clothes template for retrieval under the proposed CTCC-ReID setting. To facilitate research on CTCC-ReID, we construct a novel large-scale ReID dataset named ClOthes ChAnging person Set Plus (COCAS+), which contains both realistic and synthetic clothes-changing person images with manually collected clothes templates. Furthermore, we propose a novel Dual-Attention Biometric-Clothes Transfusion Network (DualBCT-Net) for CTCC-ReID, which learns to extract biometric features from the original query person image and clothes features from the given clothes template, and then fuses them through a Dual-Attention Fusion Module. Extensive experimental results show that the proposed CTCC-ReID setting and COCAS+ dataset greatly push the performance of clothes-changing ReID toward practical applications, and that synthetic data is remarkably effective for CTCC-ReID. Moreover, the proposed DualBCT-Net shows significant improvements over state-of-the-art methods on the CTCC-ReID task. COCAS+ and the code of DualBCT-Net will be released at https://github.com/Chenhaobin/COCAS-plus.
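
The abstract describes a dual-branch design: one branch encodes the query person image into biometric features, another encodes the clothes template into clothes features, and the two are fused for retrieval. The following is a minimal sketch of what such an extract-and-fuse pipeline could look like; the module names, feature dimensions, and cross-attention formulation are illustrative assumptions, not the authors' DualBCT-Net implementation.

```python
# Minimal sketch of a dual-branch "biometric + clothes-template" ReID model
# with a cross-attention fusion head. All names and dimensions are assumed
# for illustration only; this is not the paper's DualBCT-Net.
import torch
import torch.nn as nn
from torchvision.models import resnet18


class DualAttentionFusion(nn.Module):
    """Fuse biometric and clothes features with two cross-attention blocks."""

    def __init__(self, dim=512, num_heads=8):
        super().__init__()
        self.bio_to_clo = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.clo_to_bio = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.proj = nn.Linear(2 * dim, dim)

    def forward(self, bio_feat, clo_feat):
        # Treat each feature vector as a length-1 token sequence.
        bio = bio_feat.unsqueeze(1)                     # (B, 1, dim)
        clo = clo_feat.unsqueeze(1)                     # (B, 1, dim)
        bio_attn, _ = self.bio_to_clo(bio, clo, clo)    # biometric attends to clothes
        clo_attn, _ = self.clo_to_bio(clo, bio, bio)    # clothes attends to biometric
        fused = torch.cat([bio_attn, clo_attn], dim=-1).squeeze(1)
        return self.proj(fused)                         # (B, dim) fused descriptor


class DualBranchReID(nn.Module):
    """Separate encoders for the query person image and the clothes template."""

    def __init__(self, dim=512):
        super().__init__()
        self.bio_encoder = resnet18(num_classes=dim)    # query person image branch
        self.clo_encoder = resnet18(num_classes=dim)    # clothes template branch
        self.fusion = DualAttentionFusion(dim)

    def forward(self, person_img, clothes_template):
        bio_feat = self.bio_encoder(person_img)
        clo_feat = self.clo_encoder(clothes_template)
        return self.fusion(bio_feat, clo_feat)


if __name__ == "__main__":
    model = DualBranchReID()
    person = torch.randn(2, 3, 256, 128)     # typical ReID input resolution
    template = torch.randn(2, 3, 256, 128)
    print(model(person, template).shape)     # torch.Size([2, 512])
```

At retrieval time, the fused descriptor of each (query image, clothes template) pair would be compared against gallery descriptors with a standard distance metric such as cosine similarity.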
