Abstract
Person re-identification (Re-ID) technology, which accurately retrieves specific pedestrians from massive video data, has been a research hotspot in intelligent video surveillance. Most research focuses on short-term Re-ID scenarios and addresses general problems such as occlusion, illumination change, and view variance. The appearance-change and similar-appearance problems that arise in long-term scenarios have received far less attention. This paper proposes a novel Re-ID framework consisting of a two-branch model that fuses appearance and gait features to overcome covariate changes. Firstly, we extract appearance features from a video sequence with ResNet50 and aggregate them by average pooling. Secondly, we design an improved gait representation that captures a person's motion information while excluding the effects of external covariates. Specifically, we accumulate the differences between consecutive silhouettes to form an active energy image (AEI) and then mask the mid-body part of the image with the Improved-Sobel-Masking operator to obtain the final gait representation, called ISMAEI. Thirdly, we combine the appearance features with the gait features to generate discriminative and robust fused features. Finally, the Euclidean norm is adopted to compute the distance between probe and gallery samples for person Re-ID. The proposed method is evaluated on the CASIA Gait Database B and TUM-GAID datasets. Experimental results demonstrate that it outperforms state-of-the-art methods in both Rank-1 accuracy and mAP.
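The pipeline summarized above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it shows AEI formation as an average of absolute frame-to-frame silhouette differences, a simple concatenation-style feature fusion, and Euclidean matching. The ResNet50 appearance extraction and the Improved-Sobel-Masking step are omitted, and all function names are illustrative assumptions.

```python
import math

def active_energy_image(silhouettes):
    # silhouettes: list of T frames, each a list of rows of 0/1 pixel values.
    # Accumulate absolute frame-to-frame differences (motion energy) and
    # average them into a single image of the same spatial size (the AEI).
    t = len(silhouettes)
    h, w = len(silhouettes[0]), len(silhouettes[0][0])
    aei = [[0.0] * w for _ in range(h)]
    for k in range(1, t):
        for i in range(h):
            for j in range(w):
                aei[i][j] += abs(silhouettes[k][i][j] - silhouettes[k - 1][i][j])
    return [[v / (t - 1) for v in row] for row in aei]

def l2_normalize(vec):
    # Scale a feature vector to unit Euclidean norm (guard against zero norm).
    n = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / n for x in vec]

def fuse(appearance_feat, gait_feat):
    # Simple fusion assumption: concatenate the two normalized branch features.
    return l2_normalize(appearance_feat) + l2_normalize(gait_feat)

def euclidean(a, b):
    # Euclidean distance between probe and gallery fused features.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
```

At retrieval time, a probe's fused feature would be compared against every gallery feature with `euclidean`, and the gallery identities ranked by ascending distance.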