Abstract

The extensive deployment of surveillance cameras in public places, such as subway stations and shopping malls, necessitates automated visual-data processing approaches to match pedestrians across multiple non-overlapping cameras. However, due to the insufficient number of labeled training samples in real surveillance scenes, it is difficult to train an effective deep neural network for cross-camera pedestrian recognition. Moreover, cross-camera variation in viewpoint, illumination, and background makes the task even more challenging. To address these issues, in this paper we propose to transfer the parameters of a pre-trained network to our target network and then update the parameters adaptively using training samples from the target domain. More importantly, we develop new network structures specially tailored to the cross-camera pedestrian recognition task and implement a simple yet effective multi-level feature fusion method that yields more discriminative and robust features for pedestrian recognition. Specifically, rather than conventionally performing classification on the single-level feature of the last feature layer, we utilize multi-level features by associating feature visualization with multi-level feature fusion. As another contribution, we have published our code and extracted features to facilitate further research. Extensive experiments on the WARD, PRID, and MARS datasets show that the proposed method consistently outperforms state-of-the-art approaches.
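To make the transfer-and-fuse idea concrete, the sketch below illustrates one plausible reading of it: parameters are transferred from a pre-trained backbone (a torchvision ResNet-50 is assumed here purely for illustration; the abstract does not name the backbone), and globally pooled features from several intermediate stages are concatenated with the last feature layer before classification, rather than classifying on the last layer alone. This is a minimal, hypothetical sketch, not the authors' published implementation.

```python
import torch
import torch.nn as nn
from torchvision import models


class MultiLevelFusionNet(nn.Module):
    """Hypothetical sketch of multi-level feature fusion on a transferred backbone."""

    def __init__(self, num_ids: int):
        super().__init__()
        # Transfer pre-trained parameters to the target network; all stages
        # remain trainable so they can be updated adaptively on target-domain samples.
        backbone = models.resnet50(weights="IMAGENET1K_V1")
        self.stem = nn.Sequential(backbone.conv1, backbone.bn1,
                                  backbone.relu, backbone.maxpool)
        self.layer1, self.layer2 = backbone.layer1, backbone.layer2
        self.layer3, self.layer4 = backbone.layer3, backbone.layer4
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Fused descriptor dimension = sum of channel counts of the fused levels
        # (512 + 1024 + 2048 for ResNet-50 stages 2-4).
        self.classifier = nn.Linear(512 + 1024 + 2048, num_ids)

    def forward(self, x: torch.Tensor):
        x = self.stem(x)
        x = self.layer1(x)
        f2 = self.layer2(x)   # mid-level feature map
        f3 = self.layer3(f2)  # higher-level feature map
        f4 = self.layer4(f3)  # last feature layer
        # Global-pool each level and concatenate into a single descriptor,
        # instead of classifying on the last-layer feature only.
        feats = [self.pool(f).flatten(1) for f in (f2, f3, f4)]
        fused = torch.cat(feats, dim=1)
        return self.classifier(fused), fused


# Usage example: logits for identity classification during training,
# and the fused descriptor for cross-camera matching at test time.
model = MultiLevelFusionNet(num_ids=751)
logits, descriptor = model(torch.randn(4, 3, 256, 128))
```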
