Abstract

Person re-identification (re-ID) technology has attracted extensive interest in critical applications of daily life, such as autonomous surveillance systems and intelligent control. However, lightweight and efficient person re-ID solutions are rare because limited computing resources cannot guarantee both accuracy and efficiency in detecting person features, which inevitably results in a performance bottleneck in real-time applications. To address this research challenge, this study developed a lightweight framework for generating a multi-attribute person feature. The framework consists of three sub-networks, each conforming to a convolutional neural network architecture: (1) the accessory attribute network (a-ANet) captures person ornament information to form an accessory descriptor; (2) the body attribute network (b-ANet) captures the person's region structure to form a body descriptor; and (3) the color attribute network (c-ANet) forms a color descriptor that maintains the color consistency of the person(s). Inspired by the human visual processing mechanism, these descriptors (each "descriptor" corresponds to an attribute of an individual person) are integrated via a tree-based feature-selection method to construct a global "feature", i.e., a multi-attribute descriptor of the person that serves as the key to identifying that person. Distance learning is then exploited to measure person similarity for the final re-identification. Experiments were performed on four public datasets to evaluate the proposed framework: CUHK-01, CUHK-03, Market-1501, and VIPeR. The results indicate that (1) the multi-attribute feature outperforms most existing feature-representation methods by 5–10% at rank@1 in terms of the cumulative matching curve criterion; and (2) the time required for recognition is as low as O(n), enabling real-time person re-ID applications.
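The pipeline described above — concatenating per-attribute descriptors, selecting informative dimensions with a tree-based method, and matching by distance — can be sketched as follows. This is a minimal illustration, not the paper's implementation: the descriptor dimensions, random data, the `ExtraTreesClassifier` selector, and the plain Euclidean nearest-neighbour matcher are all assumptions standing in for the unspecified a-ANet/b-ANet/c-ANet outputs and the learned distance metric.

```python
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.feature_selection import SelectFromModel

# Hypothetical descriptor dimensions; the paper does not specify them.
rng = np.random.default_rng(0)
n = 200
accessory = rng.normal(size=(n, 16))   # stand-in for a-ANet output
body = rng.normal(size=(n, 32))        # stand-in for b-ANet output
color = rng.normal(size=(n, 8))        # stand-in for c-ANet output
labels = rng.integers(0, 10, size=n)   # person identities

# Concatenate the three attribute descriptors into one vector per person.
features = np.hstack([accessory, body, color])

# Tree-based feature selection: keep dimensions with above-average
# importance for discriminating identities (one possible realization
# of the "tree-based feature-selection method").
forest = ExtraTreesClassifier(n_estimators=50, random_state=0)
forest.fit(features, labels)
selector = SelectFromModel(forest, prefit=True)
multi_attr = selector.transform(features)  # the global multi-attribute descriptor

# Match a query against a gallery by nearest neighbour under Euclidean
# distance — a simple stand-in for the paper's learned distance metric;
# a single linear scan over n gallery entries is O(n).
def rank1_match(query, gallery):
    dists = np.linalg.norm(gallery - query, axis=1)
    return int(np.argmin(dists))

best = rank1_match(multi_attr[0], multi_attr)
```

Matching a gallery descriptor against the gallery itself returns its own index (distance zero), which is a quick sanity check that the distance-based ranking behaves as intended.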
