Abstract

Pedestrian detection has progressed significantly in recent years. However, detecting pedestrians at small scales or under heavy occlusion remains notoriously difficult. Moreover, the generalization ability of pre-trained detectors across different datasets remains to be improved. Both issues can be attributed to insufficient training data coverage. To cope with this, we present an efficient data augmentation scheme that transfers pedestrians from other datasets into the target scene with a novel Attribute Preserving Generative Adversarial Network (APGAN). The proposed methodology consists of two steps: pedestrian embedding and style transfer. The former simulates pedestrian images at various scales and occlusion levels, in any pose or background, thus greatly increasing data variation. The latter aims to make the generated samples more realistic while guaranteeing data coverage. To achieve this goal, we propose APGAN, which pursues both good visual quality and attribute preservation after style transfer. With the proposed method, we can perform effective sample augmentation to improve the generalization ability of the trained detectors and enhance their robustness to scale change and occlusion. Extensive experimental results validate the effectiveness and advantages of our method.
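
The abstract gives no implementation details, so the following is only a minimal sketch of the two-step augmentation pipeline it describes (pedestrian embedding followed by style-transfer harmonization). All function names, parameter ranges, and the `generator` interface are hypothetical illustrations, not the authors' released code; the APGAN generator itself is stood in for by an identity function.

```python
# Sketch of the two-step augmentation: (1) embed a pedestrian crop into a
# target scene at a random scale/position with simulated occlusion,
# (2) harmonize the composite with an attribute-preserving generator.
# Names and ranges here are illustrative assumptions, not the paper's code.
import random
import numpy as np
from PIL import Image


def embed_pedestrian(scene, pedestrian, scale_range=(0.3, 1.0), occlusion_range=(0.0, 0.5)):
    """Paste a pedestrian crop into the scene at a random scale, position,
    and occlusion level; return the new scene and the bounding box."""
    scene = scene.copy()

    # Randomly rescale the pedestrian to simulate different scales.
    scale = random.uniform(*scale_range)
    w = max(1, int(pedestrian.width * scale))
    h = max(1, int(pedestrian.height * scale))
    ped = pedestrian.resize((w, h))

    # Simulate occlusion by keeping only the top part of the crop.
    occ = random.uniform(*occlusion_range)
    visible_h = max(1, int(h * (1.0 - occ)))
    ped = ped.crop((0, 0, w, visible_h))

    # Place the (possibly occluded) pedestrian at a random location.
    x = random.randint(0, max(0, scene.width - w))
    y = random.randint(0, max(0, scene.height - visible_h))
    scene.paste(ped, (x, y))
    return scene, (x, y, x + w, y + visible_h)


def style_transfer(image, generator):
    """Harmonize the composite with a generator assumed to map an HxWx3
    float array in [0, 1] to another HxWx3 float array in [0, 1]."""
    arr = np.asarray(image, dtype=np.float32) / 255.0
    out = generator(arr)
    return Image.fromarray((np.clip(out, 0.0, 1.0) * 255).astype(np.uint8))


if __name__ == "__main__":
    scene = Image.new("RGB", (640, 480), color=(90, 110, 130))      # stand-in background
    pedestrian = Image.new("RGB", (60, 160), color=(200, 60, 40))   # stand-in pedestrian crop
    composite, box = embed_pedestrian(scene, pedestrian)
    augmented = style_transfer(composite, generator=lambda x: x)    # identity stand-in for APGAN
    print("new pedestrian box:", box)
```

The new bounding box returned by the embedding step would serve as the ground-truth annotation for the augmented training sample.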
