Abstract

A pedestrian detector that uses visible and thermal infrared image pairs as input achieves better detection performance under challenging illumination conditions than a detector that uses only visible images. With the aim of efficiently and effectively fusing the complementary information in visible and infrared images, this paper proposes an adaptive spatial pixel-level feature fusion network, called ASPFF Net, which adaptively extracts spatial and pixel-level features from visible and infrared images for fusion. Specifically, two lightweight networks with different weights are first used to extract multi-scale features from the visible and infrared images. Next, for features of the same scale but different modalities, the fusion weights of different spatial positions and pixels in the two feature maps are obtained by a spatial attention module (SAM) and a pixel attention module (PAM). The original visible and infrared features are recalibrated by these fusion weights to obtain multi-scale fused feature layers. Finally, pedestrians at different scales are detected on the fused multi-scale feature layers. Compared with other recent multispectral pedestrian detectors on the reasonable subset of the KAIST multispectral pedestrian detection dataset, the proposed detector offers an attractive balance between speed and accuracy. Extensive experiments on the KAIST dataset demonstrate the effectiveness of the proposed method for fusing visible and infrared images in multispectral pedestrian detection.
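
To make the described fusion step concrete, below is a minimal sketch, assuming a PyTorch-style implementation, of how spatial and pixel attention weights might recalibrate and combine same-scale visible and infrared feature maps. The module names, layer configurations, and the additive combination are illustrative assumptions, not the paper's exact ASPFF Net design.

```python
# Hypothetical sketch of spatial/pixel attention fusion for two modalities.
# All module names, layer sizes, and the additive fusion are assumptions.
import torch
import torch.nn as nn


class SpatialAttention(nn.Module):
    """Produce a 1-channel spatial weight map (B x 1 x H x W) from a feature map."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg_pool = x.mean(dim=1, keepdim=True)         # channel-wise average
        max_pool = x.max(dim=1, keepdim=True).values   # channel-wise maximum
        return torch.sigmoid(self.conv(torch.cat([avg_pool, max_pool], dim=1)))


class PixelAttention(nn.Module):
    """Produce a per-pixel, per-channel weight map with the same shape as x."""
    def __init__(self, channels: int):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.conv(x))


class AdaptiveFusion(nn.Module):
    """Recalibrate and fuse same-scale visible/infrared feature maps."""
    def __init__(self, channels: int):
        super().__init__()
        self.sam_vis, self.sam_ir = SpatialAttention(), SpatialAttention()
        self.pam_vis, self.pam_ir = PixelAttention(channels), PixelAttention(channels)

    def forward(self, f_vis: torch.Tensor, f_ir: torch.Tensor) -> torch.Tensor:
        # Weight each modality by its own spatial and pixel attention maps,
        # then sum; element-wise addition is one plausible fusion choice.
        vis = f_vis * self.sam_vis(f_vis) * self.pam_vis(f_vis)
        ir = f_ir * self.sam_ir(f_ir) * self.pam_ir(f_ir)
        return vis + ir


if __name__ == "__main__":
    fuse = AdaptiveFusion(channels=256)
    f_vis = torch.randn(1, 256, 40, 32)   # visible-stream feature map
    f_ir = torch.randn(1, 256, 40, 32)    # infrared-stream feature map
    print(fuse(f_vis, f_ir).shape)        # torch.Size([1, 256, 40, 32])
```

In this sketch the fused map would be produced once per scale, so the same module could be applied to each level of the two multi-scale feature pyramids before detection.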
