Abstract

Fusing multispectral data is essential for object detection that must operate across diverse environments, yet research on how to fuse information between the two streams of a multispectral network remains limited. This paper proposes a novel multispectral interaction convolutional neural network (MICNN) for fusing information between multispectral networks. Unlike existing fusion methods, in which the two streams do not interact, the proposed MICNN incorporates information from each stream during training. The MICNN forces interaction in a simple way: it exchanges the weights of feature maps between the multispectral networks. It requires no learnable parameters and needs no additional computation or modification of the network structure. We verify the effectiveness of the MICNN on the KAIST multispectral pedestrian dataset and the YU far-infrared (FIR) pedestrian dataset.
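The abstract describes exchanging feature-map weights between the two streams without introducing learnable parameters. The exact weighting scheme is not given here, so the following is a minimal sketch under an assumption: each stream's per-channel weights are derived parameter-free (global average pooling followed by a channel-wise softmax) and then applied to the *other* stream's feature map. The function names `channel_weights` and `exchange` are illustrative, not from the paper.

```python
import numpy as np

def channel_weights(feat):
    # Parameter-free per-channel weights: global average pooling
    # over the spatial dimensions, then a softmax over channels.
    # feat: array of shape (C, H, W)
    means = feat.mean(axis=(1, 2))            # (C,)
    e = np.exp(means - means.max())           # numerically stable softmax
    return e / e.sum()                        # (C,), sums to 1

def exchange(feat_rgb, feat_ir):
    # Cross-apply each stream's channel weights to the other stream.
    # Scaling by the channel count C keeps the average channel gain at 1.
    C = feat_rgb.shape[0]
    w_rgb = channel_weights(feat_rgb)
    w_ir = channel_weights(feat_ir)
    out_rgb = feat_rgb * (C * w_ir)[:, None, None]
    out_ir = feat_ir * (C * w_rgb)[:, None, None]
    return out_rgb, out_ir
```

Because the exchange is a fixed elementwise reweighting, it adds no trainable parameters and leaves the network topology unchanged, matching the properties claimed for the MICNN fusion step.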
