Abstract

Recently, convolutional neural networks (CNNs) have brought impressive improvements to object detection. However, detecting targets in infrared images remains challenging, because the poor texture information, low resolution, and high noise levels of thermal imagery restrict the feature extraction ability of CNNs. To address these feature extraction difficulties, we propose a novel backbone network named Deep-IRTarget, composed of a frequency feature extractor, a spatial feature extractor, and a dual-domain feature resource allocation model. A Hypercomplex Infrared Fourier Transform is developed to compute infrared intensity saliency by designing hypercomplex representations in the frequency domain, while a convolutional neural network extracts feature maps in the spatial domain. Features from the frequency and spatial domains are stacked to construct dual-domain features. To integrate and recalibrate them efficiently, we propose a Resource Allocation model for Features (RAF). RAF employs a channel attention block and a position attention block to model interdependencies along the channel and position dimensions, respectively, capturing channel-wise and position-wise contextual information. Extensive experiments are conducted on three challenging infrared imagery databases. We achieve mAP improvements of 10.14%, 9.1%, and 8.05% over the current state-of-the-art method on MWIR, BITIR, and WCIR, respectively.
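
The sketch below illustrates one plausible realization of the dual-domain fusion and recalibration described above: stacked frequency- and spatial-domain features are reduced and passed through a position attention block and a channel attention block. The class names (PositionAttention, ChannelAttention, RAF), channel sizes, and the exact attention formulation are illustrative assumptions, not the authors' released implementation.

```python
# Hedged sketch (PyTorch): dual-domain feature recalibration with channel- and
# position-attention blocks, in the spirit of the RAF module described in the
# abstract. All names and hyperparameters here are assumptions for illustration.
import torch
import torch.nn as nn


class PositionAttention(nn.Module):
    """Capture position-wise (spatial) contextual dependencies."""
    def __init__(self, channels):
        super().__init__()
        self.query = nn.Conv2d(channels, channels // 8, 1)
        self.key = nn.Conv2d(channels, channels // 8, 1)
        self.value = nn.Conv2d(channels, channels, 1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.query(x).flatten(2).transpose(1, 2)   # B x HW x C'
        k = self.key(x).flatten(2)                      # B x C' x HW
        attn = torch.softmax(q @ k, dim=-1)             # B x HW x HW
        v = self.value(x).flatten(2)                    # B x C x HW
        out = (v @ attn.transpose(1, 2)).view(b, c, h, w)
        return self.gamma * out + x


class ChannelAttention(nn.Module):
    """Model interdependencies among channel dimensions."""
    def __init__(self):
        super().__init__()
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        b, c, h, w = x.shape
        feat = x.flatten(2)                                        # B x C x HW
        attn = torch.softmax(feat @ feat.transpose(1, 2), dim=-1)  # B x C x C
        out = (attn @ feat).view(b, c, h, w)
        return self.gamma * out + x


class RAF(nn.Module):
    """Fuse frequency- and spatial-domain features, then recalibrate them."""
    def __init__(self, freq_channels, spatial_channels, out_channels):
        super().__init__()
        self.reduce = nn.Conv2d(freq_channels + spatial_channels, out_channels, 1)
        self.pos_attn = PositionAttention(out_channels)
        self.chn_attn = ChannelAttention()

    def forward(self, freq_feat, spatial_feat):
        # Stack the dual-domain features along the channel axis, then recalibrate.
        x = self.reduce(torch.cat([freq_feat, spatial_feat], dim=1))
        return self.pos_attn(x) + self.chn_attn(x)


# Toy usage: 64-channel frequency saliency maps + 256-channel CNN feature maps.
raf = RAF(freq_channels=64, spatial_channels=256, out_channels=128)
fused = raf(torch.randn(1, 64, 40, 40), torch.randn(1, 256, 40, 40))
print(fused.shape)  # torch.Size([1, 128, 40, 40])
```

The two attention branches are summed so that position-wise and channel-wise context each contribute to the recalibrated feature map; how the paper actually combines the branches may differ.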
