Abstract

Infrared (IR) small target detection is one of the most fundamental techniques in the infrared search and track (IRST) system. Due to the interference caused by background clutter and image noise, conventional IR small target detection algorithms often suffer from a high false alarm rate and are unable to achieve robust performance in complex scenes. To accurately distinguish IR small targets from the background, we propose a total variation (TV)-based interframe infrared patch-image model that regards the long-distance IR small target detection task as an optimization problem. First, the input IR image is converted to a patch-image, which is modeled as the sum of a sparse target matrix and a low-rank background matrix. Then, the interframe similarity of target appearance is utilized to impose a temporal consistency constraint on the target matrix. Next, a TV regularization term is introduced to further alleviate the false alarms generated by noise. Finally, an alternating optimization algorithm using singular value decomposition (SVD) and accelerated proximal gradient (APG) is designed to mathematically solve the proposed model. Both qualitative and quantitative experiments implemented on real IR sequences demonstrate that our model outperforms other traditional IR small target detection methods in terms of the signal-to-clutter ratio gain (SCRG) and the background suppression factor (BSF).
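To make the patch-image idea concrete, the following is a minimal NumPy sketch of the general pipeline the abstract outlines: sliding-window patches are stacked into a patch-image, which is then split into a low-rank background part and a sparse target part by a basic alternating scheme (singular value thresholding plus soft thresholding). The interframe temporal constraint, the TV regularizer, and the exact APG solver of the proposed model are not reproduced here; the window sizes, thresholds, and helper names are illustrative assumptions rather than values taken from the paper.

```python
# Simplified sketch of patch-image construction and low-rank/sparse separation.
# This is NOT the paper's full model: it omits the interframe consistency term,
# the TV regularizer, and the APG solver, and all parameters are assumptions.
import numpy as np

def to_patch_image(img, patch=30, step=10):
    """Stack vectorized sliding-window patches as columns of a patch-image."""
    h, w = img.shape
    cols = []
    for y in range(0, h - patch + 1, step):
        for x in range(0, w - patch + 1, step):
            cols.append(img[y:y + patch, x:x + patch].ravel())
    return np.stack(cols, axis=1)  # shape: (patch*patch, num_patches)

def svt(M, tau):
    """Singular value thresholding: shrink singular values toward zero."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(M, tau):
    """Elementwise soft thresholding (promotes sparsity of the target part)."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def separate(D, lam=None, n_iter=50):
    """Alternately estimate low-rank background B and sparse target T."""
    if lam is None:
        lam = 1.0 / np.sqrt(max(D.shape))   # common RPCA-style default
    mu = 0.25 * np.abs(D).mean()            # heuristic shrinkage level
    B = np.zeros_like(D)
    T = np.zeros_like(D)
    for _ in range(n_iter):
        B = svt(D - T, mu)                  # low-rank background update
        T = soft(D - B, lam * mu)           # sparse target update
    return B, T

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frame = rng.normal(100, 5, size=(128, 128))   # synthetic IR-like background
    frame[60:63, 70:73] += 60.0                   # small bright target
    D = to_patch_image(frame)
    B, T = separate(D)
    print("approx. background rank:", np.linalg.matrix_rank(B, tol=1e-3))
    print("target energy:", np.abs(T).sum())
```

In a full implementation along the lines of the abstract, the sparse update would additionally enforce temporal consistency across consecutive frames and include the TV penalty, and the alternating loop would be replaced by the APG-based solver described in the paper.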
