Abstract

Infrared (IR) small target detection is one of the most fundamental techniques in the infrared search and track (IRST) system. Owing to the interference caused by background clutter and image noise, conventional IR small target detection algorithms often suffer from a high false alarm rate and cannot achieve robust performance in complex scenes. To accurately distinguish IR small targets from the background, we propose a total variation (TV)-based interframe infrared patch-image model that formulates long-distance IR small target detection as an optimization problem. First, the input IR image is converted into a patch-image that consists of a sparse target matrix and a low-rank background matrix. Then, the interframe similarity of target appearance is exploited to impose a temporal consistency constraint on the target matrix. Next, a TV regularization term is introduced to further suppress the false alarms generated by noise. Finally, an alternating optimization algorithm based on singular value decomposition (SVD) and the accelerated proximal gradient (APG) method is designed to solve the proposed model. Both qualitative and quantitative experiments on real IR sequences demonstrate that our model outperforms other traditional IR small target detection methods in terms of the signal-to-clutter ratio gain (SCRG) and the background suppression factor (BSF).
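To make the low-rank/sparse separation underlying patch-image methods concrete, the sketch below performs a single-frame decomposition of a patch-image D into a low-rank background B and a sparse target component T. This is a minimal illustration, not the paper's solver: it omits the interframe consistency constraint and the TV term, and it uses a plain block-coordinate alternation on a penalized robust-PCA objective (singular value thresholding for the background, soft-thresholding for the target); all function names, parameters, and the toy data are assumptions for demonstration.

```python
# Hypothetical sketch of the low-rank + sparse split at the heart of
# patch-image IR target detection: D (patch-image) = B (low-rank background)
# + T (sparse targets). Block-coordinate descent on the penalized objective
#   ||B||_* + lam * ||T||_1 + 0.5 * ||D - B - T||_F^2
# The paper's actual model further couples consecutive frames and adds a
# TV regularizer; those terms are omitted here.
import numpy as np

def svt(X, tau):
    """Singular value thresholding: proximal operator of the nuclear norm."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(X, tau):
    """Elementwise soft-thresholding: proximal operator of the L1 norm."""
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def separate(D, lam=None, n_iter=100):
    """Split patch-image D into low-rank background B and sparse target T."""
    if lam is None:
        lam = 1.0 / np.sqrt(max(D.shape))   # common robust-PCA default weight
    B = np.zeros_like(D)
    T = np.zeros_like(D)
    for _ in range(n_iter):
        B = svt(D - T, 1.0)                 # background update (nuclear-norm prox)
        T = soft(D - B, lam)                # target update (L1 prox)
    return B, T

# Toy usage: in practice D is built by sliding a window over the IR frame and
# stacking the vectorized patches as columns; the target image is then
# reconstructed from T. Here we fake a low-rank background plus one bright spike.
rng = np.random.default_rng(0)
D = 0.1 * rng.normal(size=(50, 5)) @ rng.normal(size=(5, 200))  # low-rank clutter
D[10, 42] += 5.0                            # injected "target" entry
B, T = separate(D)
print("strongest target response at:", np.unravel_index(np.abs(T).argmax(), T.shape))
```

Under these assumptions the strongest entry of T coincides with the injected spike, which is the same separation principle the full model refines with temporal consistency and TV regularization.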
