Abstract
With the development of generative adversarial networks, super-resolution techniques that reconstruct a high-resolution image from a low-resolution one have achieved excellent reconstruction results. However, small, low-resolution images remain widespread, such as images captured by a thermal camera or with a lens far from the target. Super-resolution of extremely small target images is a challenging problem, mainly because a small infrared target occupies few pixels and exhibits weak features. Current optimization methods for tiny targets are mainly based on multi-scale feature fusion or super-resolution enhancement. Moreover, the low-resolution images of small targets used in training are usually obtained by downsampling high-resolution images, whose style differs from that of tiny targets in real detection applications, resulting in poor reconstruction quality. To address this problem, we propose a new super-resolution network: the Style Transformation Super-Resolution Generative Adversarial Network (STSRGAN). It contains two sub-networks: a style transformation GAN that converts the style of the input image, and a super-resolution GAN. STSRGAN first transforms a blurry infrared small target into a clear target whose distribution resembles that of the training set, and then increases its resolution to obtain a better enhancement effect. The discriminator judges whether its input comes from the generator or from a real image, helping the generator produce better super-resolution images. In addition, we built an infrared Unmanned Aerial Vehicle (UAV) small-target dataset in which target sizes are below 16 × 16 pixels. Experiments demonstrate that our method achieves better resolution enhancement of small infrared targets and outperforms other methods.
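The abstract describes a two-stage pipeline: a style transformation generator normalizes the appearance of a real-world blurry infrared patch, and a super-resolution generator then upscales it. The sketch below is only an illustration of this cascade under assumed layer choices (the specific convolutions, the PixelShuffle upsampling, and the 4× scale factor are not taken from the paper); the paper's actual STSRGAN architecture and its discriminators are not reproduced here.

```python
# Minimal sketch of a two-stage "style transformation -> super-resolution"
# inference cascade, as described in the abstract. All layer choices are
# illustrative assumptions, not the paper's architecture.
import torch
import torch.nn as nn


class StyleTransformGenerator(nn.Module):
    """Stage 1: map a blurry real-world IR small-target patch toward the
    'clean' style of the training distribution (hypothetical layers)."""
    def __init__(self, channels: int = 1):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, channels, 3, padding=1),
        )

    def forward(self, x):
        return self.body(x)


class SuperResolutionGenerator(nn.Module):
    """Stage 2: upsample the style-normalized patch (4x via PixelShuffle,
    an assumed choice for illustration)."""
    def __init__(self, channels: int = 1, scale: int = 4):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, channels * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),
        )

    def forward(self, x):
        return self.body(x)


if __name__ == "__main__":
    lr_patch = torch.randn(1, 1, 16, 16)       # 16 x 16 IR small-target patch
    style_gen = StyleTransformGenerator()
    sr_gen = SuperResolutionGenerator(scale=4)
    sr_patch = sr_gen(style_gen(lr_patch))     # style transfer, then SR
    print(sr_patch.shape)                      # torch.Size([1, 1, 64, 64])
```

In a GAN setting, each stage would additionally be trained against its own discriminator that distinguishes generated outputs from real images, as the abstract notes; only the generator-side inference path is sketched here.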