Abstract
Synthetic aperture radar (SAR) image simulation can provide SAR target images under different scenes and imaging conditions at low cost. These simulated images can be applied to SAR target recognition, image interpretation, 3-D reconstruction, and many other fields. As high-resolution SAR images of targets under different imaging conditions accumulate, the simulation process should benefit from these real images. Accurate simulation parameters are one of the keys to obtaining high-quality simulation images. However, obtaining these parameters through actual target measurement or manual adjustment costs considerable time, effort, and resources. Because it is difficult to derive an analytical relationship between a SAR image and its simulation parameters, the abundant real SAR images currently contribute little to SAR simulation. In this article, a framework is proposed to learn the relationship between SAR images and simulation parameters by training a deep neural network (DNN), so that the simulation parameters can be extracted from real SAR images. Two DNNs, a convolutional neural network (CNN) and a generative adversarial network (GAN), are used to implement this framework. By modifying the network structures and constructing suitable training data, our DNNs learn the relationship between images and simulation parameters more effectively. Experimental results show that the DNNs can extract simulation parameters from real SAR images, which improves the similarity of the simulated images while automating the setting of simulation parameters. Compared with the CNN, the simulation parameters extracted by the GAN achieve better results at multiple azimuth angles.
Published in: IEEE Transactions on Geoscience and Remote Sensing