Abstract

Synthetic Aperture Radar (SAR) systems provide two-dimensional (range and azimuth), high-resolution radar images for various geological prospecting applications. Due to the high computational demands, most SAR imaging systems, whether airborne or spaceborne, are limited to offline processing, especially in spaceborne scenarios. Techniques based on Field Programmable Gate Arrays (FPGAs) offer a potential solution that satisfies the computing constraints. However, implementing an entire SAR imaging processing system with floating-point arithmetic on an FPGA is inefficient. One of the most challenging problems in fixed-point processing is optimizing the word length so as to reduce the impact of fixed-point errors on resolution. In this paper, we theoretically analyze the finite-word-length computing errors of a SAR imaging system and propose a mathematical error model. We then derive an asymptotically optimal expression for the system-level output noise-to-signal ratio. Based on this expression, we apply the proposed methodology to various SAR imaging algorithms, including the Range Doppler (RD) and Chirp Scaling (CS) algorithms. To validate the proposed method, we implement a SAR imaging system on an FPGA-based platform. The run-time results show that the proposed method achieves usable image quality, as assessed by metrics such as the Integrated Side Lobe Ratio (ISLR), Peak Side Lobe Ratio (PSLR), and Relative Mean Square Deviation (RMSD).
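
The paper's error model is not reproduced here, but the following minimal Python sketch illustrates the underlying trade-off it addresses: quantizing the input of a range-compression (matched-filter) stage, a core step of both the RD and CS algorithms, to a given number of fractional bits, and measuring the resulting output noise-to-signal ratio against a floating-point reference. All parameter values (sample rate, pulse width, chirp rate) are illustrative assumptions, and only input quantization is modeled here, not the full fixed-point datapath analyzed in the paper.

```python
# Sketch (not from the paper): how fixed-point word length affects the
# output noise-to-signal ratio (NSR) of a matched-filter range-compression
# stage. Parameters below are illustrative assumptions, not the paper's.
import numpy as np

def quantize(x, frac_bits):
    """Round a signal onto a fixed-point grid with `frac_bits` fractional bits."""
    step = 2.0 ** -frac_bits
    return np.round(x / step) * step

# Linear-FM (chirp) pulse, the usual SAR range waveform.
fs, T, K = 100e6, 10e-6, 5e11            # sample rate, pulse width, chirp rate
t = np.arange(-T / 2, T / 2, 1 / fs)
chirp = np.exp(1j * np.pi * K * t**2)

# Floating-point reference: range compression via FFT-domain matched filtering.
ref = np.fft.ifft(np.fft.fft(chirp) * np.conj(np.fft.fft(chirp)))

for bits in (8, 12, 16):
    # Quantize real and imaginary parts separately, as a fixed-point
    # datapath would, then repeat the same processing chain.
    cq = quantize(chirp.real, bits) + 1j * quantize(chirp.imag, bits)
    out = np.fft.ifft(np.fft.fft(cq) * np.conj(np.fft.fft(cq)))
    nsr = np.sum(np.abs(out - ref) ** 2) / np.sum(np.abs(ref) ** 2)
    print(f"{bits:2d} fractional bits -> output NSR = {nsr:.3e}")
```

In this simplified setting the NSR falls roughly 6 dB per added bit; the paper's contribution is a system-level version of this relationship that also accounts for errors accumulated through the full imaging chain, which is what makes word-length selection tractable for an FPGA implementation.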
