Abstract

This paper presents a novel, efficient method for the gridless line spectrum estimation problem with a single snapshot and sparse signals, namely the gradient descent least squares (GDLS) method. Conventional single-snapshot (a.k.a. single measurement vector, or SMV) line spectrum estimation methods either rely on smoothing techniques that sacrifice range and/or azimuth resolution, or adopt a sparsity constraint and apply compressed sensing (CS) over predefined grids, which gives rise to the off-grid problem. Recently emerged atomic norm minimization (ANM) methods achieve gridless SMV line spectrum estimation, but their computational complexity is extremely high, making them practically infeasible for real applications with large problem scales. Our proposed GDLS method reformulates the line spectrum estimation problem as a least squares (LS) estimation problem and minimizes the corresponding objective function efficiently via an iterative gradient descent algorithm. The convergence guarantee, computational complexity, and performance analysis for the evenly spaced antenna array case are discussed in this paper. Numerical simulations show that the proposed GDLS algorithm outperforms state-of-the-art methods, e.g., CS and ANM, in terms of estimation performance. It completely avoids the off-grid problem, and its computational complexity is significantly lower than that of ANM. Our method has been tested in tomographic SAR (TomoSAR) imaging applications on both simulated and real experimental data. Results show the great potential of the proposed method in terms of better point cloud quality and elimination of the gridding effect.
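To make the core idea concrete, below is a minimal sketch of least squares line spectrum estimation driven by gradient descent over continuous (hence gridless) frequencies. It assumes a single-snapshot model y = A(f)c + n built from complex sinusoids; the variable names, step size, initialization near the true frequencies, and the closed-form amplitude update are illustrative assumptions, not the paper's exact GDLS formulation.

import numpy as np

# Minimal sketch of least squares line spectrum estimation via gradient
# descent over continuous frequencies (gridless). All names, sizes, and
# hyperparameters are illustrative assumptions, not the paper's GDLS.

rng = np.random.default_rng(0)
N, K = 64, 3                                  # snapshot length, number of sinusoids
n = np.arange(N)

# Synthetic single snapshot with off-grid frequencies
f_true = np.array([0.1023, 0.2517, 0.4208])   # normalized frequencies in [0, 1)
c_true = np.array([1.0, 0.8, 0.6]) * np.exp(1j * rng.uniform(0, 2 * np.pi, K))
y = np.exp(2j * np.pi * np.outer(n, f_true)) @ c_true
y += 0.05 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))

def steering(f):
    """Steering matrix A(f) with one complex sinusoid per column."""
    return np.exp(2j * np.pi * np.outer(n, f))

# Initialization: in practice near coarse FFT peaks; perturbed truth here for brevity
f = f_true + 0.003 * rng.standard_normal(K)
c = np.linalg.lstsq(steering(f), y, rcond=None)[0]

lr = 1e-7                                     # step size for the frequency updates
for _ in range(1000):
    A = steering(f)
    r = A @ c - y                             # residual of the LS objective ||A(f)c - y||^2
    dA = 2j * np.pi * n[:, None] * A          # dA/df: column k differentiated w.r.t. f[k]
    # Gradient of the real-valued LS objective w.r.t. each continuous frequency
    grad_f = 2 * np.real(np.sum(np.conj(dA * c) * r[:, None], axis=0))
    f -= lr * grad_f
    # Amplitudes enter the model linearly: re-solve them in closed form each step
    c = np.linalg.lstsq(steering(f), y, rcond=None)[0]

print("estimated frequencies:", np.sort(f))
print("true frequencies     :", f_true)

Because the amplitudes are linear in the model, re-solving them by ordinary least squares at every iteration (a variable projection step) leaves gradient descent responsible only for the frequencies, which is what removes any dependence on a predefined grid.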
