Abstract
A novel approach is proposed in this paper to reconstruct the far-field radiation pattern from the phaseless electric field of an antenna scanned on a single near-field sphere. The method adopts the dipole equivalence approach to project the near-field electric field onto a spherically distributed array of electric dipoles. A term named the linear correlation error is introduced to quantify the error contributed by the linearly correlated portion of the least-squares problem associated with the dipole equivalence. It is demonstrated that, by iteratively minimizing the linear correlation error with the covariance matrix adaptation evolution strategy (CMA-ES), the near-field phase distribution can be recovered efficiently from the magnitude-only near field, after which the far-field radiation pattern can be computed. Two representative case studies validate the proposed method; results show good agreement between computations and simulations.
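The exact definition of the linear correlation error appears only in the paper body, so the following is a minimal Python sketch of the phase-retrieval loop described above, using the ordinary least-squares residual of the dipole-equivalence fit as a stand-in objective. The radiation matrix `A`, the problem sizes, and all variable names are illustrative assumptions; CMA-ES is taken from Hansen's `cma` package.

```python
# Hypothetical sketch of the phaseless near-field phase-retrieval loop.
# The LS residual below is a stand-in for the paper's linear correlation
# error; A, e_mag, and the sizes are placeholder assumptions.
import numpy as np
import cma  # pip install cma -- Hansen's CMA-ES implementation

rng = np.random.default_rng(0)
M, N = 64, 16  # near-field sample points, equivalent electric dipoles

# Placeholder dipole radiation matrix mapping dipole weights to the
# complex near field sampled on the measurement sphere.
A = rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))
e_true = A @ (rng.standard_normal(N) + 1j * rng.standard_normal(N))
e_mag = np.abs(e_true)  # magnitude-only (phaseless) near-field data

def objective(phi):
    """LS residual of the dipole-equivalence fit for trial phases phi,
    used here as a proxy for the paper's linear correlation error."""
    e_trial = e_mag * np.exp(1j * phi)               # candidate complex field
    w, *_ = np.linalg.lstsq(A, e_trial, rcond=None)  # equivalent dipole weights
    return np.linalg.norm(A @ w - e_trial) / np.linalg.norm(e_trial)

# Iterative search over the M unknown phases with CMA-ES.
es = cma.CMAEvolutionStrategy(np.zeros(M), 0.5,
                              {'maxfevals': 20000, 'verbose': -9})
es.optimize(objective)
phi_hat = es.result.xbest  # recovered near-field phase distribution
# The far-field pattern would then follow from the fitted dipole weights.
```

Note that any global phase offset leaves the objective unchanged, so the recovered phases are determined only up to a constant; this does not affect the reconstructed radiation pattern.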