Abstract

Phase differences in the far field of a coherently illuminated object are used to estimate the two-dimensional phase in the measurement plane of an imaging system. A previously derived phase-correlation function is used in a minimum-variance phase-estimation algorithm to map phase-difference measurements optimally to estimates of the phase on a grid of points in the measurement plane. Theoretical and computer-simulation comparisons between the minimum-variance phase estimator and conventional least-squares estimators are made. The minimum-variance phase estimator produces a lower aperture-averaged mean-square phase error for all values of a sampling parameter β.
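The comparison described above can be illustrated with a minimal one-dimensional sketch. The code below is not the paper's algorithm: it substitutes a generic Gaussian correlation for the paper's derived phase-correlation function, works on a 1-D grid rather than a two-dimensional aperture, and uses assumed values for the grid size and noise level. It contrasts a least-squares reconstruction of phase from noisy phase differences (via a pseudo-inverse of the difference operator) with a minimum-variance (MMSE) reconstruction that exploits the prior phase covariance.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 32           # number of grid points (assumed)
sigma_n = 0.05   # phase-difference measurement noise std (assumed)

# Stand-in prior: Gaussian phase-correlation function (the paper uses
# a previously derived correlation function instead)
x = np.arange(n)
C_phi = np.exp(-((x[:, None] - x[None, :]) / 8.0) ** 2)

# First-difference operator: s_i = phi_{i+1} - phi_i + noise
D = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)

# Draw a phase realization from the prior and simulate measurements
L = np.linalg.cholesky(C_phi + 1e-9 * np.eye(n))
phi = L @ rng.standard_normal(n)
s = D @ phi + sigma_n * rng.standard_normal(n - 1)

# Least-squares estimate (phase is recoverable only up to piston)
phi_ls = np.linalg.pinv(D) @ s

# Minimum-variance (MMSE) estimate using the prior covariance:
#   phi_mv = C_phi D^T (D C_phi D^T + C_n)^{-1} s
C_n = sigma_n ** 2 * np.eye(n - 1)
phi_mv = C_phi @ D.T @ np.linalg.solve(D @ C_phi @ D.T + C_n, s)

def piston_removed_mse(est):
    # Compare estimates after removing the unobservable mean (piston)
    e = (est - est.mean()) - (phi - phi.mean())
    return float(np.mean(e ** 2))

print(f"least-squares MSE:    {piston_removed_mse(phi_ls):.4f}")
print(f"minimum-variance MSE: {piston_removed_mse(phi_mv):.4f}")
```

When the assumed prior matches the true phase statistics, the minimum-variance estimate has lower expected mean-square error than least squares, mirroring the abstract's conclusion; averaging the MSE over many noise realizations makes the gap visible even at low noise.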
