Abstract

The accumulation of aberrations along the optical path in a telescope produces distortions and speckles in the resulting images, limiting the performance of cameras at high angular resolution. It is important to achieve the highest possible sensitivity to faint sources, using both hardware and data analysis software. While analytic methods are efficient, real systems are better modeled numerically; however, numerical models of complicated optical systems with many parameters can be hard to understand, optimize, and apply. Automatic differentiation, or “backpropagation,” software developed for machine-learning applications now makes it straightforward to calculate derivatives with respect to aberrations in arbitrary planes of any optical system. We apply this powerful new tool to the problem of high-angular-resolution astronomical imaging. Self-calibrating observables such as the “closure phase” or “bispectrum” have been widely used in optical and radio astronomy to mitigate optical aberrations and achieve high-fidelity imagery. Kernel phases are a generalization of closure phases valid in the limit of small phase errors. Using automatic differentiation, we reproduce existing kernel phase theory within this framework and demonstrate an extension to the case of a Lyot coronagraph, which is found to have self-calibrating combinations of speckles that are resistant to phase noise, but only in the very high-wave-front-quality regime. As an illustrative example, we reanalyze Palomar adaptive optics observations of the binary α Ophiuchi, finding consistency between the new pipeline and the existing standard. We present morphine, a new Python package for optical simulation with automatic differentiation that incorporates these ideas, with an interface similar to the popular package poppy. These methods may be useful for designing improved astronomical optical systems by gradient descent.
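To make the central idea concrete, the sketch below is a minimal illustration, not the morphine API: it assumes JAX as the automatic-differentiation backend, a simple circular pupil, two hypothetical low-order aberration modes, and a Strehl-like image-plane metric, and it obtains the gradient of that metric with respect to pupil-plane aberration coefficients by backpropagating through a Fraunhofer propagation model.

```python
# Minimal sketch, not the morphine API: automatic differentiation through a
# simple Fraunhofer propagation model with JAX. The pupil, aberration modes,
# and metric below are illustrative assumptions, not taken from the paper.
import jax
import jax.numpy as jnp

N = 128                                             # grid size (illustrative)
y, x = jnp.mgrid[-N // 2:N // 2, -N // 2:N // 2] / (N / 4)
r = jnp.hypot(x, y)
pupil = (r <= 1.0).astype(jnp.float32)              # circular aperture

# Two hypothetical low-order aberration modes on the pupil (tilt, defocus-like).
modes = jnp.stack([x * pupil, (2.0 * r**2 - 1.0) * pupil])

def peak_intensity(coeffs):
    """Normalized on-axis PSF intensity (a Strehl-like metric) for the given
    mode coefficients in radians, via Fraunhofer propagation (an FFT)."""
    phase = jnp.tensordot(coeffs, modes, axes=1)     # pupil-plane phase map
    field = pupil * jnp.exp(1j * phase)
    focal = jnp.fft.fftshift(jnp.fft.fft2(jnp.fft.ifftshift(field)))
    psf = jnp.abs(focal) ** 2
    return psf[N // 2, N // 2] / psf.sum()

# Gradient of the image-plane metric with respect to pupil-plane aberrations,
# obtained automatically by backpropagating through the optical model.
coeffs = jnp.array([0.3, -0.2])                      # radians of each mode
print(jax.grad(peak_intensity)(coeffs))
```

The same machinery applies to vector-valued observables: jax.jacobian of a function returning, e.g., the Fourier phases of the image gives the linear response to pupil-plane phase errors, whose null space is the starting point for kernel phase analysis in this framework.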