In ground-based high-resolution solar observation, ground-layer adaptive optics (GLAO) offers a significant advancement by compensating for wavefront aberrations caused by near-ground atmospheric turbulence. GLAO overcomes the field-of-view limitation of conventional adaptive optics (AO), which is constrained by turbulence anisoplanatism. In conventional solar AO, wavefront sensing requires correlation calculations across an extended guide region to extract wavefront information. However, the cross-correlation algorithm with a larger guide region smooths the wavefront distortions caused by high-altitude turbulence, reducing the accuracy of wavefront detection along a single line of sight. This effect is particularly pronounced in GLAO systems, which use large guide regions for ground-layer wavefront sensing. This paper proposes a theoretical model for GLAO wavefront sensing using correlating Shack-Hartmann wavefront sensors targeting extended objects. We quantitatively analyze the impact of guide region size on wavefront detection accuracy and validate our findings through simulation experiments. Simulation results indicate that for a 1-meter telescope, the root mean square (RMS) of the detected aberrations differs from the RMS of the ground-layer aberrations by less than 1/30 wavelength when using the CP7 model at 60" and the MK8 model at 140". Results also indicate that enlarging the guide region effectively enables the detection of ground-layer wavefront aberrations. This finding not only provides valuable guidance for achieving accurate ground-layer wavefront sensing in GLAO systems, but also offers what we believe to be new solutions for optimizing GLAO system performance.
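The correlating Shack-Hartmann sensing described in the abstract rests on estimating the shift of an extended guide region image in each subaperture relative to a reference image. A minimal sketch of that core step, using FFT-based cross-correlation (function and variable names here are illustrative, not from the paper):

```python
import numpy as np

def correlation_shift(ref, img):
    """Estimate the (row, col) shift of img relative to ref via
    FFT-based cross-correlation, the core operation of a correlating
    Shack-Hartmann wavefront sensor observing an extended object."""
    # Cross-correlation theorem: correlate in the Fourier domain.
    corr = np.fft.ifft2(np.conj(np.fft.fft2(ref)) * np.fft.fft2(img)).real
    # The correlation peak marks the integer-pixel shift (cyclically wrapped).
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Unwrap shifts larger than half the window to signed values.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))

# Synthetic demo: a random extended scene and a known shifted copy.
rng = np.random.default_rng(0)
scene = rng.random((64, 64))
shifted = np.roll(scene, (3, -2), axis=(0, 1))
print(correlation_shift(scene, shifted))  # → (3, -2)
```

A larger guide region averages the shifts over a wider field angle, which, as the abstract notes, suppresses the contribution of high-altitude layers and leaves mainly the ground-layer wavefront; practical sensors also add subpixel peak interpolation, which is omitted here.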