Abstract

A segmented primary mirror telescope can meet high-resolution observation requirements only when operated in the co-phasing condition. However, co-phasing errors are always present and seriously degrade imaging quality. Precise phase alignment requires that the root-mean-square (RMS) wavefront error be less than λ/40, so high-precision detection of the tip-tilt error between segments is one of the key technologies for realizing co-phased imaging. In this paper, we propose a simple and efficient tip-tilt error detection method based on a single Convolutional Neural Network (CNN). Without any preprocessing, light intensity distribution images on the focal plane are used as the training data set, and a high-performance CNN model is built to learn the mapping between tip-tilt errors and these intensity images. After training, the CNN can accurately recover the tip-tilt errors from a single focal-plane intensity image. A simulation model of a three-segment telescope system is established to test the accuracy and robustness of the method. Test results show that the method achieves high-precision detection of tip-tilt error over a large range: a detection range of [-3λ, 3λ] with an accuracy of 7.820×10⁻³ λ RMS. The method is robust to piston error and CCD noise: the tolerance of CCD noise is 5 dB, and the tolerance of piston error is [-0.48λ, 0.48λ]. The method is simple, requires no complex hardware, and can be widely applied in segmented and deployable primary mirror telescopes.
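The abstract does not specify the network architecture, so the following is only a minimal sketch of the kind of CNN regression described: a model that maps a single focal-plane intensity image directly to tip-tilt coefficients. The image size (128×128), the number of outputs (tip and tilt for each of three segments), the layer widths, and the use of PyTorch with an MSE loss are all illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch only: a CNN that regresses tip-tilt coefficients
# from a single simulated focal-plane intensity image. Architecture,
# image size, and output count are assumptions, not the authors' design.
import torch
import torch.nn as nn

class TipTiltCNN(nn.Module):
    def __init__(self, n_outputs: int = 6):  # assumed: tip + tilt for 3 segments
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(128 * 16 * 16, 256), nn.ReLU(),  # assumes 128x128 input
            nn.Linear(256, n_outputs),                 # predicted tip-tilt values
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.regressor(self.features(x))

# Training treats tip-tilt detection as regression: minimize the error
# between predicted coefficients and the known errors used in simulation.
model = TipTiltCNN()
criterion = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One illustrative step on random placeholder data standing in for
# simulated focal-plane images and their ground-truth tip-tilt errors.
images = torch.randn(8, 1, 128, 128)
targets = torch.randn(8, 6)
optimizer.zero_grad()
loss = criterion(model(images), targets)
loss.backward()
optimizer.step()
```

At inference time, the trained model would be applied to a single intensity image to produce the tip-tilt estimate in one forward pass, consistent with the single-image detection described in the abstract.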
