Abstract

To improve the wavefront distortion correction performance of the classical stochastic parallel gradient descent (SPGD) algorithm, an optimized algorithm based on Nesterov-accelerated adaptive momentum estimation is proposed. It adopts a modified second-order momentum term and a linearly varying gain coefficient to improve iterative stability, and it integrates the Nesterov momentum term with a modified Adam optimizer to further accelerate convergence, correct the gradient descent direction in a timely fashion, and avoid falling into local extrema. In addition, to demonstrate the algorithm's performance, a wavefront sensorless adaptive optics system model is established using a 6×6-element deformable mirror as the wavefront corrector. Simulation results show that, compared with the SPGD algorithm, the proposed algorithm converges faster, and its Strehl ratio after convergence is nearly 6.25 times that of the SPGD algorithm. The effectiveness and superiority of the proposed algorithm are further verified by comparison with two existing optimization algorithms.
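The combination described above can be sketched in code. The following is a minimal, hedged illustration of an SPGD loop whose update rule is replaced by a Nadam-style (Nesterov-accelerated Adam) step with a linearly decaying gain coefficient; the toy quadratic metric, perturbation amplitude, gain schedule, and all parameter values are illustrative assumptions, not the paper's actual system model or settings.

```python
import numpy as np

def spgd_nadam(J, u0, iters=200, sigma=0.05,
               gain0=0.5, gain_min=0.05,
               beta1=0.9, beta2=0.999, eps=1e-8, seed=0):
    """Maximize a performance metric J (a stand-in for the Strehl ratio)
    over actuator voltages u, using an SPGD gradient estimate combined
    with a Nadam-style update. Illustrative sketch, not the paper's code."""
    rng = np.random.default_rng(seed)
    u = u0.astype(float).copy()
    m = np.zeros_like(u)   # first-order momentum estimate
    v = np.zeros_like(u)   # second-order momentum estimate
    for t in range(1, iters + 1):
        # SPGD step: random +/-sigma perturbation of all actuators at once,
        # gradient estimated from the resulting change in the metric.
        delta = sigma * rng.choice([-1.0, 1.0], size=u.shape)
        dJ = J(u + delta) - J(u - delta)
        g = dJ * delta / (2 * sigma**2)
        # Nadam update: Adam's bias-corrected moments, with a Nesterov
        # look-ahead folded into the first-moment estimate.
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g**2
        m_hat = (beta1 * m + (1 - beta1) * g) / (1 - beta1**t)
        v_hat = v / (1 - beta2**t)
        # Linearly decaying gain coefficient for iterative stability.
        gain = max(gain_min, gain0 * (1 - t / iters))
        u += gain * m_hat / (np.sqrt(v_hat) + eps)  # ascend the metric
    return u
```

In a real wavefront sensorless system, `J` would be the measured image-quality metric after applying the voltages to the deformable mirror; here any smooth function of `u` can stand in for it.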
