Abstract

To improve the wavefront distortion correction performance of the classical stochastic parallel gradient descent (SPGD) algorithm, an optimized algorithm based on Nesterov-accelerated adaptive momentum estimation is proposed. It adopts a modified second-order momentum and a linearly varying gain coefficient to improve iterative stability, and integrates the Nesterov momentum term with a modified Adam optimizer to further accelerate convergence, correct the gradient descent direction in a timely fashion, and avoid falling into local extrema. In addition, to demonstrate the algorithm's performance, a wavefront sensorless adaptive optics system model is established using a 6×6 element deformable mirror as the wavefront corrector. Simulation results show that, compared with the SPGD algorithm, the proposed algorithm converges faster, and its Strehl ratio after convergence is nearly 6.25 times that of the SPGD algorithm. The effectiveness and superiority of the proposed algorithm are also verified by comparison with two existing optimization algorithms.
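The update rule described above can be sketched as follows. This is a minimal illustration, not the paper's exact implementation: the gradient is assumed to come from an SPGD-style perturbation measurement, and all hyperparameter values (decay rates, gain schedule) are assumptions chosen for illustration.

```python
# Hypothetical sketch of one Nesterov-accelerated Adam (Nadam-style) SPGD
# iteration for deformable-mirror control voltages. All names and
# hyperparameters are illustrative assumptions, not values from the paper.
import numpy as np

def nadam_spgd_step(u, grad, m, v, t,
                    beta1=0.9, beta2=0.999, eps=1e-8,
                    gain0=0.1, gain_slope=-0.0002):
    """Update control voltages u given an SPGD gradient estimate `grad`.

    t is the 1-based iteration index; m, v are the running first- and
    second-order momentum estimates.
    """
    # Linearly varying gain coefficient (assumed linear-decay form).
    gain = max(gain0 + gain_slope * t, 0.0)
    # First- and (modified) second-order momentum estimates, Adam-style.
    m = beta1 * m + (1.0 - beta1) * grad
    v = beta2 * v + (1.0 - beta2) * grad ** 2
    # Bias correction.
    m_hat = m / (1.0 - beta1 ** t)
    v_hat = v / (1.0 - beta2 ** t)
    # Nesterov look-ahead folded into the first moment (Nadam form),
    # which corrects the descent direction ahead of the momentum step.
    m_nes = beta1 * m_hat + (1.0 - beta1) * grad / (1.0 - beta1 ** t)
    # Gradient-ascent step on the performance metric (e.g. Strehl ratio).
    u = u + gain * m_nes / (np.sqrt(v_hat) + eps)
    return u, m, v
```

In a wavefront sensorless loop, `grad` would be estimated by applying small random perturbations to the mirror voltages and measuring the resulting change in the image-quality metric; the decaying gain stabilizes the late iterations while the Nesterov term speeds up the early ones.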