Abstract

The paper proposes a modification to simultaneous perturbation stochastic approximation (SPSA) methods, based on a comparison of the first-order and second-order SPSA algorithms (1SPSA and 2SPSA) from the perspective of the loss-function Hessian. At finite iterations, the convergence rate depends on the conditioning of the loss-function Hessian: 2SPSA converges more slowly for a loss function with an ill-conditioned Hessian than for one with a well-conditioned Hessian, whereas the convergence rate of 1SPSA is less sensitive to the conditioning of the Hessian. The modified 2SPSA (M2SPSA) eliminates the error amplification caused by inverting an ill-conditioned Hessian at finite iterations, which leads to significant improvements in its convergence rate on problems with an ill-conditioned Hessian matrix. Asymptotically, the efficiency analysis shows that M2SPSA is also superior to 2SPSA in terms of its convergence-rate coefficients: for the same asymptotic convergence rate, the ratio of the mean squared error of M2SPSA to that of 2SPSA is always less than one, except for a perfectly conditioned Hessian.
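As background for the comparison the abstract describes, below is a minimal sketch of the basic first-order SPSA (1SPSA) recursion, which estimates the gradient from only two loss evaluations per iteration using a random simultaneous perturbation. The function names, gain-sequence constants, and test loss are illustrative choices for this sketch, not taken from the paper:

```python
import numpy as np

def spsa_gradient(loss, theta, c, rng):
    """Two-measurement SPSA gradient estimate with a Rademacher perturbation."""
    delta = rng.choice([-1.0, 1.0], size=theta.shape)  # random +/-1 directions
    # Simultaneous perturbation: only two loss evaluations, regardless of dimension.
    return (loss(theta + c * delta) - loss(theta - c * delta)) / (2.0 * c * delta)

def run_1spsa(loss, theta0, a=0.1, c=0.1, alpha=0.602, gamma=0.101,
              n_iter=1000, seed=0):
    """Basic 1SPSA recursion with standard decaying gain sequences.

    The gain exponents (0.602, 0.101) are commonly used illustrative values.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(n_iter):
        a_k = a / (k + 1) ** alpha   # step-size sequence
        c_k = c / (k + 1) ** gamma   # perturbation-size sequence
        theta = theta - a_k * spsa_gradient(loss, theta, c_k, rng)
    return theta
```

On a simple quadratic loss, the iterate drifts toward the minimizer even though no analytic gradient is ever supplied; the second-order variants discussed in the abstract additionally estimate (and invert) the Hessian, which is where conditioning becomes critical.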
