Abstract

In this paper, we propose a single-timescale stochastic quasi-Newton method for solving stochastic optimization problems whose objective function is a composition of two smooth functions and whose derivatives are not available. The algorithm maintains two approximation sequences to estimate the gradient of the composite objective function and the value of the inner function. The matrix correction parameters are given in a BFGS update form, which avoids the assumption that the Hessian matrix of the objective is positive definite. We prove the global convergence of the algorithm and establish its complexity for finding an approximate stationary point, i.e., a point at which the expected squared norm of the gradient is smaller than a given accuracy tolerance ϵ. Numerical results on a nonconvex binary classification problem using a support vector machine and a multiclass classification problem using neural networks are reported to demonstrate the effectiveness of the algorithm.
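
To make the high-level description concrete, the following is a minimal Python sketch of one iteration of such a single-timescale scheme for a composite objective F(x) = f(g(x)). The oracle names (sample_g, sample_Jg, sample_grad_f), the averaging weight beta, and the curvature safeguard on the BFGS update are illustrative assumptions, not the paper's exact method.

```python
import numpy as np

def sqn_step(x, u, v, H, alpha, beta,
             sample_g, sample_Jg, sample_grad_f):
    """One hedged-sketch iteration: refresh the running estimates of g(x)
    and of grad F(x) on the same timescale, then take a quasi-Newton step."""
    g_hat = sample_g(x)                    # stochastic sample of the inner function g(x)
    J_hat = sample_Jg(x)                   # stochastic sample of the Jacobian of g at x
    u_new = (1 - beta) * u + beta * g_hat  # running estimate of g(x_k)

    grad_hat = J_hat.T @ sample_grad_f(u_new)  # chain-rule gradient sample
    v_new = (1 - beta) * v + beta * grad_hat   # running estimate of grad F(x_k)

    x_new = x - alpha * H @ v_new              # quasi-Newton step

    # BFGS-style update of the inverse-Hessian estimate H, with a curvature
    # safeguard so H stays positive definite without assuming the Hessian of
    # the objective is positive definite (an illustrative stand-in for the
    # paper's matrix correction parameters).
    s = x_new - x
    y = v_new - v                              # change in the gradient estimate
    sy = s @ y
    if sy > 1e-8 * (s @ s):                    # skip the update if curvature is poor
        rho = 1.0 / sy
        I = np.eye(len(x))
        V = I - rho * np.outer(s, y)
        H = V @ H @ V.T + rho * np.outer(s, s)
    return x_new, u_new, v_new, H
```

In such a sketch, both running averages are driven by the same step-size sequence, which is what "single timescale" refers to; a two-timescale method would instead update the inner estimate u on a faster schedule than the iterate x.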
