Abstract

In this paper, we study stochastic composite problems in which the objective is the composition of an outer single-valued function and an inner vector-valued mapping, where the inner mapping is expressed as an expectation over random component mappings. We propose two algorithms that address the generality and possible singularities of this problem, and we bound their sample complexities for finding an ϵ-stationary point. The first algorithm is the prox-linear hybrid stochastic gradient algorithm, which achieves sample complexities of O(ϵ^{2τ−5/2}) and O(ϵ^{τ−3/2}) for the component mappings and their Jacobians, respectively, where τ ∈ [0, 1]. The second algorithm is the normalized proximal hybrid stochastic gradient algorithm, which exploits the special structure of the regularizer and achieves a sample complexity of O(ϵ^{2τ−4}) for both the component mappings and their Jacobians, where τ ∈ [5/4, 7/4]. Numerical experiments show that the two proposed algorithms are competitive with existing algorithms, and an application to real-life sparse portfolio selection problems also yields promising results.
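
For context, problems of this kind are usually written in the following form; this is a standard formulation from the stochastic composite optimization literature and reflects our own notational assumptions rather than the paper's exact setup:

\[
\min_{x \in \mathbb{R}^d} \; F(x) = f\big(g(x)\big) + r(x),
\qquad
g(x) = \mathbb{E}_{\xi}\big[g_{\xi}(x)\big],
\]

where f is the outer single-valued function, g is the inner vector-valued mapping given as an expectation over random component mappings g_ξ, and r is a (possibly nonsmooth) regularizer. A generic prox-linear step linearizes the inner mapping at the current iterate x_t using stochastic estimates ĝ_t ≈ g(x_t) and Ĵ_t ≈ g′(x_t):

\[
x_{t+1} \in \arg\min_{x} \; f\big(\hat{g}_t + \hat{J}_t (x - x_t)\big) + r(x) + \frac{1}{2\lambda}\,\|x - x_t\|^2 .
\]

The "hybrid stochastic gradient" estimators named in the abstract are presumably of the hybrid SARAH/SGD type from the variance-reduction literature, i.e. a convex combination of a recursive and an unbiased estimate,

\[
\hat{g}_t = \beta \big(\hat{g}_{t-1} + g_{\xi_t}(x_t) - g_{\xi_t}(x_{t-1})\big) + (1-\beta)\, g_{\zeta_t}(x_t),
\qquad \beta \in [0,1],
\]

with an analogous recursion for the Jacobian estimate Ĵ_t; the paper's actual updates and parameter choices may differ.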
