Abstract

We propose general non-accelerated [The results for non-accelerated methods first appeared in December 2020 in the preprint (A. Agafonov, D. Kamzolov, P. Dvurechensky, and A. Gasnikov, Inexact tensor methods and their application to stochastic convex optimization, preprint 2020. arXiv:2012.15636)] and accelerated tensor methods under inexact information on the derivatives of the objective, and analyse their convergence rates. Further, we provide conditions on the inexactness of each derivative that are sufficient for each algorithm to achieve a desired accuracy. As a corollary, we propose stochastic tensor methods for convex optimization and obtain sufficient mini-batch sizes for each derivative.
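To make the setting concrete, the following is a minimal sketch (not the authors' exact scheme) of a stochastic second-order tensor method: a cubic-regularized Newton step in which the gradient and Hessian are replaced by mini-batch estimates, as the abstract describes. The per-sample oracles `grad_i` and `hess_i`, the batch sizes `batch_g` and `batch_H`, and the inner model solver are all illustrative assumptions; the paper's analysis is what prescribes batch sizes large enough to meet the required per-derivative accuracy.

```python
import numpy as np

def stochastic_cubic_newton(grad_i, hess_i, x0, n, L,
                            batch_g, batch_H, iters=50, seed=0):
    """Sketch of an inexact (stochastic) second-order tensor step.

    grad_i(x, i), hess_i(x, i): hypothetical per-sample gradient/Hessian
    oracles for a finite-sum objective f(x) = (1/n) * sum_i f_i(x).
    L: regularization parameter (Lipschitz-type constant of the Hessian).
    """
    rng = np.random.default_rng(seed)
    x = x0.copy()
    for _ in range(iters):
        # Inexact derivatives: mini-batch averages stand in for the exact
        # gradient and Hessian; larger batches mean smaller inexactness.
        idx_g = rng.integers(n, size=batch_g)
        idx_H = rng.integers(n, size=batch_H)
        g = np.mean([grad_i(x, i) for i in idx_g], axis=0)
        H = np.mean([hess_i(x, i) for i in idx_H], axis=0)

        # Approximately minimize the cubic-regularized second-order model
        #   m(h) = g.h + 0.5 h.H.h + (L/6) ||h||^3
        # by plain gradient descent on h (a simple inner solver).
        h = np.zeros_like(x)
        for _ in range(100):
            model_grad = g + H @ h + 0.5 * L * np.linalg.norm(h) * h
            # Step size from an upper bound on the model's Hessian norm.
            step = 1.0 / (np.linalg.norm(H, 2) + L * np.linalg.norm(h) + 1e-12)
            h -= step * model_grad
        x = x + h
    return x
```

This corresponds to the tensor order p = 2; the methods in the paper cover higher-order Taylor models and accelerated variants under analogous inexactness conditions.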
