Abstract

In this paper, we study the local convergence of high-order Tensor Methods for solving convex optimization problems with a composite objective. We establish local superlinear convergence under the assumption that the smooth component of the objective is uniformly convex and has a Lipschitz-continuous high-order derivative. Convergence is established both in function value and in the norm of the minimal subgradient. Global complexity bounds for the Composite Tensor Method in the convex and uniformly convex cases are also discussed. Lastly, we show how the local convergence of the methods can be globalized using inexact proximal iterations.

Highlights

  • Motivation: In Nonlinear Optimization, it is natural to try to increase the performance of numerical methods by employing high-order oracles

  • In this paper, we study the local convergence of high-order Tensor Methods for solving convex optimization problems with a composite objective

  • Global complexity bounds for the Composite Tensor Method in the convex and uniformly convex cases are discussed

Introduction

Motivation. In Nonlinear Optimization, it is natural to try to increase the performance of numerical methods by employing high-order oracles. The main obstacle to this approach is the prohibitive complexity of the corresponding Taylor approximations, which are formed by high-order multidimensional polynomials that are difficult to store, handle, and minimize. If we go just one step above the …

The research results of this paper were obtained in the framework of ERC Advanced Grant 788368.
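To make the idea of a high-order step concrete, below is a minimal sketch of the simplest case beyond Newton's method: the cubically regularized Newton step (the p = 2 tensor step of Nesterov and Polyak), which minimizes the second-order Taylor model plus a cubic regularizer. The test function, the value of the regularization parameter M, and the bisection subsolver are illustrative assumptions, not the exact method or problems analyzed in the paper.

```python
import numpy as np

def cubic_newton_step(g, H, M):
    """One cubically regularized Newton step (the p = 2 tensor step):
    minimize  g @ h + 0.5 * h @ H @ h + (M / 6) * ||h||**3  over h.
    For a convex model the minimizer satisfies h = h(r) with r = ||h||,
    where h(r) = -(H + (M * r / 2) * I)^{-1} g, so we bisect on r."""
    I = np.eye(len(g))
    h_of = lambda r: -np.linalg.solve(H + 0.5 * M * r * I, g)
    lo, hi = 0.0, 1.0
    while np.linalg.norm(h_of(hi)) > hi:   # grow bracket until ||h(hi)|| <= hi
        hi *= 2.0
    for _ in range(100):                   # bisection on the scalar equation
        mid = 0.5 * (lo + hi)
        if np.linalg.norm(h_of(mid)) > mid:
            lo = mid
        else:
            hi = mid
    return h_of(hi)

# Illustrative smooth convex example (an assumption, not from the paper):
# f(x) = 0.25 * sum(x**4) + 0.5 * ||x - a||^2, so grad f(x) = x**3 + (x - a).
a = np.array([1.0, -2.0, 0.5])
grad = lambda x: x**3 + (x - a)
hess = lambda x: np.diag(3.0 * x**2) + np.eye(3)

x = np.zeros(3)
for _ in range(20):
    # M = 12.0 is an assumed local Lipschitz bound for the Hessian
    x = x + cubic_newton_step(grad(x), hess(x), M=12.0)
print(np.linalg.norm(grad(x)))  # gradient norm after 20 steps
```

Far from the solution the cubic term keeps the step bounded; near the solution the regularization radius r shrinks and the step approaches the pure Newton step, which is the mechanism behind the local superlinear rates studied in the paper.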

Corresponding author: Nikita Doikov
Sections

  • Local convergence

  • Global complexity bounds

  • Application to proximal methods
