Abstract

Convergence analysis of optimization methods is an important area of optimization theory. However, convergence alone is not a sufficient criterion for comparing the efficiency of different algorithms. We therefore need a further theory that measures the difficulty of optimization problems and the efficiency of optimization algorithms: the complexity analysis theory of optimization algorithms. This paper is divided into five parts. The first section introduces the basic setup of complexity analysis frameworks, gives the definition, methods, and examples of complexity analysis, and summarizes the complexity results. The second section presents complexity analysis for smooth optimization problems: we give upper and lower complexity bounds for different problem classes and introduce the convergence-analysis framework for the accelerated gradient method. The third section covers upper complexity bounds for nonsmooth optimization problems, presenting complexity analysis for the subgradient method, the mirror descent method, the center-of-gravity method, and the ellipsoid method. The fourth section introduces complexity analysis for the conditional gradient method, studies its upper and lower complexity bounds, and presents the framework of the accelerated conditional gradient method. The fifth section introduces complexity analysis of stochastic optimization algorithms and compares the confidence levels for convergence under convex and nonconvex settings.
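To make the flavor of these bounds concrete, here is a hedged illustration using the classical first-order oracle rates for an L-smooth convex objective f with minimizer x* (standard results from the literature, not bounds quoted from this paper):

% Standard smooth convex rates after k oracle calls (illustrative only;
% constants and exact statements vary by source and are omitted here).
\[
  f(x_k) - f(x^\ast) = \mathcal{O}\!\left(\frac{L\,\lVert x_0 - x^\ast\rVert^2}{k}\right)
  \ \text{(gradient descent)},
  \qquad
  f(x_k) - f(x^\ast) = \mathcal{O}\!\left(\frac{L\,\lVert x_0 - x^\ast\rVert^2}{k^2}\right)
  \ \text{(accelerated gradient)},
\]

and the accelerated rate matches the oracle lower bound \(\Omega\bigl(L\,\lVert x_0 - x^\ast\rVert^2 / k^2\bigr)\) for this problem class. This coincidence of upper and lower bounds is exactly what it means for a method to be optimal in the complexity-analysis sense discussed above.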
