Abstract

Authors: Chen, L.; Hu, X.; Wise, S. M.

Abstract: The full approximation storage (FAS) scheme is a widely used multigrid method for nonlinear problems. In this paper, a new framework to design and analyze FAS-like schemes for convex optimization problems is developed. The new method, the fast subspace descent (FASD) scheme, which generalizes classical FAS, can be recast as an inexact version of nonlinear multigrid methods based on space decomposition and subspace correction. The local problem in each subspace can be simplified to be linear, and one gradient descent iteration (with an appropriate step size) is enough to ensure global linear (geometric) convergence of FASD for convex optimization problems.
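To make the one-step subspace descent idea above concrete, here is a minimal sketch (not the authors' implementation). It performs one gradient descent step per subspace with step size 1/L_i from a local Lipschitz bound, on a hypothetical quadratic test energy E(u) = (1/2) u^T A u - b^T u; the prolongation matrices, the block decomposition, and all names are illustrative assumptions.

    import numpy as np

    def fasd_sweep(u, grad, prolongations, local_lipschitz):
        # One FASD-like sweep: visit each subspace in turn and take a
        # single gradient descent step on the restricted problem.
        for P, L in zip(prolongations, local_lipschitz):
            g = P.T @ grad(u)            # restricted gradient in subspace i
            u = u - (1.0 / L) * (P @ g)  # one descent step, step size 1/L_i
        return u

    # Hypothetical test energy E(u) = 0.5 u^T A u - b^T u (1D Laplacian)
    n = 8
    A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    b = np.ones(n)
    grad = lambda u: A @ u - b

    # Illustrative subspace decomposition: two coordinate blocks
    Ps = [np.eye(n)[:, :n // 2], np.eye(n)[:, n // 2:]]
    Ls = [np.linalg.eigvalsh(P.T @ A @ P).max() for P in Ps]  # local bounds

    u = np.zeros(n)
    for k in range(100):
        u = fasd_sweep(u, grad, Ps, Ls)
    print(np.linalg.norm(A @ u - b))  # residual decays geometrically

Replacing the single gradient step by an exact subspace solve recovers classical successive subspace correction; the point of FASD is that the inexact one-step version already converges linearly.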

Highlights

  • We emphasize that our work represents a theoretical advance for the convergence analysis of full approximation storage (FAS)-type schemes; the resulting method is algorithmically simpler, and even more flexible, than the original FAS. Both theoretically and numerically, each local nonlinear problem can be approximated by a linear problem, and the computational cost is reduced significantly

  • We aim to prove a linear reduction of the energy difference for one iteration of the subspace optimization (SSO) algorithm, i.e., an estimate of the form E(u^{k+1}) − E(u*) ≤ ρ (E(u^k) − E(u*)) with a uniform contraction factor ρ ∈ (0, 1), where u* is the minimizer (equation (10) in the paper)

  • The Lipschitz constant L used in the step size α_i can be replaced by a local Lipschitz constant for the scalar function f_i(α), for α ∈ (0, α_i^*), and popular line search algorithms can be used; a minimal backtracking sketch follows this list
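As a concrete reading of the last highlight, here is a minimal Armijo backtracking sketch for the scalar function f_i(α) = E(u + α d_i) along a subspace search direction d_i. It is a generic stand-in for the "popular line search algorithms" mentioned above; the parameters alpha0, beta, c and the example energy are illustrative assumptions, not the paper's algorithm.

    import numpy as np

    def backtracking_step(E, grad_u, u, d, alpha0=1.0, beta=0.5, c=1e-4):
        # Armijo backtracking along the descent direction d: shrink alpha
        # until f_i(alpha) = E(u + alpha d) shows sufficient decrease.
        slope = grad_u @ d  # f_i'(0); negative for a descent direction
        alpha = alpha0
        while E(u + alpha * d) > E(u) + c * alpha * slope:
            alpha *= beta
        return alpha

    # Example: a quadratic energy and a steepest descent direction
    A = np.array([[2.0, -1.0], [-1.0, 2.0]])
    E = lambda v: 0.5 * v @ (A @ v)
    u = np.array([1.0, 1.0])
    g = A @ u
    print(backtracking_step(E, g, u, -g))  # accepted step size

For smooth f_i the loop terminates after finitely many reductions, and the accepted α plays the role of the step size α_i above.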


Summary

INTRODUCTION

In [27], Tai and Xu considered unconstrained convex optimization problems and developed global and uniform convergence estimates for a class of subspace correction iterative methods. Their approach is based on an abstract space decomposition which is assumed to satisfy the so-called stable decomposition property and a strengthened Cauchy–Schwarz inequality. We emphasize that our work represents a theoretical advance for the convergence analysis of FAS-type schemes, and is algorithmically simpler, and even more flexible, than the original FAS. We show that, both theoretically and numerically, each local nonlinear problem can be approximated by a linear problem, and the computational cost is reduced significantly.
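For context, the two assumptions named above take the following form in the standard notation of space decomposition theory, written here for a symmetric bilinear form a(·,·) with energy norm ‖·‖_A (our notation; in the convex setting analogous conditions are phrased through the energy functional, and constants may differ from the paper's):

\[
  \text{(stable decomposition)}\quad
  \forall v \in V:\ \exists\, v = \sum_{i=1}^{J} v_i,\ v_i \in V_i,
  \quad \sum_{i=1}^{J} \|v_i\|_A^2 \le C_A\, \|v\|_A^2,
\]
\[
  \text{(strengthened Cauchy--Schwarz)}\quad
  |a(u_i, v_j)| \le \varepsilon_{ij}\, \|u_i\|_A\, \|v_j\|_A
  \quad \forall\, u_i \in V_i,\ v_j \in V_j,\ \varepsilon_{ij} \in [0,1].
\]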

PROBLEM AND ASSUMPTIONS
SUCCESSIVE SUBSPACE OPTIMIZATION METHODS
FAST SUBSPACE DESCENT METHOD WITH EXACT LINE SEARCH
FAST SUBSPACE DESCENT METHOD WITH APPROXIMATE LINE SEARCH
ORIGINAL FAS METHOD
APPLICATION AND NUMERICAL EXPERIMENTS