Abstract

In this paper we define new classes of globally convergent block-coordinate techniques for the unconstrained minimization of a continuously differentiable function. More specifically, we first describe conceptual models of decomposition algorithms based on the interconnection of elementary operations performed on the block components of the variable vector. Then we characterize the elementary operations defined through a suitable line search or the global minimization in a component subspace. Using these models, we establish new results on the convergence of the nonlinear Gauss–Seidel method and prove that this method with a two-block decomposition is globally convergent towards stationary points, even in the absence of convexity or uniqueness assumptions. In the general case of a nonconvex objective function and an arbitrary decomposition, we define new globally convergent line-search-based schemes that may also include partial global minimizations with respect to some components. Computational aspects are discussed and, in particular, an application to a learning problem in a Radial Basis Function neural network is illustrated.
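To illustrate the general idea behind the two-block nonlinear Gauss–Seidel scheme discussed in the abstract, the following is a minimal sketch, not the authors' specific globally convergent variant: it simply alternates a minimization over each block of variables while the other block is held fixed. The function names, the example objective, and the use of scipy.optimize.minimize as the inner solver are illustrative assumptions.

```python
# Illustrative sketch of a two-block nonlinear Gauss-Seidel iteration
# (assumed setup; not the paper's algorithm).
import numpy as np
from scipy.optimize import minimize


def two_block_gauss_seidel(f, x1, x2, max_iter=100, tol=1e-8):
    """Alternately minimize a smooth f(x1, x2) over each block of variables."""
    for _ in range(max_iter):
        x1_prev, x2_prev = x1.copy(), x2.copy()
        # Minimize over the first block with the second block fixed.
        x1 = minimize(lambda z: f(z, x2), x1).x
        # Minimize over the second block with the updated first block fixed.
        x2 = minimize(lambda z: f(x1, z), x2).x
        # Stop when neither block moves appreciably.
        step = np.concatenate([x1 - x1_prev, x2 - x2_prev])
        if np.linalg.norm(step) < tol:
            break
    return x1, x2


# Usage on a simple (hypothetical) nonconvex objective coupling the two blocks.
f = lambda a, b: (a[0] - 1.0) ** 2 + (a[0] * b[0] - 2.0) ** 2 + 0.1 * b[0] ** 4
x1_star, x2_star = two_block_gauss_seidel(f, np.array([0.0]), np.array([0.5]))
```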
