Abstract

Conjugate gradient methods are efficient for smooth optimization problems, but conjugate gradient based methods for solving a possibly nondifferentiable convex minimization problem are rare. In this paper, by making full use of the inherent properties of the Moreau-Yosida regularization and the descent property of a modified conjugate gradient method, we propose a modified Fletcher-Reeves-type method for nonsmooth convex minimization. Owing to its low storage requirements, it can be applied to large-scale nonsmooth convex minimization problems. The algorithm is globally convergent under mild conditions.

Highlights

  • Let f : Rn → R be a possibly nondifferentiable convex function and consider the unconstrained optimization problem min_{x∈Rn} f(x). (1) Associated with problem (1) is the problem min_{x∈Rn} F(x), (2) where F : Rn → R is the so-called Moreau-Yosida regularization of f, which is defined by F(x) = min_{z∈Rn} { f(z) + (1/(2λ)) ∥z − x∥² }, where λ is a positive parameter and ∥ · ∥ denotes the Euclidean norm

  • We propose a conjugate gradient based method for minimizing the Moreau-Yosida regularization F, with a line search on an approximate value of the function F instead of its exact value

  • We will focus on the MFR method, a descent conjugate gradient method proposed by Zhang, Zhou and Li [20] for solving unconstrained optimization problems
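The distinguishing feature of the MFR direction is that it satisfies the sufficient descent property g_kᵀd_k = −∥g_k∥² regardless of the line search. The sketch below illustrates this identity numerically with the commonly stated MFR update d_k = −θ_k g_k + β_k^{FR} d_{k−1}, where β_k^{FR} = ∥g_k∥²/∥g_{k−1}∥² and θ_k = d_{k−1}ᵀ(g_k − g_{k−1})/∥g_{k−1}∥²; the quadratic test function and the fixed step size are illustrative choices, not the paper's setting or its line search.

```python
import numpy as np

# Illustrative smooth test problem: f(x) = 0.5 x^T A x (not the paper's setting).
A = np.diag([1.0, 4.0, 9.0])
grad = lambda x: A @ x

x = np.array([1.0, 1.0, 1.0])
g = grad(x)
d = -g                       # first direction: steepest descent, so g0^T d0 = -||g0||^2
for k in range(5):
    # Fixed small step for illustration only; the paper uses a line search.
    x_new = x + 0.05 * d
    g_new = grad(x_new)
    beta = np.dot(g_new, g_new) / np.dot(g, g)      # Fletcher-Reeves parameter
    theta = np.dot(d, g_new - g) / np.dot(g, g)     # MFR scaling of the gradient term
    d_new = -theta * g_new + beta * d
    # Sufficient descent holds exactly (by induction): g_k^T d_k = -||g_k||^2.
    assert abs(np.dot(g_new, d_new) + np.dot(g_new, g_new)) < 1e-10
    x, g, d = x_new, g_new, d_new
```

Substituting the formulas for β_k^{FR} and θ_k into g_kᵀd_k shows that the two gradient cross terms cancel, leaving g_kᵀd_k = ∥g_k∥² (d_{k−1}ᵀg_{k−1})/∥g_{k−1}∥², which equals −∥g_k∥² by induction.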


Summary

Introduction

Let f : Rn → R be a possibly nondifferentiable convex function and consider the unconstrained optimization problem min_{x∈Rn} f(x). Its Moreau-Yosida regularization F has a Lipschitz continuous gradient [9]. These features motivate us to solve problem (1) through the Moreau-Yosida regularization when f is nondifferentiable. Conjugate gradient methods are well-established methods for smooth unconstrained optimization problems. We propose a conjugate gradient based method for minimizing the Moreau-Yosida regularization F, with a line search on an approximate value of the function F instead of its exact value. Yuan, Wei and Li [18] propose a modified Polak-Ribière-Polyak conjugate gradient algorithm for nonsmooth convex programs. That work and this paper share a common feature: both propose algorithms for problem (1) by means of the Moreau-Yosida regularization, and the search directions satisfy the sufficient descent property; however, they use different line search techniques. Throughout this paper, ⟨·, ·⟩ denotes the inner product of two vectors, and g(x) denotes the gradient of F(x)
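The smoothing step above can be made concrete with a small numerical check: for f(x) = ∥x∥₁, whose proximal operator is soft-thresholding, the Moreau-Yosida regularization F has the closed-form gradient g(x) = (x − prox_{λf}(x))/λ, which is Lipschitz continuous with constant 1/λ. The choice of f, λ and the test point below are illustrative, not taken from the paper.

```python
import numpy as np

lam = 0.5  # positive regularization parameter (illustrative choice)

def prox_l1(x, lam):
    # Proximal operator of the l1-norm: componentwise soft-thresholding.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def moreau_envelope(x, lam):
    # F(x) = min_z { f(z) + ||z - x||^2 / (2*lam) }; the minimizer is z* = prox(x).
    z = prox_l1(x, lam)
    return np.sum(np.abs(z)) + np.dot(z - x, z - x) / (2.0 * lam)

def grad_F(x, lam):
    # Closed-form gradient of the Moreau envelope: g(x) = (x - prox(x)) / lam.
    return (x - prox_l1(x, lam)) / lam

x = np.array([1.5, -0.2, 0.0])  # note f is nondifferentiable at the third component
g = grad_F(x, lam)

# Verify the closed-form gradient against central finite differences of F:
eps = 1e-6
for i in range(len(x)):
    e = np.zeros_like(x)
    e[i] = eps
    fd = (moreau_envelope(x + e, lam) - moreau_envelope(x - e, lam)) / (2 * eps)
    assert abs(fd - g[i]) < 1e-4
```

Even though f itself is nondifferentiable at the origin, the finite-difference check passes at the point with a zero component, illustrating why minimizing the smooth F in problem (2) is an attractive route to solving problem (1).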

Derivation of MFR Type Algorithm
Global Convergence of MFR Type Algorithm
Concluding remarks