Abstract

A number of practical problems in science and engineering can be converted into a system of nonlinear equations, so it is important to develop efficient methods for solving such systems. Owing to their good convergence properties and low storage requirements, conjugate gradient methods are considered among the most efficient for solving large-scale nonlinear equations. In this paper, a modified conjugate gradient method is proposed based on a projection technique and a suitable line search strategy. The proposed method is matrix-free, and its sequence of search directions satisfies the sufficient descent condition. Under the assumption that the underlying function is monotone and Lipschitz continuous, the global convergence of the proposed method is established. The method is applied to solve some benchmark monotone nonlinear equations and is also extended to solve ℓ₁-norm regularized problems for reconstructing a sparse signal in compressive sensing. Numerical comparison with some existing methods shows that the proposed method is competitive, efficient and promising.

Highlights

  • Let ℝⁿ be an n-dimensional Euclidean space with inner product ⟨·, ·⟩ and norm ‖·‖

  • We prove the global convergence of the algorithm using a line search strategy more general than those commonly utilized in the literature

  • A Hestenes–Stiefel-like derivative-free method with spectral parameter for nonlinear monotone equations has been proposed based on a suitable line search strategy and projection technique


Summary

Introduction

Let ℝⁿ be an n-dimensional Euclidean space with inner product ⟨·, ·⟩ and norm ‖·‖. Xiao and Zhu [15] combined the conjugate gradient method of Hager and Zhang [16] with the projection technique of Solodov and Svaiter [17] to solve constrained systems of monotone nonlinear equations. Their numerical experiments show that the method works well, and its convergence analysis was established under some reasonable assumptions: the sequence of search directions generated by their algorithm satisfies the sufficient descent property, and convergence holds under the assumption that the underlying function is Lipschitz continuous. For the sequence of iterates {xk} and search directions {dk} generated by Algorithm 1, there always exists a step-size αk satisfying the line search defined by (14) for any k ≥ 0. With respect to the numerical experiments performed, the proposed HSS algorithm can be regarded as more efficient than the CGD, PDY and MFRM methods.
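The projection framework of Solodov and Svaiter described above can be sketched as follows. This is a minimal illustration only: it uses a steepest-descent-like direction d = −F(x) instead of the paper's HS-type conjugate gradient direction, the backtracking line search is one common form (not necessarily the exact condition (14)), and the function `solve_monotone`, its parameter values, and the test function are all assumptions introduced here for illustration.

```python
import math

def solve_monotone(F, x0, sigma=1e-4, beta=1.0, rho=0.5, tol=1e-8, max_iter=1000):
    """Sketch of a derivative-free projection method for monotone equations.

    F  : callable list -> list, assumed monotone and Lipschitz continuous.
    x0 : initial point (list of floats).
    All names and parameter values are illustrative, not from the paper.
    """
    x = list(x0)
    for _ in range(max_iter):
        Fx = F(x)
        if math.sqrt(sum(v * v for v in Fx)) <= tol:
            return x
        # Steepest-descent-like direction; the paper uses an HS-type CG direction.
        d = [-v for v in Fx]
        norm_d2 = sum(v * v for v in d)
        # Backtracking line search: find alpha with -F(z)^T d >= sigma * alpha * ||d||^2.
        alpha = beta
        while True:
            z = [xi + alpha * di for xi, di in zip(x, d)]
            Fz = F(z)
            if -sum(fz * di for fz, di in zip(Fz, d)) >= sigma * alpha * norm_d2:
                break
            alpha *= rho
            if alpha < 1e-12:  # safeguard against an infinite loop
                break
        norm_Fz2 = sum(v * v for v in Fz)
        if norm_Fz2 <= tol * tol:
            return z
        # Hyperplane projection step: project x onto the hyperplane
        # {y : F(z)^T (y - z) = 0}, which separates x from the solution set.
        Fz_dot = sum(fz * (xi - zi) for fz, xi, zi in zip(Fz, x, z))
        x = [xi - (Fz_dot / norm_Fz2) * fz for xi, fz in zip(x, Fz)]
    return x
```

The projection step is what makes the scheme globally convergent for monotone F: by monotonicity, any solution lies on the far side of the separating hyperplane, so each projected iterate is strictly closer to the solution set than its predecessor.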

Second Experiment on Signal Processing
Conclusions
