Abstract
Many practical problems in science and engineering can be converted into a system of nonlinear equations, so it is imperative to develop efficient methods for solving such equations. Owing to their good convergence properties and low storage requirements, conjugate gradient methods are considered among the most efficient for solving large-scale nonlinear equations. In this paper, a modified conjugate gradient method is proposed based on a projection technique and a suitable line search strategy. The proposed method is matrix-free, and its sequence of search directions satisfies the sufficient descent condition. Under the assumption that the underlying function is monotone and Lipschitz continuous, the global convergence of the proposed method is established. The method is applied to solve some benchmark monotone nonlinear equations and is also extended to ℓ1-norm regularized problems for reconstructing a sparse signal in compressive sensing. Numerical comparison with some existing methods shows that the proposed method is competitive, efficient and promising.
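For context on the compressive-sensing application, the standard device in this literature is to split x = u − v with u, v ≥ 0, which turns the ℓ1-norm regularized least-squares problem min ½‖Ax − b‖² + τ‖x‖₁ into a bound-constrained quadratic program whose optimality conditions can be written as a monotone, Lipschitz continuous equation. The sketch below, in Python/NumPy, is illustrative only; the helper name build_l1_system and the min-map form of F are assumptions based on this standard reformulation, not taken from the paper itself.

import numpy as np

def build_l1_system(A, b, tau):
    """Hypothetical helper: recast min 0.5*||Ax - b||^2 + tau*||x||_1
    as a monotone equation F(z) = min(z, Hz + c) = 0 via the split
    x = u - v, z = (u, v) with u, v >= 0."""
    AtA = A.T @ A
    Atb = A.T @ b
    # H is positive semidefinite by construction, which is what makes
    # the resulting mapping F monotone.
    H = np.block([[AtA, -AtA], [-AtA, AtA]])
    c = tau * np.ones(2 * A.shape[1]) + np.concatenate([-Atb, Atb])
    return lambda z: np.minimum(z, H @ z + c)

# A root z* = (u*, v*) of F recovers the sparse signal as x* = u* - v*.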
Highlights
We prove the global convergence of the algorithm under a line search strategy that is more general than those commonly utilized in the literature.
A Hestenes–Stiefel-like derivative-free method with a spectral parameter for nonlinear monotone equations is proposed, based on a suitable line search strategy and a projection technique.
Summary
Let ℝⁿ be the n-dimensional Euclidean space with inner product ⟨·, ·⟩ and norm ‖·‖. Xiao and Zhu [15] combined the conjugate gradient method of Hager and Zhang [16] with the projection technique of Solodov and Svaiter [17] to solve constrained systems of monotone nonlinear equations. Their numerical experiments show that the method works well, and its convergence analysis was established under some reasonable assumptions. The sequence of search directions generated by the proposed algorithm satisfies the sufficient descent property, and under the assumption that the underlying function is Lipschitz continuous, the convergence analysis of the method is established. For the sequence of iterates {xk} and search directions {dk} generated by Algorithm 1, there always exists a step size αk satisfying the line search defined by (14) for any k ≥ 0. Based on the numerical experiments performed, the proposed HSS algorithm can be regarded as more efficient than the CGD, PDY and MFRM methods.
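To make the projection mechanics concrete, here is a minimal sketch of the generic derivative-free projection framework in the style of Solodov and Svaiter [17] that this family of methods shares. The specific spectral HS-like direction and the exact line search (14) are in the paper; the direction update, line-search test and constants below are illustrative stand-ins, not the paper's formulas.

import numpy as np

def hyperplane_projection_solver(F, x0, sigma=1e-4, rho=0.5,
                                 tol=1e-6, max_iter=1000):
    """Generic derivative-free projection framework for monotone F.
    Direction update and line-search test are illustrative placeholders."""
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    d = -Fx                                   # first direction: d0 = -F(x0)
    for _ in range(max_iter):
        if np.linalg.norm(Fx) <= tol:
            break
        # Backtracking line search: accept alpha once
        #   -F(x + alpha*d)^T d >= sigma * alpha * ||F(x + alpha*d)|| * ||d||^2
        # (a common stand-in for the paper's condition (14)).
        alpha = 1.0
        for _ in range(60):                   # cap backtracking in the sketch
            z = x + alpha * d
            Fz = F(z)
            if -Fz @ d >= sigma * alpha * np.linalg.norm(Fz) * (d @ d):
                break
            alpha *= rho
        if np.linalg.norm(Fz) <= tol:         # z already solves F(z) = 0
            return z
        # Project x onto the hyperplane {y : F(z)^T (y - z) = 0}, which
        # separates x from the solution set when F is monotone.
        x_new = x - ((Fz @ (x - z)) / (Fz @ Fz)) * Fz
        Fx_new = F(x_new)
        # Illustrative Hestenes-Stiefel-like update (placeholder for the
        # paper's spectral HS direction):
        y = Fx_new - Fx
        den = d @ y
        beta = (Fx_new @ y) / den if abs(den) > 1e-12 else 0.0
        d = -Fx_new + beta * d
        x, Fx = x_new, Fx_new
    return x

The key property driving the global convergence analysis in this framework is that each hyperplane projection brings the iterate no farther from any solution, so the distance from the iterates to the solution set is non-increasing; this keeps the sequence bounded and is the standard engine behind proofs of the kind sketched in the summary above.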