Abstract
Background/Objectives: Conjugate Gradient (CG) methods are well-known iterative methods used for finding solutions to systems of nonlinear equations. There is a need to address the jamming phenomenon that affects the current class of these methods. Methods/Statistical Analysis: To address this shortcoming, we modify the denominator of the CG method of Yao et al., which is known to generate descent directions for objective functions, by proposing an entirely different CG coefficient that can switch away from jamming, should it occur, through the imposition of certain parameters, thereby guaranteeing global convergence. Findings: The proposed CG formula performs better than the classical methods as well as that of Yao et al. The convergence analysis of the proposed formula is established under the Wolfe line search conditions. Benchmark problems from the CUTE collection are used as the basis for comparing the strength of the proposed formula against other CG formulas. The effectiveness and efficiency of the proposed formula are clearly demonstrated using the performance profile of Dolan and Moré, one of the most widely accepted techniques for comparing methods. Application: Mathematicians and engineers interested in solving large-scale nonlinear equations can apply the method for global optimization, obtaining the best possible solutions to the given problems.
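For readers unfamiliar with the general framework referenced above, the sketch below illustrates a standard nonlinear CG iteration under Wolfe line search conditions, using SciPy's line_search routine. The coefficient beta_k shown is the classical Fletcher-Reeves formula, used purely as a placeholder: the coefficient proposed in the paper is not reproduced in this abstract, and the function name nonlinear_cg and the Rosenbrock test problem are illustrative assumptions, not part of the reported experiments.

```python
import numpy as np
from scipy.optimize import line_search

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Generic nonlinear CG loop under Wolfe line search.

    Illustrative sketch only: the beta below is the classical
    Fletcher-Reeves coefficient, not the coefficient proposed in the paper.
    """
    x = x0.astype(float)
    g = grad(x)
    d = -g                          # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # SciPy's line_search enforces the standard Wolfe conditions
        alpha, *_ = line_search(f, grad, x, d, gfk=g)
        if alpha is None:           # line search failed; restart along -g
            d = -g
            alpha, *_ = line_search(f, grad, x, d, gfk=g)
            if alpha is None:
                break
        x = x + alpha * d
        g_new = grad(x)
        # Fletcher-Reeves coefficient as a placeholder for beta_k
        beta = g_new.dot(g_new) / g.dot(g)
        d = -g_new + beta * d
        g = g_new
    return x

if __name__ == "__main__":
    # Example on the Rosenbrock function, a standard unconstrained test problem
    from scipy.optimize import rosen, rosen_der
    x_star = nonlinear_cg(rosen, rosen_der, np.array([-1.2, 1.0]))
    print(x_star)                   # expected to approach the minimiser [1., 1.]
```

The restart along the steepest-descent direction when the line search fails mirrors, in a simplified way, the idea of switching the search direction to escape jamming.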