Reducing Impulse Noise in Images Using an Improved Formula Conjugate Gradient Method

Abstract

Conjugate gradient approaches place great weight on the choice of conjugacy formula. In this paper, a novel conjugate coefficient for the conjugate gradient technique is derived from a quadratic model and a conjugacy condition, and is applied to image restoration problems. The algorithms described in this study exhibit the essential descent property and are globally convergent. Numerical experiments demonstrate that the new method improves substantially on, and outperforms, the conventional FR conjugate gradient method.
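For orientation, a minimal nonlinear CG loop with the classical Fletcher–Reeves coefficient, the baseline the abstract compares against, might look as follows. This is an illustrative sketch only; the paper's new coefficient is not reproduced here, and the line search and test problem are assumptions.

```python
import numpy as np

def cg_fr(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the classical Fletcher-Reeves coefficient
    beta_k = ||g_{k+1}||^2 / ||g_k||^2 (the baseline the abstract
    compares against) and a backtracking Armijo line search."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                  # start with steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        t = 1.0
        while f(x + t * d) > f(x) + 1e-4 * t * (g @ d) and t > 1e-12:
            t *= 0.5                        # Armijo backtracking
        x_new = x + t * d
        g_new = grad(x_new)
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves coefficient
        d = -g_new + beta * d
        if g_new @ d >= 0:                  # safeguard: restart if not descent
            d = -g_new
        x, g = x_new, g_new
    return x

# usage: minimize the convex quadratic f(x) = x^T A x / 2 - b^T x
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = cg_fr(f, grad, np.zeros(2))        # converges to A^{-1} b
```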

Similar Papers
  • Research Article
  • 10.29020/nybg.ejpam.v16i3.4849
Image Impulse Noise Reduction Using a Conjugate Gradient of Alternative Parameter
  • Jul 30, 2023
  • European Journal of Pure and Applied Mathematics
  • Hawraz N Jabbar + 2 more

Conjugate gradient approaches emphasise the conjugate formula. This study derives a new conjugate coefficient for the conjugate gradient approach, using Perry's conjugacy condition and a quadratic model, and applies it to image restoration. The algorithms are globally convergent and satisfy the descent property. In numerical testing the new technique performed better, outperforming the FR method.
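For reference, Perry's conjugacy condition mentioned above strengthens the classical condition d_{k+1}ᵀy_k = 0 by incorporating the step; this is the standard statement, not this paper's specific derivation:

```latex
d_{k+1}^{\top} y_k = -\, g_{k+1}^{\top} s_k,
\qquad y_k = g_{k+1} - g_k, \quad s_k = x_{k+1} - x_k .
```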

  • Research Article
  • Cited by 14
  • 10.1080/10556788.2014.1001511
A new class of efficient and globally convergent conjugate gradient methods in the Dai–Liao family
  • Mar 23, 2015
  • Optimization Methods and Software
  • M Reza Peyghami + 2 more

In this paper, we propose a new conjugate gradient (CG) method which belongs to the CG methods of the Dai–Liao family [New conjugacy conditions and related nonlinear conjugate gradient methods, Appl. Math. Optim. 43 (2001), pp. 87–101]. Babaie-Kafaki et al. [Two new conjugate gradient methods based on modified secant equations, J. Comput. Appl. Math. 234 (2010), pp. 1374–1386] modified Yabe and Takano's CG approach [Global convergence properties of nonlinear conjugate gradient methods with modified secant condition, Comput. Optim. Appl. 28 (2004), pp. 203–225] and obtained appealing results in theory and practice. Here, we introduce an efficient updating rule for the parameters of Yabe and Takano's CG algorithm. Under some standard assumptions, we establish the global convergence property of the newly suggested algorithm on uniformly convex and general functions. Numerical results on some test problems from the CUTEr collection show the superiority of the proposed method over some existing CG methods in practice.
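As context for the family named above: Dai and Liao generalize Perry's conjugacy condition by a nonnegative parameter t, and the resulting coefficient for the direction d_{k+1} = -g_{k+1} + β_k d_k is (standard formulas, stated here for orientation):

```latex
d_{k+1}^{\top} y_k = -\, t \, g_{k+1}^{\top} s_k
\quad\Longrightarrow\quad
\beta_k^{\mathrm{DL}}
  = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}
  - t \, \frac{g_{k+1}^{\top} s_k}{d_k^{\top} y_k},
\qquad t \ge 0 .
```

Setting t = 1 recovers Perry's condition, and t = 0 the classical conjugacy condition.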

  • Research Article
  • 10.1080/02522667.2022.2122199
Enriched formulas to conjugate gradient method for removing impulse noise images
  • Nov 17, 2022
  • Journal of Information and Optimization Sciences
  • Basim A Hassan + 1 more

The conjugacy formula is usually the focal point of conjugate gradient techniques. In this paper, Perry's conjugacy condition and a quadratic model are used to derive a new conjugate coefficient for the conjugate gradient technique, which is applied to image restoration problems. The algorithms are globally convergent and have the required descent property. Numerical testing shows substantial improvement: the novel conjugate gradient approach outperforms the traditional FR conjugate gradient method.

  • Conference Article
  • 10.1109/icdsic56987.2022.10076040
Robust Parameters for Conjugate Gradient Method in Unconstrained Optimization
  • Nov 1, 2022
  • Basim A Hassan + 2 more

The conjugate gradient approach, used to solve unconstrained optimization problems, employs the quadratic model of the objective function to generate a new conjugate coefficient. The algorithms are globally convergent and have the necessary descent property. In numerical testing, the new approach has proven quite effective: the results show that the novel conjugate gradient approach beats the conventional FR conjugate gradient method.

  • Research Article
  • 10.62054/ijdm/0101.05
A Dai-Liao Hybrid PRP and DY Schemes for Unconstrained Optimization
  • Mar 20, 2024
  • International Journal of Development Mathematics (IJDM)
  • Huzaifa A. Babando + 3 more

This article presents a new conjugate gradient (CG) method that requires only first-order derivatives, overcoming the slow convergence associated with the steepest descent method without requiring the computation of second-order derivatives, as needed in the Newton method. The CG update parameter is derived from the extended conjugacy condition as a convex combination of the Polak–Ribière–Polyak (PRP) and Dai–Yuan (DY) algorithms by employing the optimal choice of the modulating parameter 't'. The scheme converges globally under the Wolfe line search and adopts an inexact line search to obtain a step size that generates a descent direction, without requiring exact computation of the step size. Numerical computations show that the algorithm is robust and efficient in terms of number of iterations and CPU time. Keywords: Conjugate gradient method, Descent property, Dai–Liao conjugacy condition, Global convergence, Numerical methods
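The convex-combination idea above can be sketched directly: compute the PRP and DY coefficients and blend them with a weight θ ∈ [0, 1]. The abstract's optimal choice of the modulating parameter is not specified, so θ is left as an input here; this is an illustrative sketch, not the authors' exact rule.

```python
import numpy as np

def beta_hybrid(g_new, g_old, d_old, theta):
    """Convex combination of the PRP and DY conjugate gradient
    coefficients: beta = (1 - theta) * beta_PRP + theta * beta_DY."""
    y = g_new - g_old                        # gradient difference y_k
    beta_prp = (g_new @ y) / (g_old @ g_old) # Polak-Ribiere-Polyak
    beta_dy = (g_new @ g_new) / (d_old @ y)  # Dai-Yuan
    return (1.0 - theta) * beta_prp + theta * beta_dy

# usage: theta = 0 gives pure PRP, theta = 1 gives pure DY
g_old = np.array([1.0, 0.0])
g_new = np.array([1.0, 1.0])
d_old = np.array([-1.0, -1.0])
b0 = beta_hybrid(g_new, g_old, d_old, 0.0)   # PRP value: 1.0
b1 = beta_hybrid(g_new, g_old, d_old, 1.0)   # DY value: -2.0
```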

  • Research Article
  • Cited by 17
  • 10.1080/00207160.2013.862236
New three-term conjugate gradient method with guaranteed global convergence
  • Jan 17, 2014
  • International Journal of Computer Mathematics
  • J.K Liu + 1 more

The conjugate gradient method is one of the most effective methods for solving unconstrained optimization problems. In this paper, we develop a new three-term conjugate gradient (TTCG) method by applying the Powell symmetrical technique to the Hestenes–Stiefel method. The proposed method satisfies both the sufficient descent property and the conjugacy condition, neither of which relies on any line search. Under the standard Wolfe line search, the global convergence of the proposed method is also established. Numerical results on a classical set of test problems show that the proposed method is very effective compared with other TTCG methods.
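The authors' exact coefficients are in the paper; as a sketch, a three-term direction built on the Hestenes–Stiefel coefficient has the generic shape (not the authors' precise formula):

```latex
d_{k+1} = -\, g_{k+1} + \beta_k^{\mathrm{HS}} d_k - \gamma_k \, y_k,
\qquad
\beta_k^{\mathrm{HS}} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k},
```

where the third term with weight γ_k is chosen so that the direction satisfies sufficient descent independently of the line search.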

  • Research Article
  • Cited by 1
  • 10.1080/02522667.2022.2103300
Some new conjugate gradient methods for solving unconstrained optimization problems
  • May 19, 2022
  • Journal of Information and Optimization Sciences
  • Basim A Hassan + 2 more

Conjugate gradient algorithms come in a wide range of flavors, and conjugate gradient techniques primarily concentrate on the conjugate coefficient. We introduce a novel conjugate gradient approach that computes the parameter using Newton updates. In addition, we demonstrate that our conjugate gradient algorithms are globally convergent and possess the descent property. For the specified test problems in [1], the performance profiles reveal that the novel conjugate gradient approach is effective and efficient.

  • Research Article
  • Cited by 1
  • 10.15587/1729-4061.2022.254017
A new modified HS algorithm with strong Powell-Wolfe line search for unconstrained optimization
  • Apr 28, 2022
  • Eastern-European Journal of Enterprise Technologies
  • Ghada Moayid Al-Naemi

Optimization is now considered a branch of computational science. This ethos seeks to answer the question "what is best?" by looking at problems where the quality of any answer can be expressed numerically. One of the most well-known methods for solving nonlinear, unconstrained optimization problems is the conjugate gradient (CG) method. The Hestenes and Stiefel (HS-CG) formula is one of the century's oldest and most effective formulas. When using an exact line search, the HS method achieves global convergence; however, this is not guaranteed when using an inexact line search (ILS). Furthermore, the HS method does not always satisfy the descent property. The goal of this work is to create a new (modified) formula by reformulating the classic HS-CG parameter and adding a new term to the classic HS-CG formula. It is critical that the proposed method generates a sufficient descent property (SDP) search direction under the strong Wolfe–Powell line search (sWPLS) at every iteration, and that the global convergence property (GCP) can be guaranteed for general non-convex functions. Using the inexact sWPLS, the modified HS-CG (mHS-CG) method has the SDP property regardless of line search type and guarantees the GCP. When using the sWPLS, the modified formula has the advantage of keeping the modified scalar non-negative. This paper is significant in that it quantifies how much better the new modification of HS performs compared to standard HS methods. As a result, numerical experiments comparing the mHS-CG method under the sWPLS with the standard HS method show that the CG method with the mHS-CG conjugate parameter is more robust and effective than the CG method without it.
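The strong Wolfe–Powell conditions invoked throughout these abstracts require the step size α_k to satisfy, for constants 0 < c₁ < c₂ < 1:

```latex
f(x_k + \alpha_k d_k) \le f(x_k) + c_1 \, \alpha_k \, g_k^{\top} d_k,
\qquad
\bigl| g(x_k + \alpha_k d_k)^{\top} d_k \bigr| \le c_2 \, \bigl| g_k^{\top} d_k \bigr| .
```

The first inequality enforces sufficient decrease (Armijo); the second bounds the directional derivative at the new point, ruling out steps that are too short or too long.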

  • Research Article
  • Cited by 8
  • 10.13189/ms.2021.090103
The Performance Analysis of a New Modification of Conjugate Gradient Parameter for Unconstrained Optimization Models
  • Jan 1, 2021
  • Mathematics and Statistics
  • I M Sulaiman + 4 more

The Conjugate Gradient (CG) method is the most prominent iterative mathematical technique for the optimization of both linear and non-linear systems, owing to its simplicity, low memory requirements, low computational cost, and global convergence properties. However, some of the classical CG methods have drawbacks, including weak global convergence and poor numerical performance both in number of iterations and CPU time. To overcome these drawbacks, researchers have proposed new variants of the CG parameters with efficient numerical results and nice convergence properties. Variants of the CG method include the scaled CG method, hybrid CG method, spectral CG method, three-term CG method, and many more. The hybrid conjugate gradient algorithm is among the most efficient variants in this class. Interesting features of the hybrid modifications include inheriting the nice convergence properties and efficient numerical performance of existing CG methods. In this paper, we propose a new hybrid CG algorithm that inherits the features of the Rivaie et al. (RMIL*) and Dai (RMIL+) conjugate gradient methods. The proposed algorithm generates a descent direction under the strong Wolfe line search conditions. Preliminary results on some benchmark problems show that the proposed method is efficient and promising.
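For orientation, the RMIL coefficient that the hybridized methods above build on is usually written as (standard form; the hybrid rule itself is in the paper):

```latex
\beta_k^{\mathrm{RMIL}}
  = \frac{g_{k+1}^{\top}\,(g_{k+1} - g_k)}{\lVert d_k \rVert^2},
```

with the RMIL+ variant further restricting the coefficient so that it stays well-behaved and the global convergence argument goes through.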

  • Research Article
  • Cited by 5
  • 10.1016/j.apnum.2024.04.014
Another modified version of RMIL conjugate gradient method
  • May 3, 2024
  • Applied Numerical Mathematics
  • Osman Omer Osman Yousif + 1 more


  • Research Article
  • Cited by 939
  • 10.1137/1011036
Convergence Conditions for Ascent Methods
  • Apr 1, 1969
  • SIAM Review
  • Philip Wolfe


  • Research Article
  • 10.31926/but.mif.2025.5.67.2.14
A new hybrid conjugate gradient method as a convex combination methods
  • Jun 5, 2025
  • Bulletin of the Transilvania University of Brasov. Series III: Mathematics and Computer Science
  • M Abdelhamid + 2 more

The conjugate gradient (CG) method is a widely employed algorithm for solving large-scale unconstrained optimization problems due to its fast convergence and efficient memory usage. In this paper, we suggest a new hybrid nonlinear conjugate gradient method, in which the conjugate gradient coefficient βk is a convex combination of βkNPRP and βkDY. The parameter θk is computed in such a way that the conjugacy condition is satisfied. Under the strong Wolfe line search, the descent property and global convergence of the new hybrid method are proved. The numerical results also show that our method is robust and efficient.

  • Research Article
  • Cited by 22
  • 10.1007/s11075-014-9845-9
A new three-term conjugate gradient algorithm for unconstrained optimization
  • Apr 12, 2014
  • Numerical Algorithms
  • Neculai Andrei

A new three-term conjugate gradient algorithm which satisfies both the descent condition and the conjugacy condition is presented. The algorithm is obtained by minimizing the one-parameter quadratic model of the objective function in which the symmetrical approximation of the Hessian matrix satisfies the general quasi-Newton equation. The search direction is obtained by symmetrization of the iteration matrix corresponding to the solution of the quadratic model minimization. Using the general quasi-Newton equation, the search direction includes a parameter which is determined by minimizing the condition number of the iteration matrix. It is proved that this direction satisfies both the conjugacy and the descent condition. The new approximation of the minimum is obtained by the general Wolfe line search, using a now-standard acceleration technique. Under standard assumptions, the global convergence of the algorithm is proved for uniformly convex functions. Numerical experiments using 800 large-scale unconstrained optimization test problems show that minimizing the condition number of the iteration matrix leads to a value of the parameter in the search direction able to define a competitive three-term conjugate gradient algorithm. Numerical comparisons of this variant of the algorithm with known conjugate gradient algorithms ASCALCG, CONMIN, TTCG and THREECG, as well as the limited-memory quasi-Newton algorithm LBFGS (m = 5) and the truncated Newton method TN, show that our algorithm is indeed more efficient and more robust.

  • Research Article
  • Cited by 1
  • 10.11591/ijeecs.v28.i2.pp1184-1191
Global convergence of a modified RMIL+ nonlinear conjugate gradient method with strong wolfe
  • Nov 1, 2022
  • Indonesian Journal of Electrical Engineering and Computer Science
  • Abdelrhaman Abashar + 3 more

Nonlinear conjugate gradient (CG) methods are extensively used as an important technique for addressing large-scale unconstrained optimization problems which arise in many aspects of science, engineering, and economics. That is due to their simplicity, convergence properties, and low memory requirements. To generate a new approximate solution in each iteration, CG methods are usually implemented under the strong Wolfe line search. For good performance, many studies have been carried out to modify well-known CG methods. In this paper, we modified one of the CG methods, RMIL+, in order to obtain a new CG method possessing the sufficient descent property and global convergence under the strong Wolfe line search. The numerical results demonstrate that the suggested method outperforms other CG methods.

  • Research Article
  • 10.13189/ms.2022.100202
A New Algorithm for Spectral Conjugate Gradient in Nonlinear Optimization
  • Mar 1, 2022
  • Mathematics and Statistics
  • Ahmed Anwer Mustafa

Nonlinear conjugate gradient (CG) algorithms have been used to solve large-scale unconstrained optimization problems. Because of their minimal memory needs and global convergence qualities, they are widely used in a variety of fields, and the approach has lately undergone many investigations and modifications to enhance it. The conjugate gradient is incredibly significant in our daily lives: whatever we do, we strive for the best outcome, such as the highest profit, the lowest loss, the shortest road, or the shortest time, which are referred to as minima and maxima in mathematics. For a multidimensional unbounded objective function, the spectral conjugate gradient (SCG) approach is a strong tool. In this study, we describe a new SCG technique whose performance is quantified. Based on standard assumptions, we established the descent condition, sufficient descent theorem, conjugacy condition, and global convergence criteria using a strong Wolfe–Powell line search. Numerical data and graphs were constructed using benchmark functions, commonly used classical test functions, to demonstrate the efficacy of the recommended approach. According to the numerical statistics, the suggested strategy is more efficient than some current techniques. In addition, we show how the new method may be utilized to improve solutions and outcomes.
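A spectral CG method augments the usual CG direction with a spectral scaling parameter θ; the generic direction is (generic form, not this paper's specific parameters):

```latex
d_{k+1} = -\, \theta_{k+1} \, g_{k+1} + \beta_k \, d_k,
\qquad
\theta_{k+1} = \frac{s_k^{\top} s_k}{s_k^{\top} y_k}
\ \text{(a common Barzilai--Borwein-type choice)} .
```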
