Abstract

This paper considers the fixed point problem for a nonexpansive mapping on a real Hilbert space and proposes novel line search fixed point algorithms to accelerate the search. The termination conditions for the line search are based on the well-known Wolfe conditions that are used to ensure the convergence and stability of unconstrained optimization algorithms. The directions to search for fixed points are generated by using the ideas of the steepest descent direction and conventional nonlinear conjugate gradient directions for unconstrained optimization. We perform convergence as well as convergence rate analyses on the algorithms for solving the fixed point problem under certain assumptions. The main contribution of this paper is to make a concrete response to an issue of constrained smooth convex optimization; that is, whether or not we can devise nonlinear conjugate gradient algorithms to solve constrained smooth convex optimization problems. We show that the proposed fixed point algorithms include ones with nonlinear conjugate gradient directions which can solve constrained smooth convex optimization problems. To illustrate the practicality of the algorithms, we apply them to concrete constrained smooth convex optimization problems, such as constrained quadratic programming problems and generalized convex feasibility problems, and numerically compare them with previous algorithms based on the Krasnosel’skiĭ-Mann fixed point algorithm. The results show that the proposed algorithms dramatically reduce the running time and iterations needed to find optimal solutions to the concrete optimization problems compared with the previous algorithms.
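To make the abstract's idea concrete, the following Python snippet is a minimal, hypothetical sketch, not the paper's exact method: it treats the residual T(x) − x as a steepest-descent-like search direction, combines it with the previous direction via a Fletcher-Reeves-style coefficient, and accepts a step size by backtracking until a sufficient-decrease test (the first Wolfe condition) holds on the merit function f(x) = ½‖T(x) − x‖². The function name, the merit function, and the use of −r as a gradient surrogate are all assumptions for illustration; the paper's Wolfe-type conditions and convergence analysis are more refined.

```python
import numpy as np

def cg_fixed_point_sketch(T, x0, c1=1e-4, shrink=0.5, max_iter=500, tol=1e-8):
    """Hypothetical CG-like fixed point search for a nonexpansive map T.

    The residual r(x) = T(x) - x plays the role of the steepest descent
    direction; successive directions are combined with a Fletcher-Reeves-
    style coefficient. Steps are accepted via a sufficient-decrease (first
    Wolfe) test on f(x) = 0.5 * ||T(x) - x||^2, using -r as a cheap
    surrogate for the gradient of f (the true gradient would require the
    Jacobian of T). This is a sketch, not the paper's algorithm.
    """
    x = np.asarray(x0, dtype=float)
    r = T(x) - x
    d = r.copy()
    for _ in range(max_iter):
        if np.linalg.norm(r) < tol:
            break
        f = 0.5 * r.dot(r)
        alpha = 1.0
        while True:  # backtracking until sufficient decrease (or alpha floor)
            x_new = x + alpha * d
            r_new = T(x_new) - x_new
            if 0.5 * r_new.dot(r_new) <= f - c1 * alpha * r.dot(d) or alpha < 1e-12:
                break
            alpha *= shrink
        beta = r_new.dot(r_new) / max(r.dot(r), 1e-300)  # Fletcher-Reeves-style
        d = r_new + beta * d
        x, r = x_new, r_new
    return x
```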

Highlights

  • Problems of finding the zeros of monotone operators [ ] can be formulated as fixed point problems for nonexpansive mappings

  • Consider the following fixed point problem: Find x ∈ Fix(T) := {x ∈ H : T(x) = x}, where H stands for a real Hilbert space with inner product ⟨·, ·⟩ and its induced norm ‖·‖, T is a nonexpansive mapping from H into itself (i.e., ‖T(x) − T(y)‖ ≤ ‖x − y‖ for all x, y ∈ H), and one assumes Fix(T) ≠ ∅

  • This paper focuses on the Krasnosel’skiĭ-Mann algorithm, which has practical applications, such as analyses of dynamic systems governed by maximal monotone operators [ ] and nonsmooth convex variational signal recovery [ ], defined as follows: given the current iterate x_n ∈ H and step size α_n ∈ [0, 1], the iterate x_{n+1} of the algorithm is x_{n+1} := x_n + α_n(T(x_n) − x_n)


Summary

Introduction

Problems of finding the zeros of monotone operators [ ] can be formulated as the fixed point problem stated above. There are useful algorithms for solving this problem, such as the Krasnosel’skiĭ-Mann algorithm ([ ], Subchapter; [ , ]) and the hybrid method [ ] (Solodov and Svaiter [ ] proposed the hybrid method to solve problems of finding the zeros of monotone operators). This paper focuses on the Krasnosel’skiĭ-Mann algorithm, which has practical applications, such as analyses of dynamic systems governed by maximal monotone operators [ ] and nonsmooth convex variational signal recovery [ ]. It is defined as follows: given the current iterate x_n ∈ H and a step size α_n ∈ [0, 1], the iterate x_{n+1} of the algorithm is x_{n+1} := x_n + α_n(T(x_n) − x_n).
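As a concrete illustration, here is a minimal Python sketch of the Krasnosel’skiĭ-Mann iteration with a constant step size (the scheme only requires α_n ∈ [0, 1], with the usual conditions on the step sizes for convergence guarantees). The example map T, a composition of two metric projections, is an assumption chosen for illustration: compositions of projections are nonexpansive, and their fixed points yield points in the intersection of the underlying convex sets, i.e., a simple convex feasibility problem.

```python
import numpy as np

def krasnoselskii_mann(T, x0, alpha=0.5, max_iter=1000, tol=1e-10):
    """Krasnosel'skii-Mann iteration: x_{n+1} = x_n + alpha_n * (T(x_n) - x_n).

    alpha_n is held constant here for simplicity; the general scheme
    allows any alpha_n in [0, 1] satisfying standard step-size conditions.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        tx = T(x)
        if np.linalg.norm(tx - x) < tol:
            break
        x = x + alpha * (tx - x)
    return x

# Illustrative feasibility problem: find a point in the intersection of the
# unit ball and the half-space-like set {x : x_i >= 0.5}. Both projections
# are nonexpansive, hence so is their composition T.
proj_ball = lambda x: x / max(1.0, np.linalg.norm(x))
proj_box = lambda x: np.clip(x, 0.5, None)
T = lambda x: proj_ball(proj_box(x))
print(krasnoselskii_mann(T, np.array([3.0, -2.0])))
```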
