Abstract

A convex minimization problem in the form of the sum of two proper lower-semicontinuous convex functions has received much attention from the optimization community owing to its broad applications in many disciplines, such as machine learning, regression and classification problems, image and signal processing, compressed sensing, and optimal control. Many methods have been proposed to solve such problems, but most of them rely on the assumption that the derivative of one of the two functions is Lipschitz continuous. In this work, we introduce a new accelerated algorithm for solving the aforementioned convex minimization problem by combining a linesearch technique with a viscosity inertial forward–backward algorithm (VIFBA). A strong convergence result for the proposed method is obtained under suitable control conditions. As applications, we apply the proposed method to regression and classification problems via an extreme learning machine model. Moreover, we show that our algorithm is more efficient and exhibits better convergence behavior than some algorithms in the literature.

Highlights

  • In this work, we deal with a convex minimization problem, which can be formulated as min{ f(x) + g(x) : x ∈ H }  (1), where f, g : H → ℝ ∪ {+∞} are proper lower-semicontinuous convex functions and H is a Hilbert space

  • We are motivated to establish a novel accelerated algorithm for solving the convex minimization problem (1), which employs a linesearch technique introduced by Cruz and Nghia [20] together with the viscosity inertial forward–backward algorithm (VIFBA) [19]; the classical forward–backward iteration underlying these methods is sketched after this list

  • We prove a strong convergence theorem for the proposed algorithm under weaker assumptions on the control conditions than those of VIFBA
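
For context, the classical forward–backward (proximal gradient) iteration that underlies both VIFBA and the proposed method can be sketched as follows. This is a minimal standard formulation, not the paper's exact algorithm statement; the step size γ_k is a placeholder for whatever rule is used (a Lipschitz-based constant, or a linesearch such as that of Cruz and Nghia when ∇f is not assumed Lipschitz continuous).

```latex
% Classical forward--backward step for problem (1):
%   a forward (gradient) step on the smooth part f,
%   then a backward (proximal) step on the nonsmooth part g.
\[
  x_{k+1} = \operatorname{prox}_{\gamma_k g}\!\bigl(x_k - \gamma_k \nabla f(x_k)\bigr),
\]
% where the proximal operator of g with parameter \gamma > 0 is
\[
  \operatorname{prox}_{\gamma g}(y)
    = \operatorname*{arg\,min}_{u \in H}
      \Bigl\{\, g(u) + \tfrac{1}{2\gamma}\lVert u - y \rVert^{2} \Bigr\}.
\]
```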

Summary

Introduction

We are dealing with a convex minimization problem, which can be formulated as min{ f(x) + g(x) : x ∈ H }  (1). One well-known method that has significantly improved the convergence rate of (3) is the fast iterative shrinkage-thresholding algorithm (FISTA), proposed by Beck and Teboulle [13] and presented as Algorithm 1. Most work related to convex minimization problems assumes the L-Lipschitz continuity of ∇f; this restriction can be relaxed by using a linesearch technique. We are motivated to establish a novel accelerated algorithm for solving the convex minimization problem (1), which employs a linesearch technique introduced by Cruz and Nghia [20] together with VIFBA [19].
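
As a concrete illustration, below is a minimal Python sketch of standard FISTA for the LASSO instance of (1), with f(x) = ½‖Ax − b‖² and g(x) = λ‖x‖₁, whose proximal step is soft-thresholding. This is the classical Beck–Teboulle scheme for reference only, not the algorithm proposed in this paper; the function names and the constant step size 1/L are illustrative assumptions, and the reliance on the Lipschitz constant L is exactly what a linesearch removes.

```python
import numpy as np

def soft_threshold(y, t):
    """Proximal operator of t*||.||_1 (soft-thresholding)."""
    return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)

def fista(A, b, lam, n_iter=200):
    """Standard FISTA sketch for min_x 0.5*||Ax - b||^2 + lam*||x||_1."""
    L = np.linalg.norm(A.T @ A, 2)   # Lipschitz constant of grad f
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        # Forward (gradient) step on f, backward (prox) step on g.
        x_next = soft_threshold(y - (A.T @ (A @ y - b)) / L, lam / L)
        # Nesterov-type inertial extrapolation.
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x
```

For example, x_hat = fista(A, b, lam=0.1) returns a sparse approximate minimizer; replacing the fixed 1/L step with a linesearch-selected step is the direction pursued in this work.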

Preliminaries
Main Results
Applications to Data Classification and Regression Problems
Regression of a Sine Function
Data Classification
Conclusions