Abstract

In this paper, we propose a spectral proximal method for solving sparse optimization problems. Sparse optimization refers to optimization problems involving the ℓ0-norm in the objective or the constraints. Previous research showed that the spectral gradient method outperforms other standard unconstrained optimization methods, because it replaces the full-rank matrix with a diagonal matrix, reducing the memory requirement from O(n²) to O(n). Since the ℓ0-norm term is nonconvex and nonsmooth, it cannot be handled by standard optimization algorithms. We consider the ℓ0-norm problem with an underdetermined linear system as its constraint. Using the Lagrange method, this problem is transformed into an unconstrained optimization problem. We propose a new method, the spectral proximal method, which combines the proximal method with the spectral gradient method, and apply it to the ℓ0-norm unconstrained optimization problem. The code is written in Python to compare the efficiency of the proposed method with some existing methods; the benchmarks of the comparison are the number of iterations, the number of function calls, and the computational time. Theoretically, the proposed method requires less storage and less computational time.
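To make the combination concrete, the sketch below pairs a spectral (Barzilai–Borwein-type) scalar step, which stands in for the diagonal Hessian approximation, with the proximal operator of the ℓ0-norm (hard thresholding) on a least-squares residual. This is a minimal illustration of the general idea, not the authors' implementation; the problem form min ½‖Ax − b‖² + λ‖x‖₀, the parameter names, and the stopping rule are all assumptions.

```python
import numpy as np

def hard_threshold(z, t):
    # Proximal operator of t * ||x||_0: zero out entries with |z_i| <= sqrt(2t).
    out = z.copy()
    out[np.abs(z) <= np.sqrt(2.0 * t)] = 0.0
    return out

def spectral_proximal(A, b, lam=0.05, max_iter=200, tol=1e-8):
    """Illustrative spectral proximal iteration for
    min 0.5*||Ax - b||^2 + lam*||x||_0 (assumed problem form)."""
    m, n = A.shape
    x = np.zeros(n)
    g = A.T @ (A @ x - b)        # gradient of the smooth part
    alpha = 1.0                  # spectral coefficient: scalar Hessian approx, O(n) memory
    for _ in range(max_iter):
        # Proximal step with step length 1/alpha.
        x_new = hard_threshold(x - g / alpha, lam / alpha)
        if np.linalg.norm(x_new - x) < tol:
            x = x_new
            break
        g_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, g_new - g
        # Barzilai-Borwein update of the spectral coefficient;
        # fall back to 1.0 if curvature information is unusable.
        sy = s @ y
        alpha = sy / (s @ s) if sy > 0 else 1.0
        x, g = x_new, g_new
    return x
```

The point of the scalar (or, more generally, diagonal) spectral coefficient is exactly the storage claim in the abstract: no n-by-n matrix is ever formed or factorized, so each iteration costs only matrix-vector products.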

Highlights

  • There has been an increased interest in the general field of sparsity in the past few years [1, 2, 3]

  • To fill in the gap, we propose to solve the residue by applying the spectral gradient method proposed by Sim et al. [29]

  • In order to compare the efficiency of the spectral proximal method with other methods, we developed executable code using Python 3.7


Summary

Introduction

There has been an increased interest in the general field of sparsity in the past few years [1, 2, 3]. Solving sparse optimization problems over underdetermined linear systems has become a popular research topic in the areas of compressive sensing, image processing, machine learning, and statistics [4, 5]. The ℓ0-norm plays an important and crucial role in modelling the sparsity of data and selecting representative variables in optimization problems. The ℓ1-relaxation is enormously popular due to its exact recovery property under some conditions [10]. This ℓ1-norm model has been widely used in many areas of application. The nonconvex ℓ0-norm-based regularization has advantages over the convex ℓ1-norm in fields such as image restoration [18], bioluminescence [19], CT [17], and MRI reconstruction [20].

Gradient-based optimization methods
Spectral gradient method
Proximal method
Spectral proximal method
Numerical experiments and discussion
Conclusions
