Abstract

The paper proposes an efficient method for solving a one-norm equality constrained optimization problem, which is nonconvex. First, the problem is formulated as a least absolute shrinkage and selection operator (LASSO) optimization problem. This LASSO problem is then solved by an iterative shrinkage algorithm such as the fast iterative shrinkage thresholding algorithm (FISTA). Next, the LASSO solution is used to formulate the constraint of a corresponding least-squares constrained optimization problem, whose solution is taken as a near globally optimal solution of the original one-norm equality constrained problem. The main advantage of the proposed method is that it yields a solution with both a lower one-norm constraint error and a lower two-norm reconstruction error than the LASSO solution, while requiring significantly less computation than a full search. Computer numerical simulation results are presented.
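The abstract outlines a two-stage pipeline: a FISTA solve of a LASSO surrogate, followed by a least-squares problem whose constraint is built from the LASSO solution. The sketch below illustrates one plausible reading of that pipeline in Python; the regularization weight lam, the target one-norm value s, and the use of the LASSO sign pattern to turn the one-norm equality constraint into a linear equality constraint are assumptions made for illustration, not the paper's exact formulation.

```python
import numpy as np

def soft_threshold(v, t):
    """Element-wise soft-thresholding (proximal operator of the one-norm)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def fista_lasso(A, b, lam, n_iter=500):
    """Stage 1: solve min_x 0.5*||Ax - b||_2^2 + lam*||x||_1 with FISTA."""
    L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth term
    x = np.zeros(A.shape[1])
    y, t = x.copy(), 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - grad / L, lam / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

def sign_constrained_ls(A, b, x_lasso, s):
    """Stage 2 (assumed formulation): min_x ||Ax - b||_2^2 subject to
    sign(x_lasso)^T x = s, a linearization of ||x||_1 = s that is exact
    whenever x keeps the sign pattern of the LASSO solution."""
    c = np.sign(x_lasso)
    n = A.shape[1]
    # KKT system of the equality-constrained least-squares problem.
    K = np.block([[A.T @ A, c[:, None]],
                  [c[None, :], np.zeros((1, 1))]])
    rhs = np.concatenate([A.T @ b, [s]])
    return np.linalg.solve(K, rhs)[:n]

# Toy usage with hypothetical parameter values (overdetermined A so the
# KKT system is well posed).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
b = rng.standard_normal(50)
x1 = fista_lasso(A, b, lam=0.1)
x2 = sign_constrained_ls(A, b, x1, s=np.sum(np.abs(x1)))
```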
