Abstract

This paper addresses analog optimization for non-differentiable functions. The Lagrange programming neural network (LPNN) approach provides a systematic way to build analog neural networks for constrained optimization problems. However, it cannot handle non-differentiable functions. In compressive sampling, one of the key optimization problems is the least absolute shrinkage and selection operator (LASSO), whose formulation involves a non-differentiable term. This paper adopts the hidden-state concept from the local competition algorithm (LCA) to formulate an analog model for the LASSO problem, thereby overcoming the non-differentiability limitation of LPNN. Under some conditions, the equilibrium points of the network correspond to the optimal solution of the LASSO problem, and we prove that these equilibrium points are stable. Simulations show that the proposed analog model and the traditional digital method achieve similar mean squared error performance.
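For illustration, below is a minimal sketch of the standard LCA hidden-state dynamics for the LASSO problem min_x 0.5*||b - A x||^2 + lambda*||x||_1, which is the idea the abstract builds on. This is not the paper's exact analog model; the function names, step size, and iteration count are illustrative assumptions, and the continuous-time circuit is approximated here by Euler integration.

```python
import numpy as np

def soft_threshold(u, lam):
    # Activation mapping the hidden state u to the output x:
    # shrinks each entry toward zero by lam.
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def lca_lasso(A, b, lam, dt=0.01, steps=5000):
    # Euler simulation (illustrative) of the standard LCA dynamics for
    #   min_x 0.5 * ||b - A x||^2 + lam * ||x||_1.
    # The hidden state u evolves continuously; the visible output is
    # x = soft_threshold(u, lam), which handles the non-differentiable
    # l1 term without requiring a gradient of |x|.
    n = A.shape[1]
    u = np.zeros(n)
    G = A.T @ A - np.eye(n)   # lateral inhibition term
    c = A.T @ b               # feedforward drive
    for _ in range(steps):
        x = soft_threshold(u, lam)
        du = -u + c - G @ x   # LCA state equation
        u = u + dt * du
    return soft_threshold(u, lam)
```

At an equilibrium point (du/dt = 0), the output x satisfies the optimality condition of the LASSO objective, which is the property the paper establishes for its analog model.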
