Recently, there has been considerable focus on penalized least squares problems for noisy sparse signal estimation. The penalty induces sparsity, and a very common choice has been the convex l1 norm. However, to improve sparsity and reduce the bias associated with the l1 norm, one must move to non-convex penalties such as the lq norm. In this paper we present a novel cyclic descent algorithm for optimizing the resulting lq penalized least squares problem. Optimality conditions for this problem are derived, and competing conditions from the literature are clarified. Coordinate-wise convergence of the algorithm, as well as its convergence to a local minimizer (a highly non-trivial result), is proved, and we illustrate the method with simulations comparing signal reconstruction quality under three penalty functions: l0, l1, and lq with 0 < q < 1.
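To make the setting concrete, the following is a minimal sketch of cyclic (coordinate-wise) descent on the objective 0.5*||y - A x||^2 + lam * sum_i |x_i|^q with 0 < q < 1. It is not the paper's algorithm: the scalar subproblem solver (a fixed-point iteration on the positive branch, compared against the origin since the subproblem is non-convex), the function names, and all parameter choices are illustrative assumptions.

```python
import numpy as np

def lq_scalar_prox(z, lam, q, c=1.0, iters=50):
    """Approximately solve  min_t 0.5*c*(t - z)**2 + lam*|t|**q  for 0 < q < 1.

    Illustrative assumption: fixed-point iteration on the stationarity
    condition for t > 0, with the result compared against t = 0 because
    the scalar subproblem is non-convex.
    """
    if z == 0.0:
        return 0.0
    s, a = np.sign(z), abs(z)
    t = a  # start from the unpenalized minimizer |z|
    for _ in range(iters):
        # stationarity on the positive branch: c*(t - a) + lam*q*t**(q-1) = 0
        t_new = a - (lam * q / c) * t ** (q - 1.0)
        if t_new <= 0.0:  # iteration left the positive branch: no nonzero root
            return 0.0
        t = t_new
    # keep the nonzero stationary point only if it beats the origin
    f_t = 0.5 * c * (t - a) ** 2 + lam * t ** q
    f_0 = 0.5 * c * a ** 2
    return s * t if f_t < f_0 else 0.0

def cyclic_descent_lq(A, y, lam, q, sweeps=100):
    """Cyclic coordinate descent for
    min_x 0.5*||y - A x||^2 + lam * sum_i |x_i|**q  (0 < q < 1)."""
    n = A.shape[1]
    x = np.zeros(n)
    r = y - A @ x                       # running residual
    col_sq = (A ** 2).sum(axis=0)       # ||a_j||^2 for each column
    for _ in range(sweeps):
        for j in range(n):
            a_j = A[:, j]
            r += a_j * x[j]             # remove coordinate j from the residual
            z = (a_j @ r) / col_sq[j]   # unpenalized optimum for coordinate j
            x[j] = lq_scalar_prox(z, lam, q, c=col_sq[j])
            r -= a_j * x[j]             # fold the updated coordinate back in
    return x
```

As in the paper's setting, each sweep solves a one-dimensional lq-penalized problem per coordinate; because that scalar problem is non-convex, any correct update must compare the nonzero stationary point against zero, which is where the subtlety of the convergence analysis originates.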