Abstract

In this paper, inspired by sparse principal component analysis (SPCA) via elastic net regularization, we propose a new criterion for sparsifying kernel principal component analysis (KPCA) with elastic net regularization that simultaneously accounts for data approximation and sparsification. We first show that KPCA can also be relaxed into a regression-type optimization problem with a quadratic penalty; an ℓ1-norm penalty can then be integrated into this regression criterion, leading to a new cost function. The minimization is carried out iteratively with the alternating direction method of multipliers (ADMM). Experimental results on toy examples and real-world data support the analysis.
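The abstract describes minimizing a regression-type cost with both an ℓ1-norm and a quadratic (elastic net) penalty via ADMM. As a minimal sketch of that optimization pattern, the following solves a generic elastic-net regression, 0.5‖Ax − b‖² + λ1‖x‖₁ + 0.5·λ2‖x‖², with ADMM; the function name, penalty weights, and test data are illustrative assumptions, not the paper's actual KPCA formulation.

```python
import numpy as np

def elastic_net_admm(A, b, lam1=0.1, lam2=0.1, rho=1.0, iters=200):
    """Minimize 0.5*||Ax - b||^2 + lam1*||x||_1 + 0.5*lam2*||x||^2 via ADMM.

    Illustrative sketch of the ADMM splitting x = z: the x-update is a
    ridge-type linear solve, the z-update is soft-thresholding, and u is
    the scaled dual variable.
    """
    n = A.shape[1]
    AtA = A.T @ A
    Atb = A.T @ b
    # Factor once: (A^T A + (lam2 + rho) I) is fixed across iterations.
    L = np.linalg.cholesky(AtA + (lam2 + rho) * np.eye(n))
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)
    for _ in range(iters):
        # x-update: solve the ridge-regularized least-squares subproblem.
        rhs = Atb + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))
        # z-update: soft-thresholding enforces sparsity (the l1 part).
        v = x + u
        z = np.sign(v) * np.maximum(np.abs(v) - lam1 / rho, 0.0)
        # Dual update.
        u = u + x - z
    return z

# Toy sparse-recovery example (synthetic data, illustrative only).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true + 0.01 * rng.standard_normal(50)
x_hat = elastic_net_admm(A, b, lam1=0.5, lam2=0.01)
```

The returned `z` iterate is exactly sparse because soft-thresholding sets small coordinates to zero, which is the sparsification effect the ℓ1 term is meant to provide.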
