Abstract

Many regression problems involve a large number of input features, not all of which are relevant to the task at hand; including irrelevant features can degrade learning performance. Selecting the most relevant features is therefore essential, especially in high-dimensional regression. Feature selection addresses this problem by representing the original data through the relevant features that carry useful information. In this paper, aiming to select useful features effectively in least squares support vector regression (LSSVR), we propose a novel sparse LSSVR based on the $L_p$-norm (SLSSVR), $0 < p \le 1$. Unlike the existing $L_1$-norm LSSVR ($L_1$-LSSVR) and $L_p$-norm LSSVR ($L_p$-LSSVR), SLSSVR uses a smooth approximation of the nonsmooth, nonconvex $L_p$-norm term together with an effective solving algorithm. The proposed algorithm avoids the singularity issue that may be encountered in $L_p$-LSSVR, and its convergence is guaranteed. Experimental results support the effectiveness of SLSSVR in both feature selection ability and regression performance.
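The abstract does not spell out which smooth approximation SLSSVR uses, so the following is only a minimal sketch of a standard smoothing of this kind: each nonsmooth term $|w_i|^p$ is replaced by the everywhere-differentiable surrogate $(w_i^2 + \varepsilon)^{p/2}$, whose gradient stays finite at $w_i = 0$, the point where the raw $L_p$ term is singular. The function names and the $\varepsilon$-smoothing itself are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def lp_norm_smooth(w, p=0.5, eps=1e-6):
    """Smooth surrogate for the nonconvex L_p quasi-norm sum_i |w_i|^p
    (0 < p <= 1). NOTE: the epsilon-smoothing (w_i^2 + eps)^(p/2) is an
    illustrative assumption, not necessarily the paper's exact choice."""
    return np.sum((w**2 + eps) ** (p / 2))

def lp_norm_smooth_grad(w, p=0.5, eps=1e-6):
    """Gradient of the surrogate: p * w_i * (w_i^2 + eps)^(p/2 - 1).
    Well defined even at w_i = 0, where |w_i|^p itself is singular."""
    return p * w * (w**2 + eps) ** (p / 2 - 1)

# Usage: both the surrogate and its gradient stay finite at zero coefficients.
w = np.array([0.0, 0.3, -1.2])
print(lp_norm_smooth(w))        # close to |0|^0.5 + |0.3|^0.5 + |1.2|^0.5
print(lp_norm_smooth_grad(w))   # no division by zero at w[0] = 0
```

As $\varepsilon \to 0$ the surrogate recovers the $L_p$ term, so solvers built on this kind of smoothing typically shrink $\varepsilon$ across iterations while the smooth gradient keeps each subproblem well posed.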
