Abstract

In kernel methods, kernels are usually required to be positive definite, which restricts the use of many indefinite kernels. To accommodate such nonpositive definite kernels, in this paper we build an indefinite kernel learning framework for kernel logistic regression (KLR). The proposed indefinite KLR (IKLR) model is analyzed in reproducing kernel Kreĭn spaces, which makes the resulting optimization problem nonconvex. Using the positive decomposition of a nonpositive definite kernel, the IKLR objective can be decomposed into the difference of two convex functions. Accordingly, a concave-convex procedure (CCCP) is introduced to solve the nonconvex optimization problem. Since CCCP must solve a convex subproblem at each iteration, we propose a concave-inexact-convex procedure (CCICP) that solves each subproblem only inexactly, accelerating the overall optimization. In addition, we propose a stochastic variant of CCICP that efficiently obtains an approximate solution, serving a similar purpose to the inexact solving scheme in CCICP. Convergence analyses are conducted for both variants of CCCP. As a result, our method works effectively not only in a deterministic setting but also in a stochastic setting. Experimental results on several benchmarks suggest that the proposed IKLR model performs favorably against the standard (positive definite) KLR and other competitive indefinite learning-based algorithms.
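To make the pipeline concrete, below is a minimal Python sketch of the two ideas the abstract describes: an eigendecomposition-based positive decomposition K = K₊ − K₋ of an indefinite Gram matrix, and a CCCP-style outer loop that linearizes the concave part of a regularized KLR objective at each iteration and solves the remaining convex subproblem. The objective form (logistic loss plus a quadratic regularizer in the kernel expansion coefficients), the function names (`positive_decomposition`, `iklr_cccp`), and the parameters (`lam`, `n_outer`, `tol`) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit


def positive_decomposition(K):
    """Split a symmetric indefinite kernel matrix into K = K_plus - K_minus,
    where both parts are positive semidefinite (via eigendecomposition)."""
    w, V = np.linalg.eigh(K)
    K_plus = (V * np.clip(w, 0.0, None)) @ V.T
    K_minus = (V * np.clip(-w, 0.0, None)) @ V.T
    return K_plus, K_minus


def iklr_cccp(K, y, lam=1.0, n_outer=20, tol=1e-6):
    """CCCP-style solver for an assumed IKLR objective
        J(a) = sum_i log(1 + exp(-y_i (K a)_i)) + (lam/2) a^T K a,
    with indefinite K. Writing K = K_plus - K_minus, J is a difference
    of convex functions; each outer step linearizes the concave part
    -(lam/2) a^T K_minus a at the current iterate and minimizes the
    resulting convex subproblem with L-BFGS."""
    K_plus, K_minus = positive_decomposition(K)
    a = np.zeros(len(y))
    for _ in range(n_outer):
        g_lin = lam * (K_minus @ a)  # gradient of the linearized concave part

        def subproblem(b):
            m = y * (K @ b)                       # classification margins
            loss = np.sum(np.logaddexp(0.0, -m))  # logistic loss
            return loss + 0.5 * lam * b @ K_plus @ b - g_lin @ b

        def subgrad(b):
            m = y * (K @ b)
            p = expit(-m)                         # sigmoid(-margin)
            return -K @ (p * y) + lam * (K_plus @ b) - g_lin

        res = minimize(subproblem, a, jac=subgrad, method="L-BFGS-B")
        if np.linalg.norm(res.x - a) < tol:
            a = res.x
            break
        a = res.x
    return a


# Toy usage: a tanh ("sigmoid") kernel is in general indefinite.
rng = np.random.default_rng(0)
X = rng.normal(size=(80, 5))
y = np.sign(X[:, 0] + 0.3 * rng.normal(size=80))
K = np.tanh(X @ X.T)                  # indefinite Gram matrix
alpha = iklr_cccp(K, y, lam=0.1)
train_acc = np.mean(np.sign(K @ alpha) == y)
```

In CCICP terms, the inexact solving scheme would correspond to not solving each subproblem to convergence, e.g., by capping the inner L-BFGS iterations via `options={"maxiter": ...}`, while the stochastic variant would replace the full-gradient inner solve with minibatch updates; both are sketched here only by analogy.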
