Indefinite Kernel Logistic Regression with Concave-inexact-convex Procedure

6 Jul 2017  ·  Fanghui Liu, Xiaolin Huang, Chen Gong, Jie Yang, Johan A. K. Suykens ·

In kernel methods, kernels are often required to be positive definite, which precludes the use of many indefinite kernels. To accommodate such non-positive-definite kernels, this paper builds an indefinite kernel learning framework for kernel logistic regression. The proposed indefinite kernel logistic regression (IKLR) model is analysed in Reproducing Kernel Kreĭn Spaces (RKKS) and is therefore non-convex. Using the positive decomposition of a non-positive-definite kernel, the derived IKLR objective can be written as the difference of two convex functions, so a concave-convex procedure is introduced to solve the resulting non-convex optimization problem. Since the concave-convex procedure must solve a convex sub-problem in each iteration, we propose a concave-inexact-convex procedure (CCICP) with an inexact solving scheme that accelerates the overall optimization. In addition, we propose a stochastic variant of CCICP that efficiently obtains a proximal solution, serving a similar purpose to the inexact scheme in CCICP. Convergence analyses are conducted for both variants, so the method works effectively under both deterministic and stochastic settings. Experimental results on several benchmarks suggest that the proposed IKLR model performs favorably against standard (positive-definite) kernel logistic regression and other competitive indefinite-learning algorithms.
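The core idea in the abstract can be illustrated numerically: eigendecompose the indefinite kernel as K = K₊ − K₋ with both parts positive semidefinite, then alternate between linearizing the concave part and taking a few (inexact) gradient steps on the convex surrogate. The sketch below is a minimal illustration of that idea under assumed hyperparameters; the function name, step sizes, and loop counts are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def ccicp_iklr(K, y, lam=0.1, outer=20, inner=5, lr=0.01):
    """Illustrative CCICP-style solver for indefinite kernel logistic regression.

    K : (n, n) symmetric, possibly indefinite kernel matrix
    y : (n,) labels in {-1, +1}
    """
    # Positive decomposition: K = K_plus - K_minus, both PSD.
    w, V = np.linalg.eigh(K)
    K_plus = (V * np.clip(w, 0, None)) @ V.T
    K_minus = (V * np.clip(-w, 0, None)) @ V.T
    alpha = np.zeros(len(y))
    for _ in range(outer):
        # Linearize the concave term -lam/2 * a^T K_minus a at the current alpha.
        g_lin = lam * K_minus @ alpha
        for _ in range(inner):  # inexact convex sub-problem: a few gradient steps
            f = K @ alpha                      # decision values
            p = 1.0 / (1.0 + np.exp(y * f))    # sigmoid(-y * f)
            grad = -K @ (y * p) + lam * K_plus @ alpha - g_lin
            alpha -= lr * grad
    return alpha
```

Because the concave part is replaced by its linearization, each sub-problem is convex; solving it only approximately (a handful of gradient steps) is what the "inexact" in CCICP refers to.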
