Sparse PCA through Low-rank Approximations

3 Mar 2013 · Dimitris S. Papailiopoulos, Alexandros G. Dimakis, Stavros Korokythakis

We introduce a novel algorithm that computes the $k$-sparse principal component of a positive semidefinite matrix $A$. Our algorithm is combinatorial and operates by examining a discrete set of special vectors lying in a low-dimensional eigen-subspace of $A$. We obtain provable approximation guarantees that depend on the spectral decay profile of the matrix: the faster the eigenvalue decay, the better the quality of our approximation. For example, if the eigenvalues of $A$ follow a power-law decay, we obtain a polynomial-time approximation algorithm for any desired accuracy. A key algorithmic component of our scheme is a combinatorial feature elimination step that is provably safe and in practice significantly reduces the running time of our algorithm. We implement our algorithm and test it on multiple artificial and real data sets. Due to the feature elimination step, it is possible to perform sparse PCA on data sets consisting of millions of entries in a few minutes. Our experimental evaluation shows that our scheme is nearly optimal while finding very sparse vectors. We compare against the prior state of the art and show that our scheme matches or outperforms previous algorithms on all tested data sets.
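To make the low-rank idea concrete, here is a minimal sketch of the general scheme described in the abstract: work with the rank-$d$ approximation of $A$, scan candidate directions in its $d$-dimensional leading eigen-subspace, take the top-$k$ coordinates of each candidate as a support, and score the best unit vector on that support. Note the assumptions: the paper enumerates a specific deterministic candidate set (and adds a safe feature elimination step), whereas this sketch simply samples random directions; the function name `sparse_pca_lowrank` and its parameter defaults are illustrative, not from the paper.

```python
import numpy as np

def sparse_pca_lowrank(A, k, d=2, n_candidates=2000, seed=0):
    """Approximate the k-sparse principal component of a symmetric PSD matrix A.

    Sketch only: candidates are random directions in the rank-d leading
    eigen-subspace of A, standing in for the paper's deterministic candidate set.
    """
    rng = np.random.default_rng(seed)
    # Leading d eigenpairs of A (A is assumed symmetric PSD).
    evals, evecs = np.linalg.eigh(A)
    idx = np.argsort(evals)[::-1][:d]
    lam = np.clip(evals[idx], 0.0, None)   # guard against tiny negative eigenvalues
    V = evecs[:, idx]                      # n x d basis of the leading eigen-subspace

    best_val, best_x = -np.inf, None
    for _ in range(n_candidates):
        c = rng.standard_normal(d)
        c /= np.linalg.norm(c)
        y = V @ (np.sqrt(lam) * c)                 # candidate vector in the eigen-subspace
        support = np.argsort(np.abs(y))[-k:]       # top-k coordinates by magnitude
        # Best unit vector restricted to this support: leading eigenvector
        # of the k x k principal submatrix of A.
        sub = A[np.ix_(support, support)]
        w_vals, w_vecs = np.linalg.eigh(sub)
        if w_vals[-1] > best_val:
            x = np.zeros(A.shape[0])
            x[support] = w_vecs[:, -1]
            best_val, best_x = w_vals[-1], x
    return best_x, best_val

# Example usage on a random PSD matrix (illustration only):
rng = np.random.default_rng(1)
B = rng.standard_normal((200, 50))
A = B.T @ B                                # 50 x 50 PSD matrix
x, val = sparse_pca_lowrank(A, k=5, d=2)   # x has at most 5 nonzeros; val = x^T A x
```

Per the abstract, the quality of such a low-rank scheme depends on how fast the eigenvalues of $A$ decay: the closer $A$ is to rank $d$, the less is lost by searching only within the leading eigen-subspace.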
