R1-PCA: Rotational Invariant L1-norm Principal Component Analysis for Robust Subspace Factorization

Principal component analysis (PCA) minimizes the sum of squared errors (L2-norm) and is sensitive to the presence of outliers. We propose a rotational invariant L1-norm PCA (R1-PCA). R1-PCA is similar to PCA in that (1) it has a unique global solution, (2) the solution consists of the principal eigenvectors of a robust covariance matrix (re-weighted to soften the effects of outliers), and (3) the solution is rotational invariant. These properties are not shared by the L1-norm PCA. A new subspace iteration algorithm is given to compute R1-PCA efficiently. Experiments on several real-life datasets show that R1-PCA can effectively handle outliers. We extend the R1-norm to K-means clustering and show that L1-norm K-means leads to poor results while R1-K-means outperforms standard K-means.
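As a rough illustration of the re-weighted covariance idea mentioned in the abstract, the sketch below shows an IRLS-style iteration in Python: each point is weighted by the inverse norm of its residual from the current subspace, and the subspace is refreshed from the top-k eigenvectors of the weighted covariance. This is only an intuition-building approximation, not the paper's exact subspace iteration algorithm; the function name, the weighting rule, and the PCA initialization are illustrative assumptions.

```python
import numpy as np

def r1_pca_sketch(X, k, n_iter=50, eps=1e-8):
    """Illustrative R1-PCA-style iteration (IRLS approximation, not the paper's algorithm).

    X : (n_samples, n_features) data matrix, assumed already centered.
    k : target subspace dimension.
    Points are weighted by 1 / ||x_i - U U^T x_i||, which softens the
    influence of outliers on the covariance estimate.
    """
    # Initialize with ordinary PCA (top-k right singular vectors).
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    U = Vt[:k].T                                  # (d, k) orthonormal basis

    for _ in range(n_iter):
        # Residual of each point from its projection onto span(U).
        R = X - (X @ U) @ U.T
        res_norm = np.linalg.norm(R, axis=1)
        w = 1.0 / np.maximum(res_norm, eps)       # down-weight large residuals
        # Re-weighted covariance and its top-k eigenvectors.
        C = (X * w[:, None]).T @ X
        _, eigvecs = np.linalg.eigh(C)            # eigenvalues ascending
        U = eigvecs[:, -k:]                       # columns span the new subspace
    return U
```

In this sketch, points far from the current subspace receive small weights, so a single outlier contributes far less to the covariance than it would under the squared-error (L2-norm) objective of standard PCA.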
