Nonconvex Robust Low-rank Matrix Recovery

24 Sep 2018 · Xiao Li, Zhihui Zhu, Anthony Man-Cho So, René Vidal

In this paper we study the problem of recovering a low-rank matrix from a number of random linear measurements that are corrupted by outliers taking arbitrary values. We consider a nonsmooth nonconvex formulation of the problem, in which we enforce the low-rank property explicitly by using a factored representation of the matrix variable and employ an $\ell_1$-loss function to robustify the solution against outliers. Under the Gaussian measurement model, we show that even when a constant fraction (which can be up to almost half) of the information-theoretically optimal number of measurements is arbitrarily corrupted, the resulting optimization problem is sharp and weakly convex. Consequently, we show that when initialized close to the set of global minima of the problem, a SubGradient Method (SubGM) with geometrically diminishing step sizes converges linearly to the ground-truth matrix. We demonstrate the performance of the SubGM for the nonconvex robust low-rank matrix recovery problem through various numerical experiments.
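The scheme described above can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: it assumes the symmetric factored parametrization $X = UU^\top$ (one instance of the factored representation mentioned in the abstract), Gaussian sensing matrices $A_i$, and problem sizes, outlier levels, and step-size constants ($\mu_0$, $q$) chosen purely for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (our own choices, not from the paper)
n, r, m = 10, 2, 200      # matrix dimension, rank, number of measurements
p_out = 0.1               # fraction of arbitrarily corrupted measurements

# Ground truth X* = U* U*^T and Gaussian linear measurements y_i = <A_i, X*>
U_star = rng.standard_normal((n, r))
X_star = U_star @ U_star.T
A = rng.standard_normal((m, n, n))
y = np.einsum('kij,ij->k', A, X_star)

# Corrupt a constant fraction of the measurements with arbitrary outliers
out = rng.choice(m, size=int(p_out * m), replace=False)
y[out] = 10.0 * rng.standard_normal(out.size)

# SubGM on f(U) = (1/m) ||A(U U^T) - y||_1 with geometrically
# diminishing step sizes mu_k = mu_0 * q^k, initialized near the truth
U = U_star + 0.1 * rng.standard_normal((n, r))
err0 = np.linalg.norm(U @ U.T - X_star) / np.linalg.norm(X_star)
mu0, q = 0.1, 0.98        # untuned illustrative constants
for k in range(500):
    resid = np.einsum('kij,ij->k', A, U @ U.T) - y
    # Subgradient of the l1 loss: (1/m) * sum_i sign(resid_i) * (A_i + A_i^T) U
    G = np.einsum('k,kij->ij', np.sign(resid), A) / m
    U = U - mu0 * q**k * (G + G.T) @ U

err = np.linalg.norm(U @ U.T - X_star) / np.linalg.norm(X_star)
```

Note that the $\ell_1$ loss is nonsmooth, so `np.sign(resid)` selects one valid subgradient; the geometric decay of the step size is what yields the linear convergence claimed in the abstract, in contrast to the sublinear rates of classical diminishing-step subgradient schemes.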


Categories

Information Theory
