Low-Rank Matrix Completion
25 papers with code • 0 benchmarks • 0 datasets
Low-Rank Matrix Completion is an important problem with several applications in areas such as recommendation systems, sketching, and quantum tomography. The goal in matrix completion is to recover a low-rank matrix from a small number of observed entries.
Source: Universal Matrix Completion
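As a concrete illustration of the task, here is a minimal sketch of one classical approach, iterative singular-value soft-thresholding in the style of SoftImpute (Mazumder et al., 2010): repeatedly fill in the missing entries with the current estimate, then shrink the singular values toward a low-rank solution. The matrix sizes, rank, sampling rate, and threshold `tau` below are illustrative choices, not values from any of the papers listed here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a rank-2 ground-truth matrix and reveal roughly half of its entries.
m, n, r = 30, 30, 2
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.5  # True where the entry is observed

# SoftImpute-style iteration: impute missing entries, then soft-threshold
# the singular values to push the estimate toward low rank.
X = np.where(mask, M, 0.0)
tau = 0.5  # shrinkage threshold (tuning parameter, chosen for illustration)
for _ in range(500):
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Z = U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt  # shrink the spectrum
    X = np.where(mask, M, Z)  # keep observed entries, impute the rest

# Relative error on the *unobserved* entries only.
rel_err = np.linalg.norm(np.where(mask, 0.0, Z - M)) / np.linalg.norm(M)
```

With enough observed entries relative to the rank, the unobserved entries are recovered accurately; convex formulations of this idea (nuclear-norm minimization) come with recovery guarantees under incoherence assumptions.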
Latest papers
Matrix Completion with Convex Optimization and Column Subset Selection
We present two algorithms that implement our Columns Selected Matrix Completion (CSMC) method, each tailored to a different problem size.
Linear Recursive Feature Machines provably recover low-rank matrices
A possible explanation is that common training algorithms for neural networks implicitly perform dimensionality reduction, a process called feature learning.
Efficient Compression of Overparameterized Deep Models through Low-Dimensional Learning Dynamics
We empirically evaluate the effectiveness of our compression technique on matrix recovery problems.
Teaching Arithmetic to Small Transformers
Even in the complete absence of pretraining, this approach significantly and simultaneously improves accuracy, sample complexity, and convergence speed.
Optimal Low-Rank Matrix Completion: Semidefinite Relaxations and Eigenvector Disjunctions
Low-rank matrix completion consists of computing a matrix of minimal complexity that recovers a given set of observations as accurately as possible.
Guaranteed Tensor Recovery Fused Low-rankness and Smoothness
Recent research has made significant progress by adopting two insightful tensor priors, i.e., global low-rankness (L) and local smoothness (S) across different tensor modes, which are typically encoded as a sum of two separate regularization terms in the recovery model.
Generalized Nonconvex Approach for Low-Tubal-Rank Tensor Recovery
The tensor-tensor product-induced tensor nuclear norm (t-TNN) minimization (Lu et al., 2020) for low-tubal-rank tensor recovery has recently attracted broad attention.
GNMR: A provable one-line algorithm for low rank matrix recovery
Low rank matrix recovery problems, including matrix completion and matrix sensing, appear in a broad range of applications.
A Scalable Second Order Method for Ill-Conditioned Matrix Completion from Few Samples
We propose an iterative algorithm for low-rank matrix completion that can be interpreted as an iteratively reweighted least squares (IRLS) algorithm, a saddle-escaping smoothing Newton method, or a variable metric proximal gradient method applied to a non-convex rank surrogate.
Simulation comparisons between Bayesian and de-biased estimators in low-rank matrix completion
In this paper, we study the low-rank matrix completion problem, a class of machine learning problems that aims at predicting the missing entries of a partially observed matrix.