Low-Rank Matrix Completion
25 papers with code • 0 benchmarks • 0 datasets
Low-Rank Matrix Completion is an important problem with several applications in areas such as recommendation systems, sketching, and quantum tomography. The goal in matrix completion is to recover a low-rank matrix, given a small number of entries of the matrix.
Source: Universal Matrix Completion
Benchmarks
These leaderboards are used to track progress in Low-Rank Matrix Completion
Latest papers with no code
Spectral Harmonics: Bridging Spectral Embedding and Matrix Completion in Self-Supervised Learning
Self-supervised methods have received tremendous attention thanks to their seemingly heuristic approach to learning representations that respect the semantics of the data without any apparent supervision in the form of labels.
A Majorization-Minimization Gauss-Newton Method for 1-Bit Matrix Completion
In 1-bit matrix completion, the aim is to estimate an underlying low-rank matrix from a partial set of binary observations.
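The 1-bit observation model can be illustrated in a few lines: each observed entry is a binary variable whose probability of being $+1$ depends on the underlying low-rank matrix, here through a logistic link (a common but assumed choice, not necessarily the one used in the paper above).

```python
import numpy as np

rng = np.random.default_rng(1)
n, m, k = 40, 30, 2

# Underlying low-rank matrix (scaled so entries are O(1)).
M = rng.normal(size=(n, k)) @ rng.normal(size=(k, m)) / np.sqrt(k)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Each entry is observed as +1 with probability sigmoid(M_ij), else -1.
Y = np.where(rng.random((n, m)) < sigmoid(M), 1, -1)
```

Estimation then amounts to maximizing the (non-quadratic) log-likelihood of `Y` over low-rank matrices, which is why specialized solvers such as majorization-minimization schemes are used instead of plain least squares.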
Low Rank Matrix Completion via Robust Alternating Minimization in Nearly Linear Time
Moreover, our algorithm runs in time $\widetilde O(|\Omega| k)$, which is nearly linear in the time to verify the solution while preserving the sample complexity.
Learning Transition Operators From Sparse Space-Time Samples
This Spatio-Temporal Transition Operator Recovery problem is motivated by the recent interest in learning time-varying graph signals that are driven by graph operators depending on the underlying graph topology.
Low-Rank Covariance Completion for Graph Quilting with Applications to Functional Connectivity
This leads to the Graph Quilting problem, as first introduced by (Vinci et al.
Online Low Rank Matrix Completion
In each round, the algorithm recommends one item per user, for which it gets a (noisy) reward sampled from a low-rank user-item preference matrix.
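The round structure described above can be sketched as a tiny simulation: a hidden low-rank user-item preference matrix, and a `play_round` helper that returns noisy rewards for one recommended item per user. All names and sizes here are hypothetical, for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)
n_users, n_items, k = 10, 15, 2

# Hidden low-rank preference matrix the algorithm never sees directly.
P = rng.normal(size=(n_users, k)) @ rng.normal(size=(k, n_items))

def play_round(recommended_items, noise_std=0.1):
    """recommended_items: one item index per user; returns noisy rewards."""
    base = P[np.arange(n_users), recommended_items]
    return base + noise_std * rng.normal(size=n_users)

# A naive policy: recommend a uniformly random item to each user.
rewards = play_round(rng.integers(0, n_items, size=n_users))
```

An online algorithm would use the observed `rewards` to refine an estimate of `P` and trade off exploration against recommending the currently best-looking items.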
Introducing the Huber mechanism for differentially private low-rank matrix completion
We also propose using the Iteratively Re-Weighted Least Squares algorithm to complete low-rank matrices and study the performance of different noise mechanisms in both synthetic and real datasets.
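Iteratively Re-Weighted Least Squares (IRLS) with Huber-style weights can be sketched on a plain linear regression subproblem (the same idea applies per row/column in a completion solver). The threshold `delta` and problem sizes are assumed values for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.normal(size=(100, 5))
x_true = rng.normal(size=5)
b = A @ x_true
b[:5] += 10.0  # inject a few gross outliers

delta = 1.0  # Huber threshold (assumed)
x = np.linalg.lstsq(A, b, rcond=None)[0]  # ordinary least-squares start
for _ in range(30):
    r = b - A @ x
    # Huber weights: 1 for small residuals, delta/|r| for large ones,
    # so outliers contribute a bounded pull on the solution.
    w = np.where(np.abs(r) <= delta,
                 1.0,
                 delta / np.maximum(np.abs(r), 1e-12))
    x = np.linalg.solve(A.T @ (w[:, None] * A), A.T @ (w * b))
```

Each iteration is just a weighted least-squares solve, which is why IRLS composes naturally with alternating-minimization completion schemes and with added privacy noise.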
Robust Matrix Completion with Heavy-tailed Noise
This paper studies low-rank matrix completion in the presence of heavy-tailed and possibly asymmetric noise, where we aim to estimate an underlying low-rank matrix given a set of highly incomplete noisy entries.
Bayesian Low-rank Matrix Completion with Dual-graph Embedding: Prior Analysis and Tuning-free Inference
Recently, there has been a revival of interest in low-rank matrix completion-based unsupervised learning through the lens of dual-graph regularization, which has significantly improved the performance of multidisciplinary machine learning tasks such as recommendation systems, genotype imputation, and image inpainting.
Adaptive Noisy Matrix Completion
In this paper we focus on adaptive matrix completion with a bounded type of noise.