Low-Rank Matrix Completion
25 papers with code • 0 benchmarks • 0 datasets
Low-Rank Matrix Completion is an important problem with applications in areas such as recommendation systems, sketching, and quantum tomography. The goal in matrix completion is to recover a low-rank matrix from a small number of its observed entries.
Source: Universal Matrix Completion
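As a minimal illustration of the task (not the method of any paper listed below), the following sketch recovers a synthetic low-rank matrix from roughly half of its entries using alternating least squares, one common baseline for matrix completion; the ridge term `lam` and all sizes are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 30, 20, 2

# Ground-truth rank-r matrix and a random observation mask (~50% observed).
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.5

# Random factor initialization; lam is a small ridge term for stability.
U = rng.standard_normal((m, r))
V = rng.standard_normal((n, r))
lam = 1e-3

for _ in range(50):
    # Fix V, solve a small ridge least-squares problem for each row of U,
    # using only the observed entries in that row.
    for i in range(m):
        idx = mask[i]
        A = V[idx]
        U[i] = np.linalg.solve(A.T @ A + lam * np.eye(r), A.T @ M[i, idx])
    # Fix U, solve for each row of V using the observed entries in that column.
    for j in range(n):
        idx = mask[:, j]
        A = U[idx]
        V[j] = np.linalg.solve(A.T @ A + lam * np.eye(r), A.T @ M[idx, j])

M_hat = U @ V.T
rel_err = np.linalg.norm(M_hat - M) / np.linalg.norm(M)
print(rel_err)
```

With enough observed entries relative to the rank, the relative recovery error drops to near zero; the convex nuclear-norm relaxation is the other standard baseline for the same task.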
Latest papers
Mixed Membership Graph Clustering via Systematic Edge Query
This work aims at learning mixed membership of nodes using queried edges.
Escaping Saddle Points in Ill-Conditioned Matrix Completion with a Scalable Second Order Method
We propose an iterative algorithm for low-rank matrix completion that can be interpreted as both an iteratively reweighted least squares (IRLS) algorithm and a saddle-escaping smoothing Newton method applied to a non-convex rank surrogate objective.
Deep Generalization of Structured Low-Rank Algorithms (Deep-SLR)
The main challenge with this strategy is the high computational complexity of matrix completion.
Structured Low-Rank Algorithms: Theory, MR Applications, and Links to Machine Learning
In this survey, we provide a detailed review of recent advances in the recovery of continuous domain multidimensional signals from their few non-uniform (multichannel) measurements using structured low-rank matrix completion formulation.
Adaptive Matrix Completion for the Users and the Items in Tail
In this work, we show that the skewed distribution of ratings in the user-item rating matrix of real-world datasets affects the accuracy of matrix-completion-based approaches.
Provable Subspace Tracking from Missing Data and Matrix Completion
In this work, we show that a simple modification of our robust subspace tracking (ST) solution also provably solves ST-miss and robust ST-miss.
Algebraic Variety Models for High-Rank Matrix Completion
We consider a generalization of low-rank matrix completion to the case where the data belongs to an algebraic variety, i.e., each data point is a solution to a system of polynomial equations.
Riemannian stochastic variance reduced gradient algorithm with retraction and vector transport
In recent years, stochastic variance reduction algorithms have attracted considerable attention for minimizing the average of a large but finite number of loss functions.
Riemannian stochastic variance reduced gradient on Grassmann manifold
In this paper, we propose a novel Riemannian extension of the Euclidean stochastic variance reduced gradient algorithm (R-SVRG) to a compact manifold search space.
Depth Image Inpainting: Improving Low Rank Matrix Completion with Low Gradient Regularization
The proposed low-gradient regularization is combined with low-rank regularization into a low-rank, low-gradient approach for depth image inpainting.