Online Matrix Completion with Side Information

NeurIPS 2020 · Mark Herbster, Stephen Pasteris, Lisa Tse

We give an online algorithm and prove novel mistake and regret bounds for online binary matrix completion with side information. The mistake bounds we prove are of the form $\tilde{O}(D/\gamma^2)$. The term $1/\gamma^2$ is analogous to the usual margin term in SVM (perceptron) bounds. More specifically, if we assume that there is some factorization of the underlying $m \times n$ matrix into $P Q^\intercal$, where the rows of $P$ are interpreted as "classifiers" in $\mathbb{R}^d$ and the rows of $Q$ as "instances" in $\mathbb{R}^d$, then $\gamma$ is the maximum (normalized) margin over all factorizations $P Q^\intercal$ consistent with the observed matrix. The quasi-dimension term $D$ measures the quality of side information. In the presence of vacuous side information, $D = m+n$. However, if the side information is predictive of the underlying factorization of the matrix, then in an ideal case, $D \in O(k + \ell)$ where $k$ is the number of distinct row factors and $\ell$ is the number of distinct column factors. We additionally provide a generalization of our algorithm to the inductive setting. In this setting, we provide an example where the side information is not directly specified in advance. For this example, the quasi-dimension $D$ is now bounded by $O(k^2 + \ell^2)$.
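To make the margin quantity concrete, here is a small sketch (not the paper's algorithm) of computing the normalized margin of one particular factorization $P Q^\intercal$ of a sign matrix $M$: rows of $P$ and $Q$ are normalized to unit length, and the margin is the smallest signed agreement $M_{ij} \, \hat{p}_i \cdot \hat{q}_j$ over all entries. The toy matrices below are illustrative choices, and the paper's $\gamma$ is the maximum of this quantity over all consistent factorizations, not the value for a single one.

```python
import numpy as np

def normalized_margin(P, Q, M):
    """Smallest signed agreement M_ij * <p_i, q_j> after row-normalizing P and Q."""
    Pn = P / np.linalg.norm(P, axis=1, keepdims=True)  # unit-norm "classifiers"
    Qn = Q / np.linalg.norm(Q, axis=1, keepdims=True)  # unit-norm "instances"
    scores = Pn @ Qn.T              # entries in [-1, 1]
    return float(np.min(M * scores))  # positive iff sign(P Q^T) == M entrywise

# Toy 4x4 example with k = 2 distinct row factors and l = 2 distinct column factors
P = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
Q = np.array([[1.0, 0.5], [1.0, 0.5], [-0.5, 1.0], [-0.5, 1.0]])
M = np.sign(P @ Q.T)                # the binary matrix this factorization realizes
gamma = normalized_margin(P, Q, M)  # margin of this particular factorization
```

Since the rows of $P$ and $Q$ here take only $k = \ell = 2$ distinct values, this is the "ideal case" of the abstract, where predictive side information would allow $D \in O(k + \ell)$ rather than $m + n$.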

