
Online Matrix Completion with Side Information

We give an online algorithm and prove novel mistake and regret bounds for online binary matrix completion with side information. The mistake bounds we prove are of the form $\tilde{O}(D/\gamma^2)$. The term $1/\gamma^2$ is analogous to the usual margin term in SVM (perceptron) bounds. More specifically, if we assume that there is some factorization of the underlying $m \times n$ matrix into $P Q^\intercal$ where the rows of $P$ are interpreted as "classifiers" in $\mathbb{R}^d$ and the rows of $Q$ as "instances" in $\mathbb{R}^d$, then $\gamma$ is the maximum (normalized) margin over all factorizations $P Q^\intercal$ consistent with the observed matrix. The quasi-dimension term $D$ measures the quality of side information. In the presence of vacuous side information, $D = m + n$. However, if the side information is predictive of the underlying factorization of the matrix, then in an ideal case, $D \in O(k + \ell)$ where $k$ is the number of distinct row factors and $\ell$ is the number of distinct column factors. We additionally provide a generalization of our algorithm to the inductive setting. In this setting, we provide an example where the side information is not specified directly in advance. For this example, the quasi-dimension $D$ is now bounded by $O(k^2 + \ell^2)$.
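To make the margin term concrete, below is a minimal sketch of the normalized margin of a single candidate factorization, assuming the standard definition for sign-matrix factorizations: the smallest value of $U_{ij}\,\langle P_i, Q_j\rangle / (\|P_i\|\,\|Q_j\|)$ over observed entries $(i,j)$, with the abstract's $\gamma$ being the maximum of this quantity over all consistent factorizations. The function name and toy data are illustrative, not taken from the paper, and this is not the paper's algorithm itself.

```python
# Sketch: normalized margin of a candidate factorization U ~ sign(P Q^T)
# over the observed entries of a partially revealed ±1 matrix.
# Assumed definition: gamma(P, Q) = min_{(i,j) observed} U_ij <P_i, Q_j> / (||P_i|| ||Q_j||).

import numpy as np


def normalized_margin(U, P, Q, observed):
    """Smallest normalized margin of the factorization P Q^T on observed entries.

    U        : (m, n) array with entries in {-1, +1} (only observed ones are used)
    P        : (m, d) array; row i acts as the "classifier" for row i
    Q        : (n, d) array; row j acts as the "instance" for column j
    observed : iterable of (i, j) index pairs that have been revealed
    """
    margins = []
    for i, j in observed:
        score = P[i] @ Q[j] / (np.linalg.norm(P[i]) * np.linalg.norm(Q[j]))
        margins.append(U[i, j] * score)  # positive iff the factorization agrees with U_ij
    return min(margins)


# Toy example (hypothetical data): a low-rank sign matrix with a few revealed entries.
rng = np.random.default_rng(0)
p, q = rng.normal(size=(4, 2)), rng.normal(size=(5, 2))
U = np.sign(p @ q.T)
observed = [(0, 1), (2, 3), (3, 0)]
print(normalized_margin(U, p, q, observed))  # lies in (0, 1] when P Q^T is consistent with U
```

A larger value of this quantity (hence a larger achievable $\gamma$) makes the $1/\gamma^2$ factor in the $\tilde{O}(D/\gamma^2)$ mistake bound smaller, mirroring the role of the margin in perceptron-style bounds.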
