no code implementations • 3 Jun 2022 • Yi-An Ma, Teodor Vanislavov Marinov, Tong Zhang
This paper considers the generalization performance of differentially private convex learning.
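The abstract does not specify the algorithm, but the standard template in differentially private convex learning is noisy clipped SGD: clip each per-example gradient and add Gaussian noise before stepping. A minimal sketch of that generic template (not necessarily this paper's method; the loss, parameters, and data here are illustrative):

```python
import numpy as np

def noisy_clipped_sgd(grad, w0, data, eta=0.05, clip=1.0, sigma=0.05, seed=0):
    """Generic noisy clipped SGD, the usual building block of private
    convex optimization: clip each per-example gradient to norm `clip`,
    then perturb it with Gaussian noise of scale sigma * clip."""
    rng = np.random.default_rng(seed)
    w = np.array(w0, dtype=float)
    for x, y in data:
        g = grad(w, x, y)
        g = g * min(1.0, clip / (np.linalg.norm(g) + 1e-12))  # clip
        g = g + sigma * clip * rng.normal(size=w.shape)       # privatize
        w = w - eta * g
    return w

# Illustrative use: private-style SGD on a least-squares objective.
def sq_grad(w, x, y):
    return 2.0 * x * (x @ w - y)
```

The noise scale `sigma` would be calibrated to the desired (epsilon, delta) privacy budget; that calibration is omitted here.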
1 code implementation • NeurIPS 2019 • Raman Arora, Teodor Vanislavov Marinov
We revisit two algorithms, matrix stochastic gradient (MSG) and $\ell_2$-regularized MSG (RMSG), that are instances of stochastic gradient descent (SGD) on a convex relaxation to principal component analysis (PCA).
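MSG maintains a PSD iterate in the convex hull of rank-$k$ projection matrices (eigenvalues in $[0,1]$, trace $k$), takes a stochastic gradient step $M + \eta\, x x^\top$, and projects back by shifting and clipping eigenvalues. A minimal sketch under those definitions (an illustration, not the authors' implementation; step size and the bisection-based projection are simplified):

```python
import numpy as np

def project_fantope(M, k):
    """Project a symmetric matrix onto {0 <= eigenvalues <= 1, trace = k}
    by finding (via bisection) a shift s so the clipped eigenvalues sum to k."""
    vals, vecs = np.linalg.eigh(M)
    lo, hi = -vals.max(), 1.0 - vals.min()
    for _ in range(100):
        s = (lo + hi) / 2.0
        if np.clip(vals + s, 0.0, 1.0).sum() > k:
            hi = s
        else:
            lo = s
    new_vals = np.clip(vals + (lo + hi) / 2.0, 0.0, 1.0)
    return (vecs * new_vals) @ vecs.T

def msg(samples, k, eta=0.1):
    """One pass of matrix stochastic gradient for k-PCA: ascend on
    <M, x x^T> for each sample, then project back onto the constraint set."""
    d = samples.shape[1]
    M = np.zeros((d, d))
    for x in samples:
        M = project_fantope(M + eta * np.outer(x, x), k)
    return M
```

RMSG adds an $\ell_2$ regularizer to this update; a rank-$k$ subspace can be read off from the top eigenvectors of the returned iterate.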
no code implementations • NeurIPS 2018 • Md Enayat Ullah, Poorya Mianjy, Teodor Vanislavov Marinov, Raman Arora
We study the statistical and computational aspects of kernel principal component analysis using random Fourier features and show that under mild assumptions, $O(\sqrt{n} \log n)$ features suffice to achieve $O(1/\epsilon^2)$ sample complexity.
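Random Fourier features map inputs to $z(x) = \sqrt{2/D}\,\cos(Wx + b)$ with Gaussian $W$ and uniform phases $b$, so that $z(x)^\top z(y)$ approximates the Gaussian kernel; kernel PCA then reduces to linear PCA on the features. A minimal sketch of this standard construction (the bandwidth and feature count below are illustrative, not values from the paper):

```python
import numpy as np

def rff(X, n_features, sigma=1.0, seed=0):
    """Random Fourier features z(x) = sqrt(2/D) cos(W x + b), whose inner
    products approximate the Gaussian kernel exp(-||x - y||^2 / (2 sigma^2))."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    W = rng.normal(scale=1.0 / sigma, size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)
```

Approximate kernel PCA then amounts to taking the top eigenvectors of the feature covariance `Z.T @ Z / n`, avoiding the full $n \times n$ kernel matrix.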
no code implementations • ICML 2018 • Teodor Vanislavov Marinov, Poorya Mianjy, Raman Arora
We study streaming algorithms for principal component analysis (PCA) in noisy settings.
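The classical baseline for streaming PCA, which work in this line builds on, is Oja's rule: take a stochastic gradient step along $x(x^\top U)$ and re-orthonormalize. A minimal rank-$k$ sketch of that baseline (not the paper's noise-robust algorithm; step size and initialization are illustrative):

```python
import numpy as np

def oja(samples, k, eta=0.01, seed=0):
    """Oja's rule for streaming k-PCA: for each sample x, update
    U += eta * x (x^T U), then re-orthonormalize U via QR."""
    d = samples.shape[1]
    rng = np.random.default_rng(seed)
    U, _ = np.linalg.qr(rng.normal(size=(d, k)))
    for x in samples:
        U += eta * np.outer(x, x @ U)
        U, _ = np.linalg.qr(U)
    return U
```

Each update costs $O(dk)$ plus the $O(dk^2)$ QR step, so the stream is processed in a single pass with memory independent of the stream length.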