no code implementations • 6 Apr 2021 • Christopher Harshaw, Ehsan Kazemi, Moran Feldman, Amin Karbasi
We propose subsampling as a unified algorithmic technique for submodular maximization in centralized and online settings.
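The core idea can be illustrated with a minimal sketch: at each greedy step, evaluate marginal gains only on a random subsample of the remaining elements rather than the full ground set. This is an assumption-laden illustration of the subsampling idea, not the paper's exact algorithm; the function names, the coverage objective, and the `sample_size` parameter are all hypothetical.

```python
import random

def subsampled_greedy(ground_set, f, k, sample_size, seed=0):
    """Greedy for cardinality-constrained submodular maximization that
    evaluates marginal gains only on a random subsample of the remaining
    elements at each step (illustrative sketch, not the paper's exact
    algorithm or sampling schedule)."""
    rng = random.Random(seed)
    selected = []
    remaining = list(ground_set)
    for _ in range(k):
        if not remaining:
            break
        # Only sample_size candidates are evaluated, saving oracle calls.
        sample = rng.sample(remaining, min(sample_size, len(remaining)))
        base = f(selected)
        best = max(sample, key=lambda e: f(selected + [e]) - base)
        if f(selected + [best]) - base > 0:
            selected.append(best)
            remaining.remove(best)
    return selected

# Example objective: set coverage, a monotone submodular function.
sets = {0: {1, 2}, 1: {2, 3}, 2: {4}, 3: {1, 4, 5}}
cover = lambda S: len(set().union(*(sets[i] for i in S)) if S else set())
print(subsampled_greedy(sets.keys(), cover, k=2, sample_size=3))
```

With `sample_size` equal to the ground-set size this reduces to plain greedy; smaller samples trade approximation quality for fewer function evaluations.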
1 code implementation • 29 Sep 2020 • Moran Feldman, Christopher Harshaw, Amin Karbasi
We also present SubmodularGreedy.jl, a Julia package which implements these algorithms and may be downloaded at https://github.com/crharshaw/SubmodularGreedy.jl.

1 code implementation • 8 Nov 2019 • Christopher Harshaw, Fredrik Sävje, Daniel Spielman, Peng Zhang
Asymptotically, the design perfectly balances all linear functions of a growing number of covariates with a diminishing reduction in robustness, effectively allowing experimenters to escape the compromise between balance and robustness in large samples.
Methodology • Data Structures and Algorithms • Statistics Theory
1 code implementation • 19 Apr 2019 • Christopher Harshaw, Moran Feldman, Justin Ward, Amin Karbasi
It is generally believed that submodular functions -- and the more general class of $\gamma$-weakly submodular functions -- may only be optimized under the non-negativity assumption $f(S) \geq 0$.
no code implementations • ICML 2018 • Lin Chen, Christopher Harshaw, Hamed Hassani, Amin Karbasi
We also propose One-Shot Frank-Wolfe, a simpler algorithm which requires only a single stochastic gradient estimate in each round and achieves an $O(T^{2/3})$ stochastic regret bound for convex and continuous submodular optimization.
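The one-gradient-per-round idea can be sketched as a Frank-Wolfe loop over the probability simplex that queries a single noisy gradient each round and smooths it with momentum averaging. This is a hedged illustration under simplifying assumptions (a concave quadratic objective, a fixed averaging schedule); it is not the paper's One-Shot Frank-Wolfe algorithm or its regret analysis.

```python
import numpy as np

def one_shot_fw(grad_oracle, dim, T, seed=0):
    """Frank-Wolfe over the probability simplex using one stochastic
    gradient estimate per round, smoothed by momentum averaging
    (illustrative sketch only; schedule and guarantees differ from
    the paper's algorithm)."""
    rng = np.random.default_rng(seed)
    x = np.full(dim, 1.0 / dim)   # start at the simplex centre
    d = np.zeros(dim)             # running gradient average
    for t in range(1, T + 1):
        rho = 1.0 / t ** (2.0 / 3.0)     # averaging weight
        g = grad_oracle(x, rng)          # the single noisy gradient this round
        d = (1 - rho) * d + rho * g
        # Linear maximization over the simplex returns a vertex.
        v = np.eye(dim)[np.argmax(d)]
        x = x + (v - x) / T              # small Frank-Wolfe step
    return x

# Example: maximize f(x) = -||x - c||^2 over the simplex,
# observing gradients corrupted by Gaussian noise.
c = np.array([0.7, 0.2, 0.1])
oracle = lambda x, rng: -2 * (x - c) + rng.normal(scale=0.1, size=x.size)
x_hat = one_shot_fw(oracle, dim=3, T=500)
```

Because each iterate is a convex combination of simplex points, feasibility is maintained without any projection step, which is the usual appeal of Frank-Wolfe methods.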
no code implementations • 5 Apr 2017 • Moran Feldman, Christopher Harshaw, Amin Karbasi
Sample Greedy achieves a $(k + 3)$-approximation with only $O(nr/k)$ function evaluations.
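The shape of such a sample-then-greedy procedure can be sketched as follows: keep each ground-set element independently with probability p, then run plain greedy on the survivors subject to an independence oracle. This is an illustrative sketch only; the sampling probability that yields the $(k+3)$-approximation is chosen as a function of $k$ in the paper, and the names here are hypothetical.

```python
import random

def sample_greedy(ground_set, f, is_independent, p, seed=0):
    """Subsample the ground set element-wise with probability p, then run
    plain greedy on the sample subject to an independence oracle
    (illustrative sketch of the sample-then-greedy pattern)."""
    rng = random.Random(seed)
    sampled = [e for e in ground_set if rng.random() < p]
    solution = []
    candidates = set(sampled)
    while candidates:
        base = f(solution)
        best = max(candidates, key=lambda e: f(solution + [e]) - base)
        candidates.discard(best)
        # Add the best remaining element if it keeps the solution
        # independent and strictly improves the objective.
        if is_independent(solution + [best]) and f(solution + [best]) > base:
            solution.append(best)
    return solution
```

Greedy only ever evaluates gains on the sampled elements, which is where the savings in function evaluations come from; with p = 1 the procedure reduces to plain greedy.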