no code implementations • ICML 2020 • Cong Han Lim, Raquel Urtasun, Ersin Yumer
We show that, under certain conditions on the algorithm parameters, LayerCert provably reduces the number and size of the convex programs that one needs to solve compared to GeoCert.
1 code implementation • 12 Dec 2019 • Ching-pei Lee, Cong Han Lim, Stephen J. Wright
When applied to the distributed dual ERM problem, unlike state-of-the-art methods that use only the block-diagonal part of the Hessian, our approach utilizes global curvature information and is thus orders of magnitude faster.
no code implementations • NeurIPS 2018 • Cong Han Lim
We study a generalization of the classic isotonic regression problem where we allow separable nonconvex objective functions, focusing on the case of estimators used in robust regression.
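The classic (convex) isotonic regression problem that this work generalizes asks for the least-squares nondecreasing fit to a sequence, solvable in linear time by the pool adjacent violators algorithm. A minimal sketch of that classic case (the function name `isotonic_l2` is illustrative, not from the paper):

```python
def isotonic_l2(y):
    """L2 isotonic regression (nondecreasing fit) via pool adjacent violators.

    Maintains a stack of blocks as (sum, count) pairs; adjacent blocks whose
    means violate monotonicity are merged into a single block with the
    average value, which is the L2-optimal pooling.
    """
    stack = []  # each entry: (block sum, block size)
    for v in y:
        total, w = float(v), 1
        # Merge backwards while the previous block's mean exceeds this one's.
        while stack and stack[-1][0] / stack[-1][1] > total / w:
            pt, pw = stack.pop()
            total += pt
            w += pw
        stack.append((total, w))
    fit = []
    for total, w in stack:
        fit.extend([total / w] * w)
    return fit
```

For example, `isotonic_l2([1, 3, 2])` pools the violating pair (3, 2) into their mean, returning `[1.0, 2.5, 2.5]`. The paper's setting replaces the squared loss with separable nonconvex losses (as in robust regression), where this simple pooling argument no longer applies directly.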
1 code implementation • 4 Mar 2018 • Ching-pei Lee, Cong Han Lim, Stephen J. Wright
Initial computational results on convex problems demonstrate that our method significantly improves communication cost and running time over current state-of-the-art methods.
no code implementations • NeurIPS 2017 • Cong Han Lim, Stephen Wright
We study the norms obtained from extending the k-support norm and OWL norms to the setting in which there are overlapping groups.
no code implementations • NeurIPS 2014 • Cong Han Lim, Stephen Wright
Using a recent construction of Goemans (2010), we show that when optimizing over the convex hull of the permutation vectors (the permutahedron), we can reduce the number of variables and constraints to $\Theta(n \log n)$ in theory and $\Theta(n \log^2 n)$ in practice.
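For intuition about the object being optimized over: the permutahedron of order $n$ is the convex hull of the $n!$ permutations of $(1, \dots, n)$, and by Rado's theorem a point lies in it iff its coordinates sum to $n(n+1)/2$ and its $k$ largest entries sum to at most $n + (n-1) + \cdots + (n-k+1)$ for every $k$. A minimal membership check based on that classical characterization (separate from the paper's extended formulation via sorting networks):

```python
def in_permutahedron(x, n, tol=1e-9):
    """Test membership of x in the permutahedron of (1, ..., n).

    Rado's theorem: x is a convex combination of permutations of (1, ..., n)
    iff sum(x) == n(n+1)/2 and, for each k, the k largest entries of x sum
    to at most the k largest values n + (n-1) + ... + (n-k+1).
    """
    if abs(sum(x) - n * (n + 1) / 2) > tol:
        return False
    s = sorted(x, reverse=True)
    partial = 0.0
    for k in range(1, n):
        partial += s[k - 1]
        if partial > sum(range(n, n - k, -1)) + tol:
            return False
    return True
```

For instance, with $n = 3$ the centroid $(2, 2, 2)$ passes, while $(3, 3, 0)$ fails (its two largest entries sum to 6 > 3 + 2 = 5). This check uses exponentially fewer inequalities than listing vertices, but the paper's point is stronger: the Goemans construction gives an extended formulation of near-linear size suitable for use inside optimization solvers.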