no code implementations • 30 Aug 2022 • Zikai Xiong, Robert M. Freund
The Frank-Wolfe method has become increasingly useful in statistical and machine learning applications, owing to the structure-inducing properties of its iterates, especially in settings where linear minimization over the feasible set is computationally cheaper than projection.
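To illustrate why a linear-minimization oracle can replace projection, here is a minimal Frank-Wolfe sketch over the ℓ1 ball, where the oracle simply returns a signed coordinate vertex. The least-squares problem, step-size schedule, and dimensions are illustrative, not taken from the paper:

```python
import numpy as np

def frank_wolfe_l1(grad, x0, radius, steps=200):
    """Minimal Frank-Wolfe sketch over the l1 ball of a given radius.

    The linear minimization oracle for the l1 ball picks a signed
    coordinate vertex, so no projection is ever needed, and every
    iterate is a sparse convex combination of vertices.
    """
    x = x0.astype(float).copy()
    for t in range(steps):
        g = grad(x)
        # LMO: argmin_{||s||_1 <= radius} <g, s> is attained at a vertex.
        i = int(np.argmax(np.abs(g)))
        s = np.zeros_like(x)
        s[i] = -radius * np.sign(g[i])
        gamma = 2.0 / (t + 2)            # classic open-loop step size
        x = (1 - gamma) * x + gamma * s  # stays inside the l1 ball
    return x

# Toy problem: minimize ||A x - b||^2 subject to ||x||_1 <= 1.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([0.7, -0.3, 0.0, 0.0, 0.0])
x_hat = frank_wolfe_l1(lambda x: 2 * A.T @ (A @ x - b), np.zeros(5), radius=1.0)
```

Because the starting point and every oracle answer lie in the ℓ1 ball, each iterate is automatically feasible without any projection step.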
1 code implementation • ICML 2020 • Geoffrey Négiar, Gideon Dresdner, Alicia Tsai, Laurent El Ghaoui, Francesco Locatello, Robert M. Freund, Fabian Pedregosa
We propose a novel Stochastic Frank-Wolfe (a.k.a. conditional gradient) algorithm.
no code implementations • 20 Oct 2018 • Robert M. Freund, Paul Grigas, Rahul Mazumder
When the training data is non-separable, we show that the degree of non-separability naturally enters the analysis and informs the properties and convergence guarantees of two standard first-order methods: steepest descent (for any given norm) and stochastic gradient descent.
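For reference, the stochastic gradient descent method analyzed in this setting takes single-sample steps on a margin-based loss. The following is a generic sketch of SGD on the logistic loss with labels in {−1, +1}; the step size and data are illustrative and not part of the paper's analysis:

```python
import numpy as np

def sgd_logistic(X, y, lr=0.1, epochs=50, seed=0):
    """Plain SGD sketch for unregularized logistic regression.

    y must be +/-1; each step uses one sample's gradient of
    log(1 + exp(-y_i * w . x_i)).
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        for i in rng.permutation(n):
            margin = y[i] * (X[i] @ w)
            # gradient of the logistic loss at sample i
            w += lr * y[i] * X[i] / (1.0 + np.exp(margin))
    return w
```

On separable data the margins grow without bound, while on non-separable data the degree of overlap governs how small the loss (and gradient norm) can get, which is the quantity the analysis tracks.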
no code implementations • 6 Nov 2015 • Robert M. Freund, Paul Grigas, Rahul Mazumder
Motivated principally by the low-rank matrix completion problem, we present an extension of the Frank-Wolfe method that is designed to induce near-optimal solutions on low-dimensional faces of the feasible region.
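The relevance to matrix completion comes from the linear minimization oracle for the nuclear-norm ball: it returns a rank-one matrix, so each Frank-Wolfe step increases the iterate's rank by at most one. A hedged sketch of that oracle (a dense SVD is used here purely for clarity; practical implementations would compute only the top singular pair):

```python
import numpy as np

def fw_nuclear_lmo(G, radius):
    """LMO sketch for the nuclear-norm ball of a given radius.

    The minimizer of <G, S> over ||S||_* <= radius is the rank-one
    matrix -radius * u1 v1^T built from the top singular pair of G,
    which is why Frank-Wolfe iterates stay low-rank.
    """
    U, s, Vt = np.linalg.svd(G)
    return -radius * np.outer(U[:, 0], Vt[0, :])
```

The optimal inner product is exactly −radius times the largest singular value of the gradient, since the dual norm of the nuclear norm is the spectral norm.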
no code implementations • 16 May 2015 • Robert M. Freund, Paul Grigas, Rahul Mazumder
Furthermore, we show that these new algorithms for the Lasso may also be interpreted as the same master algorithm (subgradient descent), applied to a regularized version of the maximum absolute correlation loss function.
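The algorithm family being reinterpreted includes incremental forward stagewise regression, which repeatedly nudges the coefficient most correlated with the current residual. A minimal sketch of that update (standardized columns, toy step size and iteration count assumed for illustration):

```python
import numpy as np

def forward_stagewise(X, y, eps=0.01, iters=2000):
    """Sketch of incremental forward stagewise regression (FS_eps).

    Each iteration finds the feature most correlated with the residual
    and moves its coefficient by a tiny amount eps in the correlation's
    direction -- the update the text interprets as a subgradient step
    on the maximum absolute correlation loss.
    """
    n, p = X.shape
    w = np.zeros(p)
    for _ in range(iters):
        c = X.T @ (y - X @ w)          # correlations of residual with features
        j = int(np.argmax(np.abs(c)))  # most-correlated feature
        w[j] += eps * np.sign(c[j])    # tiny coordinate step
    return w
```

The subgradient-descent reading follows because the index achieving the maximum absolute correlation yields a subgradient of the nonsmooth max function.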
no code implementations • 4 Jul 2013 • Robert M. Freund, Paul Grigas, Rahul Mazumder
Boosting methods are popular and effective supervised learning methods that combine weak learners into a single accurate model with strong statistical performance.
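As a concrete instance of combining weak learners, here is a compact AdaBoost sketch using decision stumps; it is a textbook illustration of the boosting template, not the specific method studied in the paper:

```python
import numpy as np

def adaboost_stumps(X, y, rounds=20):
    """Minimal AdaBoost sketch: combine threshold 'stumps' (weak
    learners) into a weighted ensemble. Labels y must be +/-1.
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)            # sample weights
    ensemble = []
    for _ in range(rounds):
        best = None
        # exhaustive search over feature/threshold/polarity stumps
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.where(X[:, j] >= thr, 1, -1)
                    err = w @ (pred != y)
                    if best is None or err < best[0]:
                        best = (err, j, thr, pol)
        err, j, thr, pol = best
        err = max(err, 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)   # weak learner's vote
        pred = pol * np.where(X[:, j] >= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)          # upweight mistakes
        w /= w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict(ensemble, X):
    score = sum(a * p * np.where(X[:, j] >= t, 1, -1)
                for a, j, t, p in ensemble)
    return np.sign(score)
```

Each round reweights the training samples so the next weak learner focuses on previous mistakes; the final prediction is a weighted vote of all stumps.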