no code implementations • NeurIPS 2012 • Rodolphe Jenatton, Nicolas L. Roux, Antoine Bordes, Guillaume R. Obozinski
While there is a large body of work focused on modeling these data, few have considered modeling these multiple types of relationships jointly.
no code implementations • NeurIPS 2012 • Nicolas L. Roux, Mark Schmidt, Francis R. Bach
We propose a new stochastic gradient method for optimizing the sum of a finite set of smooth functions, where the sum is strongly convex.
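The method described (a stochastic gradient scheme for a strongly convex finite sum) can be sketched roughly as follows: keep a memory of the last gradient computed for each component function and step along their running average. This is a minimal illustrative sketch, not the paper's exact algorithm; the least-squares problem, step size, and iteration count are all assumptions chosen for the demo.

```python
import numpy as np

# Hypothetical finite-sum problem: minimize (1/n) * sum_i (a_i^T x - b_i)^2
rng = np.random.default_rng(0)
n, d = 50, 5
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true

def grad_i(x, i):
    # Gradient of the i-th smooth component f_i(x) = (a_i^T x - b_i)^2
    return 2.0 * (A[i] @ x - b[i]) * A[i]

def averaged_stochastic_gradient(steps=5000, lr=0.01):
    x = np.zeros(d)
    g_table = np.zeros((n, d))   # last gradient seen for each component f_i
    g_sum = np.zeros(d)          # running sum of the stored gradients
    for _ in range(steps):
        i = rng.integers(n)              # sample one component at random
        g_new = grad_i(x, i)
        g_sum += g_new - g_table[i]      # update the sum incrementally, O(d)
        g_table[i] = g_new
        x -= lr * g_sum / n              # step along the average stored gradient
    return x

x_hat = averaged_stochastic_gradient()
```

Unlike plain SGD, each step costs one new gradient evaluation but uses information from all components via the stored table, which is what allows a linear convergence rate on strongly convex sums.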
no code implementations • NeurIPS 2011 • Mark Schmidt, Nicolas L. Roux, Francis R. Bach
We consider the problem of optimizing the sum of a smooth convex function and a non-smooth convex function using proximal-gradient methods, where an error is present in the calculation of the gradient of the smooth term or in the proximity operator with respect to the second term.
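The exact (error-free) version of the proximal-gradient iteration studied here alternates a gradient step on the smooth term with the proximity operator of the non-smooth term. Below is a minimal sketch for a lasso-style objective, where the proximity operator of the l1 norm is soft-thresholding; the problem data, regularization weight, and iteration count are assumptions for illustration, and the gradient is computed exactly rather than with the errors the paper analyzes.

```python
import numpy as np

# Hypothetical composite problem: min_x 0.5*||Ax - b||^2 + lam*||x||_1
rng = np.random.default_rng(1)
n, d = 40, 10
A = rng.normal(size=(n, d))
b = A @ np.array([3.0, -2.0] + [0.0] * (d - 2))  # sparse ground truth
lam = 0.5

def soft_threshold(z, t):
    # Proximity operator of t*||.||_1
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of x -> A^T(Ax - b)
x = np.zeros(d)
for _ in range(500):
    grad = A.T @ (A @ x - b)           # gradient of the smooth term (exact here)
    x = soft_threshold(x - grad / L, lam / L)  # proximal step on the l1 term
```

The paper's contribution is to quantify how errors in `grad` or in the proximity step degrade the convergence of exactly this kind of iteration.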
no code implementations • NeurIPS 2007 • Nicolas L. Roux, Pierre-Antoine Manzagol, Yoshua Bengio
Guided by the goal of obtaining an optimization algorithm that is both fast and yields good generalization, we study the descent direction maximizing the decrease in generalization error or the probability of not increasing generalization error.
no code implementations • NeurIPS 2007 • Nicolas L. Roux, Yoshua Bengio, Pascal Lamblin, Marc Joliveau, Balázs Kégl
We study the following question: is the two-dimensional structure of images a very strong prior or is it something that can be learned with a few examples of natural images?