Search Results for author: Laurent Massoulié

Found 23 papers, 4 papers with code

Asynchronous SGD on Graphs: a Unified Framework for Asynchronous Decentralized and Federated Optimization

no code implementations1 Nov 2023 Mathieu Even, Anastasia Koloskova, Laurent Massoulié

Decentralized and asynchronous communications are two popular techniques to speed up distributed machine learning by reducing communication overhead: they respectively remove the dependency on a central orchestrator and the need for synchronization.
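A minimal sketch of how these two ingredients can combine on a toy decentralized problem (the setup, topology, and step sizes below are illustrative assumptions, not the paper's algorithm): each node holds a local quadratic, wakes up at a random time, takes a local gradient step, and averages with one random neighbour, with no central orchestrator and no synchronization barrier.

```python
import numpy as np

rng = np.random.default_rng(5)

# n nodes on a ring, each holding a local quadratic f_i(x) = 0.5 * (x - b_i)^2;
# the average objective is minimized at b.mean().
n = 8
b = rng.normal(size=n)   # local minimizers
x = np.zeros(n)          # each node's current iterate

for k in range(4000):
    i = int(rng.integers(n))              # the node that wakes up
    step = 0.5 / np.sqrt(k + 1)           # decaying step size
    x[i] -= step * (x[i] - b[i])          # local gradient step on f_i
    j = (i + rng.choice([-1, 1])) % n     # a random ring neighbour
    x[i] = x[j] = 0.5 * (x[i] + x[j])     # pairwise (gossip) averaging
```

The gossip averaging keeps the nodes near consensus while the decaying local steps drive that consensus toward the minimizer of the average objective.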

Generalization Error of First-Order Methods for Statistical Learning with Generic Oracles

no code implementations10 Jul 2023 Kevin Scaman, Mathieu Even, Laurent Massoulié

In this paper, we provide a novel framework for the analysis of generalization error of first-order optimization algorithms for statistical learning when the gradient can only be accessed through partial observations given by an oracle.

Quantization · Transfer Learning

Statistical limits of correlation detection in trees

no code implementations27 Sep 2022 Luca Ganassali, Laurent Massoulié, Guilhem Semerjian

In this paper we address the problem of testing whether two observed trees $(t, t')$ are sampled either independently or from a joint distribution under which they are correlated.

Muffliato: Peer-to-Peer Privacy Amplification for Decentralized Optimization and Averaging

1 code implementation10 Jun 2022 Edwige Cyffers, Mathieu Even, Aurélien Bellet, Laurent Massoulié

In this work, we introduce pairwise network differential privacy, a relaxation of LDP that captures the fact that the privacy leakage from a node $u$ to a node $v$ may depend on their relative position in the graph.

Graph Matching

Correlation detection in trees for planted graph alignment

1 code implementation15 Jul 2021 Luca Ganassali, Laurent Massoulié, Marc Lelarge

We then conjecture that graph alignment is not feasible in polynomial time when the associated tree detection problem is impossible.

A Continuized View on Nesterov Acceleration for Stochastic Gradient Descent and Randomized Gossip

1 code implementation10 Jun 2021 Mathieu Even, Raphaël Berthier, Francis Bach, Nicolas Flammarion, Pierre Gaillard, Hadrien Hendrikx, Laurent Massoulié, Adrien Taylor

We introduce the continuized Nesterov acceleration, a close variant of Nesterov acceleration whose variables are indexed by a continuous time parameter.
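A minimal sketch of the continuized viewpoint on a toy strongly convex quadratic (the parameter choices below are illustrative assumptions, not the paper's exact tuning): between the arrival times of a unit-rate Poisson process the two variables mix through a linear ODE (solved exactly here), and gradient steps occur only at the random arrival times.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy objective f(x) = 0.5 * x^T A x, with mu = 1 and L = 10.
A = np.diag([1.0, 10.0])
mu, L = 1.0, 10.0
r = np.sqrt(mu / L)   # continuous mixing rate between the two variables

x = np.array([1.0, 1.0])
z = x.copy()
for _ in range(300):
    dt = rng.exponential(1.0)   # waiting time until the next Poisson arrival
    # Exact solution of the linear mixing ODE between arrivals:
    # x and z relax toward their common mean at rate 2r.
    m, d = 0.5 * (x + z), 0.5 * (x - z) * np.exp(-2 * r * dt)
    x, z = m + d, m - d
    # Gradient steps taken at the arrival time, both evaluated at x.
    g = A @ x
    z = z - g / np.sqrt(mu * L)
    x = x - g / L
```

Because the inter-arrival dynamics are linear, the trajectory can be simulated exactly at the jump times, which is one appeal of the continuized formulation.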

Impossibility of Partial Recovery in the Graph Alignment Problem

no code implementations4 Feb 2021 Luca Ganassali, Laurent Massoulié, Marc Lelarge

Random graph alignment refers to recovering the underlying vertex correspondence between two random graphs with correlated edges.

Concentration of Non-Isotropic Random Tensors with Applications to Learning and Empirical Risk Minimization

no code implementations4 Feb 2021 Mathieu Even, Laurent Massoulié

Dimension is an inherent bottleneck to some modern learning tasks, where optimization methods suffer from the size of the data.

Distributed Optimization

Partial Recovery in the Graph Alignment Problem

no code implementations1 Jul 2020 Georgina Hall, Laurent Massoulié

Our focus here is on partial recovery, i.e., we look for a one-to-one mapping which is correct on a fraction of the nodes of the graph rather than on all of them, and we assume that the two input graphs to the problem are correlated Erdős–Rényi graphs with parameters $(n, q, s)$.
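One common way to sample a pair of correlated Erdős–Rényi graphs with parameters $(n, q, s)$ (conventions vary across papers; this sketch is an assumption, not necessarily the exact model used here): draw a parent graph $G(n, q)$, then keep each parent edge in each of the two children independently with probability $s$.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(6)

n, q, s = 50, 0.2, 0.8
pairs = list(combinations(range(n), 2))      # all vertex pairs

parent = rng.random(len(pairs)) < q          # parent graph G(n, q)
g1 = parent & (rng.random(len(pairs)) < s)   # child 1 keeps each edge w.p. s
g2 = parent & (rng.random(len(pairs)) < s)   # child 2, independently
# Each child is marginally G(n, q*s); edges are positively correlated across
# the pair, since an edge can appear in both children only via the parent.
```

The alignment problem then asks to recover the (here trivial, identity) vertex correspondence after the second child's vertices are relabeled by a hidden permutation.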

Asynchronous Accelerated Proximal Stochastic Gradient for Strongly Convex Distributed Finite Sums

no code implementations28 Jan 2019 Hadrien Hendrikx, Francis Bach, Laurent Massoulié

In this work, we study the problem of minimizing the sum of strongly convex functions split over a network of $n$ nodes.

Optimization and Control · Distributed, Parallel, and Cluster Computing

Accelerated Decentralized Optimization with Local Updates for Smooth and Strongly Convex Objectives

no code implementations5 Oct 2018 Hadrien Hendrikx, Francis Bach, Laurent Massoulié

Applying $ESDACD$ to quadratic local functions leads to an accelerated randomized gossip algorithm of rate $O( \sqrt{\theta_{\rm gossip}/n})$ where $\theta_{\rm gossip}$ is the rate of the standard randomized gossip.
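For reference, the standard randomized gossip baseline behind the rate $\theta_{\rm gossip}$ can be sketched as follows (toy ring topology, an assumed setup): at each step a uniformly random edge is activated and its two endpoints replace their values by the pairwise mean, so all values converge to the global average.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ring of n nodes, each holding a scalar; the goal is consensus on the mean.
n = 10
x = rng.normal(size=n)
target = x.mean()
edges = [(i, (i + 1) % n) for i in range(n)]

for _ in range(3000):
    i, j = edges[int(rng.integers(len(edges)))]  # pick a random edge
    x[i] = x[j] = 0.5 * (x[i] + x[j])            # average its two endpoints
```

Each pairwise averaging preserves the global mean, which is why the consensus value equals the initial average.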

Efficient inference in stochastic block models with vertex labels

no code implementations20 Jun 2018 Clara Stegehuis, Laurent Massoulié

Whenever this function has multiple fixed points, the belief propagation algorithm may not perform optimally.

Stochastic Block Model

Optimal Algorithms for Non-Smooth Distributed Optimization in Networks

no code implementations NeurIPS 2018 Kevin Scaman, Francis Bach, Sébastien Bubeck, Yin Tat Lee, Laurent Massoulié

Under the global regularity assumption, we provide a simple yet efficient algorithm called distributed randomized smoothing (DRS) based on a local smoothing of the objective function, and show that DRS is within a $d^{1/4}$ multiplicative factor of the optimal convergence rate, where $d$ is the underlying dimension.
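The local smoothing idea can be illustrated in a single-machine toy example (an assumed stand-in, not the DRS algorithm itself): Gaussian smoothing $f_\gamma(x) = \mathbb{E}[f(x + \gamma Z)]$ with $Z \sim \mathcal{N}(0, I)$ is differentiable even when $f$ is not, and a subgradient of $f$ at a randomly perturbed point is an unbiased stochastic gradient of $f_\gamma$.

```python
import numpy as np

rng = np.random.default_rng(2)

# Non-smooth toy objective f(x) = ||x||_1, minimized at the origin.
f = lambda x: np.abs(x).sum()

gamma = 0.1                    # smoothing radius
x = np.array([2.0, -3.0])
iterates = []
for k in range(1000):
    z = rng.normal(size=x.shape)
    g = np.sign(x + gamma * z)          # subgradient at the perturbed point
    x = x - 0.5 / np.sqrt(k + 1) * g    # decaying step size
    iterates.append(x)

x_avg = np.mean(iterates[500:], axis=0)  # averaged late iterates
```

Averaging the late iterates damps the oscillation that subgradient steps exhibit near the non-smooth minimizer.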

Optimization and Control

Optimal algorithms for smooth and strongly convex distributed optimization in networks

1 code implementation ICML 2017 Kevin Scaman, Francis Bach, Sébastien Bubeck, Yin Tat Lee, Laurent Massoulié

For centralized (i.e., master/slave) algorithms, we show that distributing Nesterov's accelerated gradient descent is optimal and achieves a precision $\varepsilon > 0$ in time $O(\sqrt{\kappa_g}(1+\Delta\tau)\ln(1/\varepsilon))$, where $\kappa_g$ is the condition number of the (global) function to optimize, $\Delta$ is the diameter of the network, and $\tau$ is the time needed to communicate values between two neighbors.
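As a point of comparison, the single-machine Nesterov acceleration being distributed here can be sketched on a toy strongly convex quadratic (illustrative parameters, assumed example):

```python
import numpy as np

# Toy objective f(x) = 0.5 * x^T A x with mu = 1 and L = 100.
A = np.diag([1.0, 100.0])
mu, L = 1.0, 100.0
kappa = L / mu
beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)  # momentum coefficient

x = np.array([1.0, 1.0])
y = x.copy()
for _ in range(200):
    x_new = y - (A @ y) / L          # gradient step at the extrapolated point
    y = x_new + beta * (x_new - x)   # Nesterov extrapolation
    x = x_new
```

The $\sqrt{\kappa_g}$ dependence in the stated time bound is exactly the accelerated rate of this iteration, with the $(1+\Delta\tau)$ factor accounting for network communication.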

Distributed Optimization · regression

Non-Backtracking Spectrum of Degree-Corrected Stochastic Block Models

no code implementations8 Sep 2016 Lennart Gulikers, Marc Lelarge, Laurent Massoulié

As a result, a clustering positively correlated with the true communities can be obtained from the second eigenvector of $B$ in the regime where $\mu_2^2 > \rho$. In a previous work we showed that detection is impossible when $\mu_2^2 < \rho$, meaning that a phase transition occurs in the sparse regime of the Degree-Corrected Stochastic Block Model.
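To make the object concrete, here is a hypothetical toy construction of the non-backtracking matrix $B$, checked on the complete graph $K_4$: for a $d$-regular graph the spectral radius of $B$ is known to equal $d - 1$, so $K_4$ (3-regular) gives $2$.

```python
import numpy as np
from itertools import combinations

# Non-backtracking matrix of a graph: rows and columns are directed edges,
# and B[(u, v), (p, w)] = 1 exactly when p == v and w != u, i.e. the walk
# continues from v without immediately returning to u.
def non_backtracking(edges):
    directed = [(u, v) for u, v in edges] + [(v, u) for u, v in edges]
    idx = {e: i for i, e in enumerate(directed)}
    B = np.zeros((len(directed), len(directed)))
    for (u, v) in directed:
        for (p, w) in directed:
            if p == v and w != u:
                B[idx[(u, v)], idx[(p, w)]] = 1.0
    return B

edges = list(combinations(range(4), 2))   # K4: 6 edges, 12 directed edges
B = non_backtracking(edges)
rho = max(abs(np.linalg.eigvals(B)))      # spectral radius, expected d - 1 = 2
```

Each row $(u, v)$ has exactly $\deg(v) - 1$ ones, which is a quick structural sanity check on the construction.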

Clustering · Community Detection +1

An Impossibility Result for Reconstruction in a Degree-Corrected Planted-Partition Model

no code implementations2 Nov 2015 Lennart Gulikers, Marc Lelarge, Laurent Massoulié

We consider the Degree-Corrected Stochastic Block Model (DC-SBM): a random graph on $n$ nodes, having i.i.d.

Stochastic Block Model

Clustering and Inference From Pairwise Comparisons

no code implementations16 Feb 2015 Rui Wu, Jiaming Xu, R. Srikant, Laurent Massoulié, Marc Lelarge, Bruce Hajek

We propose an efficient algorithm that accurately estimates the individual preferences for almost all users, if there are $r \max \{m, n\}\log m \log^2 n$ pairwise comparisons per type, which is near optimal in sample complexity when $r$ only grows logarithmically with $m$ or $n$.

Clustering

Reconstruction in the Labeled Stochastic Block Model

no code implementations11 Feb 2015 Marc Lelarge, Laurent Massoulié, Jiaming Xu

The labeled stochastic block model is a random graph model representing networks with community structure and interactions of multiple types.

Stochastic Block Model · Two-sample testing

The Price of Privacy in Untrusted Recommendation Engines

no code implementations13 Jul 2012 Siddhartha Banerjee, Nidhi Hegde, Laurent Massoulié

In the information-rich regime, where each user rates at least a constant fraction of items, a spectral clustering approach is shown to achieve a sample-complexity lower bound derived from a simple information-theoretic argument based on Fano's inequality.
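A dense toy illustration of the spectral clustering mechanism (hypothetical setup; the paper's regime is sparse and its bound information-theoretic): two user classes with opposite item preferences can be recovered from the sign of the top left singular vector of the noisy rating matrix.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two hypothetical user classes rate all items with opposite +-1 preferences,
# observed through additive Gaussian noise.
n_users, n_items = 40, 30
labels = np.array([0] * 20 + [1] * 20)
signs = np.where(labels == 0, 1.0, -1.0)
R = signs[:, None] * np.ones((1, n_items)) + 0.3 * rng.normal(size=(n_users, n_items))

# Spectral step: the top left singular vector aligns with the class pattern,
# so its sign recovers the two classes (up to a global flip).
U, s, Vt = np.linalg.svd(R, full_matrices=False)
guess = (U[:, 0] > 0).astype(int)
accuracy = max(np.mean(guess == labels), np.mean(guess != labels))
```

Taking the maximum over the labeling and its flip accounts for the sign ambiguity inherent to singular vectors.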

Clustering · Recommendation Systems

Distributed User Profiling via Spectral Methods

no code implementations15 Sep 2011 Dan-Cristian Tomozei, Laurent Massoulié

We prove that without prior knowledge of the compositions of the classes, based solely on a few random observed ratings (namely $O(N\log N)$ such ratings for $N$ users), we can predict user preference with high probability for unrated items by running a local vote among users with similar profile vectors.
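The local-vote mechanism can be sketched on a hypothetical two-class toy instance (all parameters below are illustrative assumptions, not the paper's model): a held-out rating is predicted by a majority vote among the users whose observed rating profiles are most similar.

```python
import numpy as np

rng = np.random.default_rng(4)

# Two user classes with opposite +-1 rating profiles, plus a few corruptions.
n_users, n_items = 30, 20
a = np.where(np.arange(n_items) % 2 == 0, 1.0, -1.0)             # class-A profile
ratings = np.vstack([np.tile(a, (15, 1)), np.tile(-a, (15, 1))])
flips = rng.random(ratings.shape) < 0.05                         # corrupt ~5%
ratings[flips] *= -1

target_user, target_item = 0, 0
observed = np.delete(np.arange(n_items), target_item)  # hide the target entry

# Similarity measured on the observed items only.
X = ratings[:, observed]
sims = X @ X[target_user]
sims[target_user] = -np.inf          # exclude the user themself
neighbours = np.argsort(sims)[-7:]   # 7 most similar users

# Local vote: the neighbours' majority rating on the held-out item.
prediction = np.sign(ratings[neighbours, target_item].sum())
```

With an odd number of $\pm 1$ voters the vote is never tied, and the majority is robust to the small fraction of corrupted ratings.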
