no code implementations • 22 Feb 2024 • Imad Aouali, Victor-Emmanuel Brunel, David Rohde, Anna Korba
In this framework, we propose sDM, a generic Bayesian approach designed for OPE and OPL, grounded in both algorithmic and theoretical foundations.
no code implementations • 25 May 2023 • Imad Aouali, Victor-Emmanuel Brunel, David Rohde, Anna Korba
In particular, it also holds for standard IPS without assuming that the importance weights are bounded.
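For context, the standard inverse propensity scoring (IPS) estimator weights each logged reward by the ratio of target-policy to logging-policy action probabilities. A minimal sketch, with illustrative data (the function name and numbers are not from the paper):

```python
import numpy as np

def ips_estimate(rewards, logging_probs, target_probs):
    """Standard IPS estimate of a target policy's value from logged data.

    Interaction i contributes r_i * w_i, where the importance weight is
    w_i = pi_target(a_i | x_i) / pi_logging(a_i | x_i).
    """
    weights = target_probs / logging_probs
    return np.mean(rewards * weights)

# Illustrative logged data: observed rewards, and the probability of each
# logged action under the logging policy and under the target policy.
rewards = np.array([1.0, 0.0, 1.0, 1.0])
logging_probs = np.array([0.5, 0.25, 0.5, 0.25])
target_probs = np.array([0.9, 0.1, 0.9, 0.1])

print(ips_estimate(rewards, logging_probs, target_probs))
```

When the logging policy assigns small probability to actions the target policy favors, the weights `target_probs / logging_probs` can be very large, which is why boundedness assumptions on them are common.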
no code implementations • 19 Oct 2020 • Nicolas Schreuder, Victor-Emmanuel Brunel, Arnak Dalalyan
In this paper, we introduce a convenient framework for studying (adversarial) generative models from a statistical perspective.
2 code implementations • ICLR 2021 • Mike Gartrell, Insu Han, Elvis Dohmatob, Jennifer Gillenwater, Victor-Emmanuel Brunel
Determinantal point processes (DPPs) have attracted significant attention in machine learning for their ability to model subsets drawn from a large item collection.
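In the L-ensemble formulation, the probability a DPP selects a subset S is proportional to the determinant of the corresponding principal submatrix of a kernel L. A small self-contained sketch with a toy kernel (not from any of these papers):

```python
import numpy as np
from itertools import combinations

def dpp_probability(L, subset):
    """Probability that an L-ensemble DPP draws exactly `subset`:
    P(S) = det(L_S) / det(L + I), where L_S is the principal submatrix
    of L indexed by S."""
    n = L.shape[0]
    sub = L[np.ix_(subset, subset)]
    return np.linalg.det(sub) / np.linalg.det(L + np.eye(n))

# Toy symmetric PSD kernel over 3 items; items 0 and 1 are highly similar,
# so the DPP rarely selects them together (the "repulsive" behavior).
L = np.array([[1.0, 0.9, 0.0],
              [0.9, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

total = sum(dpp_probability(L, list(s))
            for k in range(4) for s in combinations(range(3), k))
print(round(total, 6))  # probabilities over all 2^3 subsets sum to 1
```

Note that `P({0, 1})` is proportional to `1 - 0.9**2 = 0.19`, far less than the product of the singleton scores, which is exactly the diversity-promoting effect described above.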
no code implementations • 19 Feb 2020 • Victor-Emmanuel Brunel, Marco Avella-Medina
We derive concentration inequalities for differentially private median and mean estimators building on the "Propose, Test, Release" (PTR) mechanism introduced by Dwork and Lei (2009).
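The PTR pattern can be summarized as: propose a candidate statistic, privately test whether it is stable on the given dataset, and release it only if the noisy test passes. The sketch below is schematic, with an illustrative stability test; it is not the exact estimator analyzed in the paper:

```python
import numpy as np

def ptr_median(data, epsilon, delta, tolerance, rng):
    """Schematic Propose-Test-Release (PTR) for a private median.

    Illustrative sketch: release the sample median only if a noisy test
    indicates it is stable, i.e. many records would have to change before
    the median moves by more than `tolerance`.
    """
    data = np.sort(data)
    m = len(data) // 2
    median = data[m]
    # Distance to instability: roughly, how many points must change before
    # the median can leave [median - tolerance, median + tolerance].
    lo = np.searchsorted(data, median - tolerance)
    hi = np.searchsorted(data, median + tolerance, side="right")
    dist = min(m - lo, hi - 1 - m)
    # Noisy test; fail closed (release nothing) if the data look unstable.
    if dist + rng.laplace(scale=1.0 / epsilon) <= np.log(1.0 / delta) / epsilon:
        return None
    return median

rng = np.random.default_rng(0)
print(ptr_median(np.full(1001, 5.0), 1.0, 0.05, 0.5, rng))
```

The "fail closed" branch is what makes PTR data-dependent: on well-concentrated data the test almost always passes and the exact median is released, while on unstable data the mechanism abstains rather than leak information.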
no code implementations • 27 Jun 2019 • Marco Avella-Medina, Victor-Emmanuel Brunel
We tackle the problem of estimating a location parameter with differential privacy guarantees and sub-Gaussian deviations.
1 code implementation • NeurIPS 2019 • Mike Gartrell, Victor-Emmanuel Brunel, Elvis Dohmatob, Syrine Krichene
Our method imposes a particular decomposition of the nonsymmetric kernel that enables such tractable learning algorithms, which we analyze both theoretically and experimentally.
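One way to build a tractable nonsymmetric kernel is to combine a low-rank symmetric PSD part with a low-rank skew-symmetric part; the specific parameterization below (`V`, `B`, `C`) is an illustrative guess at that structure, not necessarily the paper's exact decomposition:

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 5, 2

# Low-rank nonsymmetric kernel: symmetric PSD part V V^T plus a
# skew-symmetric part B C^T - C B^T (illustrative parameterization).
V = rng.normal(size=(n, d))
B = rng.normal(size=(n, d))
C = rng.normal(size=(n, d))
L = V @ V.T + (B @ C.T - C @ B.T)

# Subset scores are still principal minors of L; the skew-symmetric part
# lets the model encode attraction between items, not only repulsion.
S = [0, 2, 3]
score = np.linalg.det(L[np.ix_(S, S)])
print(score)
```

Because the symmetric part of `L` is PSD, every principal minor is nonnegative, so these scores remain valid unnormalized probabilities.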
no code implementations • 15 Mar 2019 • Victor-Emmanuel Brunel, Arnak S. Dalalyan, Nicolas Schreuder
M-estimators are ubiquitous in machine learning and statistical learning theory.
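An M-estimator minimizes a sum of losses over the parameter. As a concrete illustration (not tied to this paper's analysis), here is a location M-estimator with the Huber loss, solved by iteratively reweighted averaging:

```python
import numpy as np

def huber_location(x, delta=1.0, iters=100):
    """M-estimator of location: argmin_theta sum_i rho(x_i - theta) with
    the Huber loss rho, computed by iteratively reweighted averaging."""
    theta = np.median(x)
    for _ in range(iters):
        r = x - theta
        abs_r = np.maximum(np.abs(r), 1e-12)  # guard against division by 0
        # Huber weights psi(r)/r: 1 inside [-delta, delta], delta/|r| outside.
        w = np.where(abs_r <= delta, 1.0, delta / abs_r)
        theta = np.sum(w * x) / np.sum(w)
    return theta

data = np.array([0.1, -0.2, 0.05, 0.3, -0.1, 50.0])  # one gross outlier
print(huber_location(data))
```

Unlike the sample mean (about 8.36 here), the Huber estimate stays near the bulk of the data, which is the robustness property that makes such M-estimators attractive in theory and practice.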
no code implementations • NeurIPS 2018 • Victor-Emmanuel Brunel
Symmetric determinantal point processes (DPPs) are a class of probabilistic models that encode random selections of items exhibiting repulsive behavior.
no code implementations • 26 Feb 2018 • Jason Altschuler, Victor-Emmanuel Brunel, Alan Malek
Specifically, we propose a variant of the Best Arm Identification problem for \emph{contaminated bandits}, where each arm pull has probability $\varepsilon$ of generating a sample from an arbitrary contamination distribution instead of the true underlying distribution.
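The contaminated pull model above can be sketched directly; the reward and contamination distributions below are illustrative stand-ins, and the trimmed-mean step is just one standard robust estimate, not necessarily the paper's algorithm:

```python
import numpy as np

def pull_contaminated_arm(true_mean, epsilon, rng):
    """One pull of a contaminated arm: with probability epsilon the sample
    comes from an arbitrary contamination distribution instead of the arm's
    true reward distribution (both distributions here are illustrative)."""
    if rng.random() < epsilon:
        return rng.uniform(-10, 10)   # arbitrary contamination
    return rng.normal(loc=true_mean)  # true underlying reward

rng = np.random.default_rng(42)
samples = np.array([pull_contaminated_arm(0.5, 0.1, rng) for _ in range(5000)])

# A trimmed mean discards the most extreme samples on each side, limiting
# the damage an eps-fraction of contaminated pulls can do; the plain
# empirical mean enjoys no such guarantee.
trimmed = np.mean(np.sort(samples)[500:-500])
print(round(trimmed, 2))
```

With `epsilon = 0.1`, trimming 10% of the samples from each tail removes the contaminated pulls in the worst case, so the trimmed estimate tracks the arm's true mean of 0.5.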
no code implementations • ICML 2017 • John Urschel, Victor-Emmanuel Brunel, Ankur Moitra, Philippe Rigollet
Determinantal Point Processes (DPPs) are a family of probabilistic models that exhibit repulsive behavior and lend themselves naturally to many machine learning tasks where returning a diverse set of objects is important.