1 code implementation • 12 Mar 2024 • Miguel Fuentes, Brett Mullins, Ryan McKenna, Gerome Miklau, Daniel Sheldon
This technique allows for public data to be included in a graphical-model-based mechanism.
no code implementations • NeurIPS 2023 • Anastasia Koloskova, Ryan McKenna, Zachary Charles, Keith Rush, Brendan McMahan
We propose a simplified setting that distills key facets of these methods and isolates the impact of linearly correlated noise.
1 code implementation • NeurIPS 2021 • Ryan McKenna, Siddhant Pradhan, Daniel Sheldon, Gerome Miklau
Private-PGM is a recent approach that uses graphical models to represent the data distribution; its complexity is proportional to that of exact marginal inference in a graphical model whose structure is determined by the co-occurrence of variables in the noisy measurements.
1 code implementation • 29 May 2019 • Satya Kuppam, Ryan McKenna, David Pujol, Michael Hay, Ashwin Machanavajjhala, Gerome Miklau
Data collected about individuals is regularly used to make decisions that impact those same individuals.
4 code implementations • 26 Jan 2019 • Ryan McKenna, Daniel Sheldon, Gerome Miklau
Many privacy mechanisms reveal high-level information about a data distribution through noisy measurements.
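The "noisy measurements" in question are typically marginals (histograms over subsets of attributes) perturbed with calibrated noise. As a minimal sketch of that idea, and not the paper's actual mechanism, the following releases a one-way marginal under epsilon-differential privacy with the Laplace mechanism; the function names and the toy data are hypothetical:

```python
import math
import random
from collections import Counter

def laplace_noise(scale, rng):
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_marginal(records, attribute, domain, epsilon, rng):
    """Release a one-way marginal (a histogram over `attribute`).

    Each individual contributes one record, so the histogram has L1
    sensitivity 1 and Laplace noise with scale 1/epsilon gives
    epsilon-differential privacy.
    """
    counts = Counter(r[attribute] for r in records)
    return {v: counts.get(v, 0) + laplace_noise(1.0 / epsilon, rng)
            for v in domain}

rng = random.Random(0)
data = [{"age_bracket": b} for b in ["18-25", "26-40", "26-40", "41-65"]]
hist = noisy_marginal(data, "age_bracket",
                      ["18-25", "26-40", "41-65"], epsilon=1.0, rng=rng)
```

Mechanisms like Private-PGM then fit a graphical model to collections of such noisy marginals rather than treating each one in isolation.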
no code implementations • ICML 2017 • Garrett Bernstein, Ryan McKenna, Tao Sun, Daniel Sheldon, Michael Hay, Gerome Miklau
A naive learning algorithm that uses the noisy sufficient statistics “as is” outperforms general-purpose differentially private learning algorithms.
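To illustrate what "using noisy sufficient statistics as is" means, here is a deliberately simplified sketch: instead of the discrete undirected graphical models studied in the paper, it uses a one-parameter Bernoulli model, the simplest exponential family, where the sufficient statistic is the count of ones. The statistic is perturbed once with Laplace noise and then plugged into the standard maximum-likelihood estimator unchanged; all names here are hypothetical:

```python
import math
import random

def laplace_noise(scale, rng):
    """Draw one sample from Laplace(0, scale) via inverse-CDF sampling."""
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_bernoulli_mle(bits, epsilon, rng):
    """Naive private MLE for a Bernoulli parameter.

    Perturb the sufficient statistic (the count of ones, which has
    sensitivity 1) with Laplace(1/epsilon) noise, then use the usual
    estimator count / n as if the statistic were exact.  Clamping keeps
    the result a valid probability.
    """
    noisy_count = sum(bits) + laplace_noise(1.0 / epsilon, rng)
    return min(1.0, max(0.0, noisy_count / len(bits)))
```

The only privacy cost is paid when the statistic is released; every downstream computation is post-processing and consumes no additional budget.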
no code implementations • 14 Jun 2017 • Garrett Bernstein, Ryan McKenna, Tao Sun, Daniel Sheldon, Michael Hay, Gerome Miklau
We investigate the problem of learning discrete, undirected graphical models in a differentially private way.