Search Results for author: Ziv Goldfeld

Found 18 papers, 5 papers with code

Information-Theoretic Generalization Bounds for Deep Neural Networks

no code implementations • 4 Apr 2024 • Haiyun He, Christina Lee Yu, Ziv Goldfeld

This enables refining our generalization bounds to capture the contraction as a function of the network architecture parameters.

Generalization Bounds
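
For orientation only (this is background, not the bound derived in the paper): the classical input-output mutual information bound of Xu and Raginsky states that, for $n$ i.i.d. training samples $S$, a learning algorithm with output $W$, and a $\sigma$-sub-Gaussian loss,

$$\big|\mathbb{E}[\mathrm{gen}(W,S)]\big| \;\le\; \sqrt{\frac{2\sigma^2}{n}\, I(S;W)}.$$

Bounds of this flavor are the generic starting point that architecture-dependent refinements, such as the contraction mentioned in the excerpt above, sharpen.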

Quantum Neural Estimation of Entropies

no code implementations • 3 Jul 2023 • Ziv Goldfeld, Dhrumil Patel, Sreejith Sreekumar, Mark M. Wilde

Entropy measures quantify the amount of information and correlation present in a quantum system.
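
As a point of reference for the kind of quantity being estimated, here is a minimal numerical sketch (a classical eigenvalue computation, not the paper's quantum neural estimator) of the von Neumann entropy $S(\rho) = -\mathrm{Tr}(\rho \log \rho)$ of a density matrix:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log rho), in bits, computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)      # rho is Hermitian and positive semidefinite
    evals = evals[evals > 1e-12]         # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

# A maximally mixed qubit has entropy 1 bit; a pure state has entropy 0.
print(von_neumann_entropy(np.eye(2) / 2))                             # ~1.0
print(von_neumann_entropy(np.array([[1.0, 0.0], [0.0, 0.0]])))        # ~0.0
```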

Quantum Pufferfish Privacy: A Flexible Privacy Framework for Quantum Systems

no code implementations • 22 Jun 2023 • Theshani Nuradha, Ziv Goldfeld, Mark M. Wilde

We propose a versatile privacy framework for quantum systems, termed quantum pufferfish privacy (QPP).

Fairness

Robust Estimation under the Wasserstein Distance

1 code implementation • 2 Feb 2023 • Sloan Nietert, Rachel Cummings, Ziv Goldfeld

We study the problem of robust distribution estimation under the Wasserstein metric, a popular discrepancy measure between probability distributions rooted in optimal transport (OT) theory.
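
To make the estimation problem concrete, a minimal one-dimensional sketch (illustrative only, not the paper's robust estimator): the empirical $W_1$ distance between equal-size samples reduces to the mean absolute difference of their order statistics, and a small fraction of contaminated points placed far away inflates it arbitrarily, which is what motivates robust procedures.

```python
import numpy as np

def w1_empirical_1d(u, v):
    """Empirical 1-D Wasserstein-1 distance between equal-size samples
    (the optimal coupling matches order statistics)."""
    return np.mean(np.abs(np.sort(u) - np.sort(v)))

rng = np.random.default_rng(0)
clean = rng.standard_normal(1000)
other = rng.standard_normal(1000)
contaminated = other.copy()
contaminated[:10] = 1e3                       # 1% of points pushed far away

print(w1_empirical_1d(clean, other))          # small: both samples ~ N(0, 1)
print(w1_empirical_1d(clean, contaminated))   # blows up: W1 is not outlier-robust
```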

Data-Driven Optimization of Directed Information over Discrete Alphabets

1 code implementation • 2 Jan 2023 • Dor Tsur, Ziv Aharoni, Ziv Goldfeld, Haim Permuter

Directed information (DI) is a fundamental measure for the study and analysis of sequential stochastic models.
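
For a small finite-alphabet example (a toy plug-in calculation, not the paper's data-driven optimization method), DI can be computed exactly from a joint pmf via the chain rule $I(X^n \to Y^n) = \sum_{i=1}^{n} I(X^i; Y_i \mid Y^{i-1})$:

```python
import numpy as np
from itertools import product

def entropy(p):
    p = np.atleast_1d(np.asarray(p)).ravel()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def marginal(pjoint, keep):
    drop = tuple(i for i in range(pjoint.ndim) if i not in keep)
    return pjoint.sum(axis=drop)

def cond_mi(pjoint, A, B, C):
    """I(A;B|C) = H(A,C) + H(B,C) - H(A,B,C) - H(C); A, B, C are tuples of axes."""
    H = lambda axes: entropy(marginal(pjoint, tuple(sorted(axes))))
    return H(A + C) + H(B + C) - H(A + B + C) - H(C)

def directed_information(pjoint, n):
    """I(X^n -> Y^n) in bits; pjoint axes are ordered (X_1..X_n, Y_1..Y_n)."""
    X, Y = tuple(range(n)), tuple(range(n, 2 * n))
    return sum(cond_mi(pjoint, X[:i + 1], (Y[i],), Y[:i]) for i in range(n))

# Sanity check: i.i.d. uniform inputs through a memoryless BSC(0.1) with n = 2,
# so I(X^2 -> Y^2) should equal 2 * (1 - h_b(0.1)) ≈ 1.062 bits.
eps, n = 0.1, 2
bsc = np.array([[1 - eps, eps], [eps, 1 - eps]])
p = np.zeros((2,) * (2 * n))
for x1, x2, y1, y2 in product(range(2), repeat=4):
    p[x1, x2, y1, y2] = 0.25 * bsc[x1, y1] * bsc[x2, y2]
print(directed_information(p, n))
```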

Statistical, Robustness, and Computational Guarantees for Sliced Wasserstein Distances

1 code implementation • 17 Oct 2022 • Sloan Nietert, Ritwik Sadhu, Ziv Goldfeld, Kengo Kato

The goal of this work is to quantify this scalability from three key aspects: (i) empirical convergence rates; (ii) robustness to data contamination; and (iii) efficient computational methods.

Numerical Integration
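
As a quick illustration of the object being analyzed (a basic Monte Carlo sketch, not the paper's methods): the sliced $W_1$ distance averages one-dimensional $W_1$ distances between random projections of the two samples, each of which is cheap to compute from order statistics.

```python
import numpy as np

def w1_1d(u, v):
    """Empirical 1-D W1 between equal-size samples via the sorted (quantile) coupling."""
    return np.mean(np.abs(np.sort(u) - np.sort(v)))

def sliced_w1(X, Y, n_proj=500, seed=0):
    """Monte Carlo estimate of sliced W1: average 1-D W1 over random unit directions."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    vals = []
    for _ in range(n_proj):
        theta = rng.standard_normal(d)
        theta /= np.linalg.norm(theta)
        vals.append(w1_1d(X @ theta, Y @ theta))
    return float(np.mean(vals))

rng = np.random.default_rng(1)
X = rng.standard_normal((2000, 20))
Y = rng.standard_normal((2000, 20)) + 1.0   # shifted Gaussian in 20 dimensions
print(sliced_w1(X, Y))
```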

k-Sliced Mutual Information: A Quantitative Study of Scalability with Dimension

no code implementations • 17 Jun 2022 • Ziv Goldfeld, Kristjan Greenewald, Theshani Nuradha, Galen Reeves

However, a quantitative characterization of how SMI itself and estimation rates thereof depend on the ambient dimension, which is crucial to the understanding of scalability, remains obscure.
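
A crude sketch of the $k=1$ case (random one-dimensional projections combined with a simple histogram plug-in MI estimate; the paper's analysis relies on far more careful estimators) can make the definition concrete:

```python
import numpy as np

def mi_hist(u, v, bins=16):
    """Crude plug-in mutual-information estimate (nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(u, v, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))

def sliced_mi(X, Y, n_proj=200, seed=0):
    """Monte Carlo sliced MI: average MI between random 1-D projections of X and Y."""
    rng = np.random.default_rng(seed)
    dx, dy = X.shape[1], Y.shape[1]
    vals = []
    for _ in range(n_proj):
        th = rng.standard_normal(dx); th /= np.linalg.norm(th)
        ph = rng.standard_normal(dy); ph /= np.linalg.norm(ph)
        vals.append(mi_hist(X @ th, Y @ ph))
    return float(np.mean(vals))

rng = np.random.default_rng(2)
X = rng.standard_normal((5000, 5))
Y = X + 0.5 * rng.standard_normal((5000, 5))   # Y is correlated with X
print(sliced_mi(X, Y))
```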

Outlier-Robust Optimal Transport: Duality, Structure, and Statistical Analysis

1 code implementation • 2 Nov 2021 • Sloan Nietert, Rachel Cummings, Ziv Goldfeld

The Wasserstein distance, rooted in optimal transport (OT) theory, is a popular discrepancy measure between probability distributions with various applications to statistics and machine learning.

Neural Estimation of Statistical Divergences

no code implementations • 7 Oct 2021 • Sreejith Sreekumar, Ziv Goldfeld

Statistical divergences (SDs), which quantify the dissimilarity between probability distributions, are a basic constituent of statistical inference and machine learning.

Limit Distribution Theory for the Smooth 1-Wasserstein Distance with Applications

no code implementations • 28 Jul 2021 • Ritwik Sadhu, Ziv Goldfeld, Kengo Kato

This result is then used to derive new empirical convergence rates for classic $W_1$ in terms of the intrinsic dimension.

Two-sample testing

Non-Asymptotic Performance Guarantees for Neural Estimation of $\mathsf{f}$-Divergences

no code implementations • 11 Mar 2021 • Sreejith Sreekumar, Zhengxin Zhang, Ziv Goldfeld

Statistical distances (SDs), which quantify the dissimilarity between probability distributions, are central to machine learning and statistics.
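
As an illustration of what "neural estimation" means here (a minimal sketch of the Donsker-Varadhan lower bound on the KL divergence, one member of the $\mathsf{f}$-divergence family; this is not the specific estimator or the guarantees studied in the paper), assuming PyTorch is available:

```python
import torch
import torch.nn as nn

class Critic(nn.Module):
    """Small MLP critic f_theta used inside the variational bound."""
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, x):
        return self.net(x)

def dv_bound(critic, x_p, x_q):
    """Donsker-Varadhan: D_KL(P||Q) >= E_P[f(X)] - log E_Q[exp f(X)]."""
    m = x_q.shape[0]
    log_mean_exp_q = torch.logsumexp(critic(x_q), dim=0) - torch.log(torch.tensor(float(m)))
    return critic(x_p).mean() - log_mean_exp_q.squeeze()

# Toy problem: P = N(0, I), Q = N(1, I) in 2-D, so D_KL(P||Q) = 1 nat.
dim = 2
P = torch.distributions.MultivariateNormal(torch.zeros(dim), torch.eye(dim))
Q = torch.distributions.MultivariateNormal(torch.ones(dim), torch.eye(dim))
critic = Critic(dim)
opt = torch.optim.Adam(critic.parameters(), lr=1e-3)
for _ in range(2000):
    loss = -dv_bound(critic, P.sample((512,)), Q.sample((512,)))   # maximize the bound
    opt.zero_grad(); loss.backward(); opt.step()
print(dv_bound(critic, P.sample((4096,)), Q.sample((4096,))).item())  # roughly 1 nat
```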

Smooth $p$-Wasserstein Distance: Structure, Empirical Approximation, and Statistical Applications

no code implementations • 11 Jan 2021 • Sloan Nietert, Ziv Goldfeld, Kengo Kato

Discrepancy measures between probability distributions, often termed statistical distances, are ubiquitous in probability theory, statistics and machine learning.

Two-sample testing

The Information Bottleneck Problem and Its Applications in Machine Learning

no code implementations • 30 Apr 2020 • Ziv Goldfeld, Yury Polyanskiy

The information bottleneck (IB) theory recently emerged as a bold information-theoretic paradigm for analyzing deep learning (DL) systems.

BIG-bench Machine Learning • Dimensionality Reduction
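
For readers unfamiliar with the framework, the standard IB formulation (background only, not a result of the survey) seeks a compressed representation $T$ of an input $X$ that remains informative about a label $Y$ by solving

$$\min_{p(t \mid x)} \; I(X;T) - \beta\, I(T;Y),$$

where $\beta > 0$ trades off compression of $X$ against preservation of information about $Y$.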

Capacity of Continuous Channels with Memory via Directed Information Neural Estimator

1 code implementation • 9 Mar 2020 • Ziv Aharoni, Dor Tsur, Ziv Goldfeld, Haim Henry Permuter

When no analytic solution is present or the channel model is unknown, there is no unified framework for calculating or even approximating capacity.

Capacity Estimation
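
For contrast with the continuous, channels-with-memory setting targeted by the paper: when the channel is discrete, memoryless, and has a known transition law, capacity can already be computed numerically by the classical Blahut-Arimoto iteration. A minimal sketch (the standard algorithm, not the paper's neural estimator):

```python
import numpy as np

def blahut_arimoto(W, n_iter=300):
    """Capacity (bits) of a discrete memoryless channel with W[x, y] = P(y | x)."""
    nx = W.shape[0]
    p = np.full(nx, 1.0 / nx)                      # input distribution, start uniform
    for _ in range(n_iter):
        py = p @ W                                 # output distribution p(y)
        q = (p[:, None] * W) / py[None, :]         # posterior q(x | y)
        r = np.exp(np.sum(W * np.log(q + 1e-300), axis=1))
        p = r / r.sum()                            # updated input distribution
    py = p @ W
    q = (p[:, None] * W) / py[None, :]
    C = np.sum(p[:, None] * W * (np.log(q + 1e-300) - np.log(p + 1e-300)[:, None]))
    return C / np.log(2)

# Sanity check: a binary symmetric channel with crossover 0.1 has
# capacity 1 - h_b(0.1) ≈ 0.531 bits.
eps = 0.1
W = np.array([[1 - eps, eps], [eps, 1 - eps]])
print(blahut_arimoto(W))
```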

Estimating Information Flow in Deep Neural Networks

no code implementations • 12 Oct 2018 • Ziv Goldfeld, Ewout van den Berg, Kristjan Greenewald, Igor Melnyk, Nam Nguyen, Brian Kingsbury, Yury Polyanskiy

We then develop a rigorous estimator for $I(X;T)$ in noisy DNNs and observe compression in various models.

Clustering
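
To illustrate why injected noise makes $I(X;T)$ well defined and estimable (a one-unit toy sketch of the general idea, not the estimator developed in the paper): if $T = f(X) + Z$ with Gaussian $Z$ independent of $X$, then $I(X;T) = h(T) - h(Z)$, and $h(T)$ can be approximated by Monte Carlo under the Gaussian-mixture law induced by the samples.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.1                                   # std of the injected Gaussian noise Z
x = rng.standard_normal(2000)                 # inputs X
f = np.tanh(2.0 * x)                          # a toy one-unit "layer" f(X)
t = f + sigma * rng.standard_normal(f.shape)  # noisy layer output T = f(X) + Z

# h(T) via Monte Carlo: T follows the mixture (1/n) sum_i N(f(x_i), sigma^2),
# so evaluate that mixture density at the sampled t's and average -log density.
dens = np.exp(-(t[:, None] - f[None, :]) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
h_T = -np.mean(np.log(dens.mean(axis=1)))
h_Z = 0.5 * np.log(2 * np.pi * np.e * sigma ** 2)   # differential entropy of N(0, sigma^2)

print("I(X;T) ≈", h_T - h_Z, "nats")
```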

Information Storage in the Stochastic Ising Model

no code implementations • 8 May 2018 • Ziv Goldfeld, Guy Bresler, Yury Polyanskiy

We first show that at zero temperature, order of $\sqrt{n}$ bits can be stored in the system indefinitely by coding over stable, striped configurations.

Information Theory • Statistical Mechanics
