Mutual Information Estimation
38 papers with code • 0 benchmarks • 0 datasets
The task of estimating mutual information from samples, especially for high-dimensional variables.
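As a point of reference for the task, a classical plug-in baseline (not from any paper listed below) estimates mutual information by binning the samples into a 2-D histogram. This only works in low dimensions; the methods on this page exist precisely because binning breaks down as dimension grows. A minimal sketch:

```python
import numpy as np

def mi_histogram(x, y, bins=16):
    """Plug-in MI estimate from paired 1-D samples via a 2D histogram.

    Feasible only in low dimensions; high-dimensional variables are the
    regime targeted by the neural/variational estimators listed here.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                # empirical joint probabilities
    px = pxy.sum(axis=1, keepdims=True)      # marginal of x
    py = pxy.sum(axis=0, keepdims=True)      # marginal of y
    nz = pxy > 0                             # skip empty cells to avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
x = rng.normal(size=50_000)
y = x + rng.normal(size=50_000)              # correlated Gaussian pair
# Analytic MI for this pair is 0.5 * ln(2) ≈ 0.347 nats; the estimate lands nearby.
print(mi_histogram(x, y))
```

The estimator is consistent for fixed dimension but its bias grows with the number of cells, which is exponential in dimension, hence the interest in sample-based neural estimators.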
Benchmarks
These leaderboards are used to track progress in Mutual Information Estimation.
Latest papers
DiME: Maximizing Mutual Information by a Difference of Matrix-Based Entropies
We introduce an information-theoretic quantity with similar properties to mutual information that can be estimated from data without making explicit assumptions on the underlying distribution.
Augmentation-Free Graph Contrastive Learning of Invariant-Discriminative Representations
iGCL designs the invariant-discriminative loss (ID loss) to learn invariant and discriminative representations.
Improving Adversarial Robustness via Mutual Information Estimation
To alleviate this negative effect, the paper investigates the dependence between the target model's outputs and input adversarial samples from an information-theoretic perspective, and proposes an adversarial defense method.
Unsupervised Domain Adaptation for Cardiac Segmentation: Towards Structure Mutual Information Maximization
This paper introduces UDA-VAE++, an unsupervised domain adaptation framework for cardiac segmentation with a compact loss function lower bound.
Graph Representation Learning via Aggregation Enhancement
Graph neural networks (GNNs) have become a powerful tool for processing graph-structured data but still face challenges in effectively aggregating and propagating information between layers, which limits their performance.
Density Ratio Estimation via Infinitesimal Classification
We then estimate the instantaneous rate of change of the bridge distributions indexed by time (the "time score") -- a quantity defined analogously to data (Stein) scores -- with a novel time score matching objective.
Featurized Density Ratio Estimation
Density ratio estimation serves as an important technique in the unsupervised machine learning toolbox.
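Density ratio estimation connects to this task because MI is the expected log-ratio between the joint density and the product of marginals. A common baseline (a generic textbook technique, not the method of either paper above) trains a probabilistic classifier to distinguish samples from the two densities; at the optimum, its logit approximates the log density ratio. A minimal sketch with 1-D Gaussians, where the true log-ratio is linear:

```python
import numpy as np

rng = np.random.default_rng(0)
xp = rng.normal(0.0, 1.0, size=5000)     # samples from p = N(0, 1)
xq = rng.normal(1.0, 1.0, size=5000)     # samples from q = N(1, 1)

# Logistic regression: P(sample drawn from p | x) = sigmoid(w*x + b).
# At the optimum the logit w*x + b approximates log p(x)/q(x),
# which here is exactly linear: -x + 0.5.
X = np.concatenate([xp, xq])
t = np.concatenate([np.ones(5000), np.zeros(5000)])  # 1 = from p, 0 = from q
w, b = 0.0, 0.0
for _ in range(2000):                    # full-batch gradient descent
    pred = 1.0 / (1.0 + np.exp(-(w * X + b)))
    w -= 0.5 * np.mean((pred - t) * X)
    b -= 0.5 * np.mean(pred - t)

log_ratio = lambda x: w * x + b          # estimated log p(x)/q(x)
print(log_ratio(0.0), log_ratio(1.5))    # roughly 0.5 and -1.0
```

Featurized and time-indexed variants (as in the two papers above) address the regime where the two densities are too dissimilar for a single direct classifier to work well.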
Tight Mutual Information Estimation With Contrastive Fenchel-Legendre Optimization
Successful applications of InfoNCE and its variants have popularized the use of contrastive variational mutual information (MI) estimators in machine learning.
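For readers unfamiliar with InfoNCE: it lower-bounds MI by scoring each positive pair against in-batch negatives with a critic, and the bound saturates at log(batch size). A minimal numpy sketch with a fixed dot-product critic (real estimators learn the critic; this choice is purely illustrative):

```python
import numpy as np

def infonce_bound(x, y):
    """InfoNCE lower bound on I(X;Y) for a batch of paired samples.

    Critic f(x, y) = x . y. The bound can never exceed log(batch_size),
    the saturation issue that motivates tighter estimators.
    """
    scores = x @ y.T                            # (N, N) critic matrix
    n = scores.shape[0]
    m = scores.max(axis=1, keepdims=True)       # stabilize log-sum-exp
    lse = m[:, 0] + np.log(np.exp(scores - m).sum(axis=1))
    # Average log-softmax of the positive (diagonal) score, plus log N.
    return float(np.mean(np.diag(scores) - lse) + np.log(n))

rng = np.random.default_rng(0)
x = rng.normal(size=(256, 8))
y = x + 0.1 * rng.normal(size=(256, 8))         # strongly dependent pairs
print(infonce_bound(x, y))                      # positive, capped at log(256) ≈ 5.55
```

The log(batch size) cap is the key practical limitation: when the true MI is large, InfoNCE-style estimates are biased downward unless the batch is enormous, which is one motivation for the tighter Fenchel-Legendre formulation above.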
Multimodal Representation Learning via Maximization of Local Mutual Information
We propose and demonstrate a representation learning approach by maximizing the mutual information between local features of images and text.
MIND: Inductive Mutual Information Estimation, A Convex Maximum-Entropy Copula Approach
We propose a novel estimator of the mutual information between two ordinal vectors $x$ and $y$.