Mutual Information Estimation
38 papers with code • 0 benchmarks • 0 datasets
The task is to estimate mutual information from samples, especially for high-dimensional variables.
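As a baseline, mutual information can be estimated by discretizing the samples and applying the plug-in formula. Below is a minimal Python sketch; the bin count and toy data are illustrative, and this naive binning estimator is precisely what degrades in high dimensions, motivating the task:

```python
import numpy as np

def plugin_mi(x, y, bins=16):
    """Naive plug-in MI estimate (in nats) for two 1-D samples via binning."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                # empirical joint p(x, y)
    px = pxy.sum(axis=1, keepdims=True)      # marginal p(x), shape (bins, 1)
    py = pxy.sum(axis=0, keepdims=True)      # marginal p(y), shape (1, bins)
    mask = pxy > 0                           # skip empty cells to avoid log(0)
    return float((pxy[mask] * np.log(pxy[mask] / (px * py)[mask])).sum())

rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
y = x + 0.5 * rng.normal(size=10_000)        # y depends on x, so I(X; Y) > 0
print(plugin_mi(x, y))
```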
Latest papers with no code
Mutual information estimation for graph convolutional neural networks
Mutual information can be used as a measure of the quality of internal representations in deep learning models, and the information plane may provide insights into whether the model exploits the available information in the data.
A Perspective on Neural Capacity Estimation: Viability and Reliability
These estimators are referred to as neural mutual information estimators (NMIEs). NMIEs differ from other approaches in that they are data-driven.
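One widely used NMIE is MINE (Belghazi et al.), which trains a critic network to maximize the Donsker-Varadhan lower bound I(X;Y) >= E_p(x,y)[T] - log E_p(x)p(y)[exp(T)]. The PyTorch sketch below is a minimal illustration of that idea, not the specific estimators the paper evaluates; the network size, training loop, and toy data are assumptions:

```python
import math
import torch
import torch.nn as nn

class Critic(nn.Module):
    """Scalar critic T(x, y) used in the Donsker-Varadhan bound."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(2 * dim, 64), nn.ReLU(),
                                 nn.Linear(64, 1))

    def forward(self, x, y):
        return self.net(torch.cat([x, y], dim=-1)).squeeze(-1)

def dv_bound(critic, x, y):
    """Donsker-Varadhan lower bound: E_p(x,y)[T] - log E_p(x)p(y)[exp(T)]."""
    joint = critic(x, y).mean()
    y_shuffled = y[torch.randperm(y.size(0))]  # break pairing -> product of marginals
    marginal = torch.logsumexp(critic(x, y_shuffled), dim=0) - math.log(y.size(0))
    return joint - marginal

# Toy data: y is a noisy copy of x, so the true MI is positive.
torch.manual_seed(0)
x = torch.randn(512, 1)
y = x + 0.5 * torch.randn(512, 1)

critic = Critic(dim=1)
opt = torch.optim.Adam(critic.parameters(), lr=1e-3)
for _ in range(500):
    opt.zero_grad()
    loss = -dv_bound(critic, x, y)   # maximize the lower bound
    loss.backward()
    opt.step()
print(dv_bound(critic, x, y).item())  # MI estimate in nats
```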
Neural Topic Modeling with Deep Mutual Information Estimation
NTM-DMIE is a neural network method for topic learning which maximizes the mutual information between the input documents and their latent topic representation.
CNN-Aided Factor Graphs with Estimated Mutual Information Features for Seizure Detection
We then use a 1D-CNN to extract additional features from the EEG signals and combine both feature sets to estimate the probability of a seizure event. Finally, learned factor graphs are employed to capture the temporal correlation in the signal.
Compressed Predictive Information Coding
The key insight of our framework is to learn representations by minimizing the compression complexity and maximizing the predictive information in latent space.
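Schematically, the abstract describes an information-bottleneck-style trade-off; one way such an objective is commonly written is sketched below (the notation is an assumption for illustration, not taken from the paper):

```latex
\min_{\theta} \; \beta \, I(X_{\text{past}}; Z) \;-\; I(Z; X_{\text{future}})
```

Here $Z$ is the latent representation, the first term is the compression cost, the second is the predictive information, and $\beta$ controls the trade-off.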
Toward Enhanced Robustness in Unsupervised Graph Representation Learning: A Graph Information Bottleneck Perspective
Our RGIB attempts to learn robust node representations against adversarial perturbations by preserving the original information in the benign graph while eliminating the adversarial information in the adversarial graph.
A Reverse Jensen Inequality Result with Application to Mutual Information Estimation
The Jensen inequality is a widely used tool in a multitude of fields, such as for example information theory and machine learning.
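For reference, Jensen's inequality states that for a convex function $\varphi$ and an integrable random variable $X$,

```latex
\varphi\big(\mathbb{E}[X]\big) \le \mathbb{E}\big[\varphi(X)\big],
```

with the inequality reversed for concave $\varphi$. Broadly speaking, a reverse Jensen result bounds $\mathbb{E}[\varphi(X)]$ from the other side in terms of $\varphi(\mathbb{E}[X])$, which is what makes it useful for bounding mutual information estimators.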
Data-Driven Representations for Testing Independence: Modeling, Analysis and Connection with Mutual Information Estimation
This work addresses testing the independence of two continuous and finite-dimensional random variables from the design of a data-driven partition.
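The connection to mutual information estimation rests on the standard characterization of independence:

```latex
X \perp\!\!\!\perp Y \iff I(X; Y) = 0,
```

so a data-driven partition that supports a consistent independence test can, in principle, also support a mutual information estimate.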
Mutual Information Estimation as a Difference of Entropies for Unsupervised Representation Learning
In this work, we derive a principled non-contrastive method in which mutual information is estimated as a difference of entropies, eliminating the need for negative sampling.
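The decomposition referred to is the standard entropy identity

```latex
I(X; Y) = H(X) - H(X \mid Y) = H(X) + H(Y) - H(X, Y),
```

so mutual information can be obtained from separate entropy estimates instead of contrasting positive pairs against negative samples.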
Learning Bias-Invariant Representation by Cross-Sample Mutual Information Minimization
We propose to remove the bias information misused by the target task with a cross-sample adversarial debiasing (CSAD) method.