Density Estimation
416 papers with code • 14 benchmarks • 14 datasets
The goal of Density Estimation is to accurately model the underlying probability density of an observed data set whose true density is unknown.
Source: Contrastive Predictive Coding Based Feature for Automatic Speaker Verification
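As an illustration of the task (not taken from any of the listed papers), a classical non-parametric approach is kernel density estimation; here scipy's off-the-shelf `gaussian_kde` recovers a bimodal density from samples alone:

```python
import numpy as np
from scipy.stats import gaussian_kde

# Draw samples from an "unknown" distribution (here, a bimodal mixture).
rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(-2.0, 0.5, 500), rng.normal(2.0, 0.5, 500)])

# Fit a kernel density estimator and evaluate the estimated density p(x).
kde = gaussian_kde(data)
xs = np.linspace(-4, 4, 9)
density = kde(xs)

# The estimate is bimodal: higher near the two modes than in between.
print(density)
```

The deep generative models listed below (flows, autoregressive models, ODE-based models) replace the fixed kernel with a learned, far more expressive density model.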
Libraries
Use these libraries to find Density Estimation models and implementations.
Datasets
Most implemented papers
PointConv: Deep Convolutional Networks on 3D Point Clouds
In addition, our experiments converting CIFAR-10 into a point cloud show that networks built on PointConv can match the performance of 2D convolutional networks of similar structure.
Neural Spline Flows
A normalizing flow models a complex probability density as an invertible transformation of a simple base density.
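The change-of-variables idea behind flows can be shown in a minimal sketch (illustrative, not the Neural Spline Flows model itself): with base density `p_X` standard normal and invertible transform `y = exp(x)`, the model density is `p_Y(y) = p_X(log y) * |d(log y)/dy|`, which recovers the log-normal density:

```python
import numpy as np
from scipy.stats import norm, lognorm

def flow_density(y):
    """Density of y = exp(x), x ~ N(0, 1), via the change of variables:
    p_Y(y) = p_X(log y) * |d(log y)/dy| = p_X(log y) / y."""
    x = np.log(y)
    return norm.pdf(x) / y  # base density times the inverse's Jacobian

y = np.array([0.5, 1.0, 2.0])
print(flow_density(y))
print(lognorm.pdf(y, s=1.0))  # the standard log-normal density: identical
```

A trained flow learns the transform (e.g. as monotonic rational-quadratic splines in Neural Spline Flows) instead of fixing it to `exp`, but evaluates the density by exactly this formula.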
PixelCNN++: Improving the PixelCNN with Discretized Logistic Mixture Likelihood and Other Modifications
1) We use a discretized logistic mixture likelihood on the pixels, rather than a 256-way softmax, which we find to speed up training.
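The discretized logistic likelihood can be sketched for a single mixture component (a simplification of the PixelCNN++ mixture; parameter values are arbitrary): the probability of pixel value `k` is the logistic CDF integrated over the bin `[k - 0.5, k + 0.5]`, with the edge bins absorbing the tails:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def discretized_logistic_prob(k, mu, s):
    """P(pixel = k) for k in 0..255 under a logistic(mu, s) density,
    integrated over the bin [k - 0.5, k + 0.5]."""
    upper = sigmoid((k + 0.5 - mu) / s)
    lower = sigmoid((k - 0.5 - mu) / s)
    # Edge bins absorb the tails so the 256 probabilities sum to 1.
    upper = np.where(k == 255, 1.0, upper)
    lower = np.where(k == 0, 0.0, lower)
    return upper - lower

ks = np.arange(256)
probs = discretized_logistic_prob(ks, mu=128.0, s=10.0)
print(probs.sum())  # 1.0: a valid distribution over pixel values
```

Unlike a 256-way softmax, this ties nearby pixel values together through the continuous location and scale parameters, which is what makes training faster.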
FFJORD: Free-form Continuous Dynamics for Scalable Reversible Generative Models
The result is a continuous-time invertible generative model with unbiased density estimation and one-pass sampling, while allowing unrestricted neural network architectures.
PixelSNAIL: An Improved Autoregressive Generative Model
Autoregressive generative models consistently achieve the best results in density estimation tasks involving high dimensional data, such as images or audio.
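The autoregressive factorization these models share is the chain rule, `p(x) = prod_i p(x_i | x_<i)`. A toy sketch with binary variables and a first-order conditional (far simpler than PixelSNAIL's deep conditionals, and with made-up probabilities) shows how exact log-likelihoods are accumulated term by term:

```python
import numpy as np

# Chain rule: log p(x) = sum_i log p(x_i | x_<i).
# Toy model: p(x_i | x_<i) depends only on the previous symbol.
p_first = np.array([0.6, 0.4])             # p(x_1)
p_trans = np.array([[0.7, 0.3],            # p(x_i = cur | x_{i-1} = prev)
                    [0.2, 0.8]])

def log_prob(x):
    lp = np.log(p_first[x[0]])
    for prev, cur in zip(x[:-1], x[1:]):
        lp += np.log(p_trans[prev, cur])
    return lp

print(log_prob([0, 1, 1, 0]))  # log(0.6 * 0.3 * 0.8 * 0.2)
```

Deep autoregressive models replace the lookup tables with neural networks conditioned on all previous variables, but the exact-likelihood computation has the same shape.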
It's Raw! Audio Generation with State-Space Models
SaShiMi yields state-of-the-art performance for unconditional waveform generation in the autoregressive setting.
Representation Learning: A Review and New Perspectives
The success of machine learning algorithms generally depends on data representation, and we hypothesize that this is because different representations can entangle and hide more or less the different explanatory factors of variation behind the data.
The Concrete Distribution: A Continuous Relaxation of Discrete Random Variables
The essence of the trick is to refactor each stochastic node into a differentiable function of its parameters and a random variable with fixed distribution.
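The trick can be sketched for the Concrete (Gumbel-Softmax) distribution itself: the sample is a deterministic, differentiable function of the logits and fixed Gumbel noise, so gradients flow through the parameters (temperature and logits below are arbitrary illustration values):

```python
import numpy as np

rng = np.random.default_rng(0)

def concrete_sample(logits, temperature):
    """Reparameterized Concrete sample: a differentiable function of the
    logits, given Gumbel noise drawn from a fixed distribution."""
    gumbel = -np.log(-np.log(rng.uniform(size=logits.shape)))
    z = (logits + gumbel) / temperature
    z = z - z.max()            # numerical stability
    e = np.exp(z)
    return e / e.sum()         # softmax: a point on the probability simplex

logits = np.log(np.array([0.7, 0.2, 0.1]))
sample = concrete_sample(logits, temperature=0.5)
print(sample, sample.sum())    # sums to 1; low temperature -> near one-hot
```

As the temperature anneals toward zero, samples approach one-hot vectors, recovering the discrete distribution in the limit.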
Neural Autoregressive Flows
Normalizing flows and autoregressive models have been successfully combined to produce state-of-the-art results in density estimation, via Masked Autoregressive Flows (MAF), and to accelerate state-of-the-art WaveNet-based speech synthesis to 20x faster than real-time, via Inverse Autoregressive Flows (IAF).
Invertible Residual Networks
We show that standard ResNet architectures can be made invertible, allowing the same model to be used for classification, density estimation, and generation.
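The core idea can be sketched numerically (a toy stand-in for the paper's spectrally normalized networks, with a hand-picked small weight matrix): a residual block `y = x + f(x)` is invertible whenever `f` is a contraction, and its inverse is computed by the fixed-point iteration `x <- y - f(x)`:

```python
import numpy as np

# Small weights keep f contractive (spectral norm of W is about 0.45,
# and tanh is 1-Lipschitz, so Lip(f) < 1).
W = np.array([[0.3, -0.2],
              [0.1,  0.4]])

def f(x):
    return np.tanh(W @ x)

def forward(x):
    return x + f(x)            # invertible residual block

def inverse(y, n_iters=50):
    x = y.copy()
    for _ in range(n_iters):
        x = y - f(x)           # Banach fixed-point iteration
    return x

x = np.array([1.0, -0.5])
print(inverse(forward(x)))     # recovers x
```

The full model additionally needs the log-determinant of the block's Jacobian for density estimation, which the paper estimates with an unbiased power-series trace estimator.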