Variational Inference
748 papers with code • 1 benchmark • 5 datasets
Fitting approximate posteriors with variational inference transforms the inference problem into an optimization problem, where the goal is (typically) to maximize the evidence lower bound (ELBO) on the log marginal likelihood of the data.
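As a minimal sketch of this idea (an assumed toy setup, not taken from any paper below): if the target posterior is the Gaussian N(2, 1) and the variational family is q = N(mu, sigma^2), the ELBO has closed-form gradients, and plain gradient ascent drives q to match the target.

```python
import math

# Toy ELBO maximization: target posterior p = N(target_mu, target_sd^2),
# variational family q = N(mu, sd^2) with sd parameterized as exp(log_sd).
target_mu, target_sd = 2.0, 1.0
mu, log_sd = 0.0, math.log(0.3)  # initial variational parameters
lr = 0.05

for _ in range(2000):
    sd = math.exp(log_sd)
    # d ELBO / d mu = -(mu - m) / s^2
    grad_mu = -(mu - target_mu) / target_sd**2
    # d ELBO / d log_sd = 1 - sd^2 / s^2  (chain rule through sd = e^{log_sd})
    grad_log_sd = 1.0 - sd**2 / target_sd**2
    mu += lr * grad_mu
    log_sd += lr * grad_log_sd

# mu converges to 2.0 and exp(log_sd) converges to 1.0,
# i.e. q matches the target posterior exactly (both are Gaussian).
```

Because both p and q are Gaussian here, the ELBO is tight at the optimum; in realistic models the gradients are instead estimated by Monte Carlo (e.g. the reparameterization trick).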
Libraries
Use these libraries to find Variational Inference models and implementations.

Most implemented papers
Unsupervised Data Imputation via Variational Inference of Deep Subspaces
In this work, we introduce a general probabilistic model that describes sparse high dimensional imaging data as being generated by a deep non-linear embedding.
Sliced Score Matching: A Scalable Approach to Density and Score Estimation
However, it has been so far limited to simple, shallow models or low-dimensional data, due to the difficulty of computing the Hessian of log-density functions.
Uncertainty Estimations by Softplus normalization in Bayesian Convolutional Neural Networks with Variational Inference
On multiple datasets in supervised learning settings (MNIST, CIFAR-10, CIFAR-100), this variational inference method achieves performance equivalent to frequentist inference in identical architectures, while the two desiderata, a measure of uncertainty and regularization, are incorporated naturally.
A Probabilistic Formulation of Unsupervised Text Style Transfer
Across all style transfer tasks, our approach yields substantial gains over state-of-the-art non-generative baselines, including the state-of-the-art unsupervised machine translation techniques that our approach generalizes.
Pathfinder: Parallel quasi-Newton variational inference
Pathfinder returns draws from the approximation with the lowest estimated Kullback-Leibler (KL) divergence to the true posterior.
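As a simplified illustration of this selection step (an assumed toy setting, not Pathfinder's actual KL estimator, which works from ELBO evaluations along the optimization path): when both the candidate approximations and the target are Gaussian, the KL divergence is available in closed form and the lowest-KL candidate can be picked directly.

```python
import numpy as np

# Closed-form KL(N(m1, s1^2) || N(m2, s2^2)) between two univariate Gaussians.
def kl_gauss(m1, s1, m2, s2):
    return np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

# Hypothetical target posterior and candidate approximations (mean, sd).
target_mean, target_sd = 0.0, 1.0
candidates = [(0.5, 1.2), (0.1, 0.9), (2.0, 0.5)]

kls = [kl_gauss(m, s, target_mean, target_sd) for m, s in candidates]
best = candidates[int(np.argmin(kls))]
# best -> (0.1, 0.9), the candidate closest in KL to the target
```

In Pathfinder itself the candidates come from a quasi-Newton optimization trajectory and the true posterior is not Gaussian, so the KL is estimated rather than computed exactly.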
Scalable Recommendation with Poisson Factorization
This is an efficient algorithm that iterates over the observed entries and adjusts an approximate posterior over the user/item representations.
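A hypothetical sketch of the sparse-iteration idea described above (not the paper's exact coordinate-ascent variational algorithm): fit nonnegative user/item factors to count data by sweeping only the observed entries, here with simple per-entry gradient ascent on the Poisson log-likelihood.

```python
import numpy as np

# Simulate sparse count data from a Poisson factorization model.
rng = np.random.default_rng(0)
n_users, n_items, k = 20, 15, 3
counts = rng.poisson(
    rng.gamma(2.0, 0.5, (n_users, k)) @ rng.gamma(2.0, 0.5, (n_items, k)).T
)
obs = [(u, i, counts[u, i]) for u in range(n_users)
       for i in range(n_items) if counts[u, i] > 0]

def loglik(theta, beta):
    # Poisson log-likelihood over observed entries (dropping the log y! constant).
    return sum(y * np.log(theta[u] @ beta[i]) - theta[u] @ beta[i]
               for u, i, y in obs)

theta = rng.gamma(1.0, 1.0, (n_users, k))
beta = rng.gamma(1.0, 1.0, (n_items, k))
ll_start = loglik(theta, beta)

lr = 0.01
for _ in range(50):
    for u, i, y in obs:          # iterate over observed entries only
        rate = theta[u] @ beta[i]
        g = y / rate - 1.0       # gradient of the Poisson term w.r.t. the rate
        theta[u] = np.maximum(theta[u] + lr * g * beta[i], 1e-6)
        beta[i] = np.maximum(beta[i] + lr * g * theta[u], 1e-6)

ll_end = loglik(theta, beta)     # improves over ll_start as factors adapt
```

The paper's actual method maintains a full approximate posterior (Gamma variational factors with multinomial auxiliary variables) rather than point estimates, but shares the key property that each sweep touches only the nonzero entries.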
Automatic Differentiation Variational Inference
Probabilistic modeling is iterative.
Deep Variational Bayes Filters: Unsupervised Learning of State Space Models from Raw Data
We introduce Deep Variational Bayes Filters (DVBF), a new method for unsupervised learning and identification of latent Markovian state space models.
DropMax: Adaptive Variational Softmax
Moreover, the learning of dropout rates for non-target classes on each instance allows the classifier to focus more on classification against the most confusing classes.
Probabilistic Recurrent State-Space Models
State-space models (SSMs) are a highly expressive model class for learning patterns in time series data and for system identification.