Bayesian Inference
623 papers with code • 1 benchmark • 7 datasets
Bayesian Inference is a methodology that employs Bayes' rule to estimate parameters and their full posterior distribution.
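As a minimal, self-contained illustration (the prior, data, and numbers below are made up for the example), applying Bayes' rule to a coin-flip model with a conjugate Beta prior gives the full posterior in closed form:

```python
import numpy as np
from scipy import stats

# Hypothetical data: 10 coin flips, 7 heads (made-up numbers for illustration).
flips = np.array([1, 1, 0, 1, 1, 0, 1, 1, 0, 1])
heads, tails = flips.sum(), len(flips) - flips.sum()

# Beta(2, 2) prior over the head probability theta (an arbitrary choice here).
alpha_prior, beta_prior = 2.0, 2.0

# Bayes' rule with a conjugate prior: the posterior is again a Beta distribution,
# with the observed counts simply added to the prior hyperparameters.
alpha_post = alpha_prior + heads
beta_post = beta_prior + tails
posterior = stats.beta(alpha_post, beta_post)

print("posterior mean:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```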
Libraries
Use these libraries to find Bayesian Inference models and implementations.
Most implemented papers
Inferring COVID-19 spreading rates and potential change points for case number forecasts
As COVID-19 is rapidly spreading across the globe, short-term modeling forecasts provide time-critical information for decisions on containment and mitigation strategies.
Deep Neural Networks as Gaussian Processes
It was known that infinitely wide deep networks correspond to Gaussian processes, but previous work had not identified that the resulting kernels can be used as covariance functions for GPs, allowing fully Bayesian prediction with a deep neural network.
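A rough sketch of the kind of computation this enables, using an ordinary RBF kernel as a stand-in for the NNGP kernel the paper derives, and made-up 1-D data: Bayesian prediction with a GP reduces to linear algebra on the kernel matrix.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    """Stand-in covariance function; the paper derives an NNGP kernel instead."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

# Made-up 1-D training data for illustration.
x_train = np.array([-2.0, -1.0, 0.0, 1.5])
y_train = np.sin(x_train)
x_test = np.linspace(-3, 3, 5)
noise = 1e-2

# Exact GP posterior: condition the joint Gaussian on the observed targets.
K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
K_star = rbf_kernel(x_test, x_train)
K_ss = rbf_kernel(x_test, x_test)

K_inv_y = np.linalg.solve(K, y_train)
mean = K_star @ K_inv_y                              # posterior predictive mean
cov = K_ss - K_star @ np.linalg.solve(K, K_star.T)   # posterior predictive covariance

print(mean)
print(np.sqrt(np.diag(cov)))                         # per-point predictive std. dev.
```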
MultiNest: an efficient and robust Bayesian inference tool for cosmology and particle physics
We present further development and the first public release of our multimodal nested sampling algorithm, called MultiNest.
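A heavily simplified sketch of plain nested sampling on a made-up 1-D toy problem (MultiNest's actual contribution is efficient multimodal, ellipsoid-based sampling of the likelihood-constrained prior, which is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up toy problem: uniform prior on [-5, 5], Gaussian likelihood centred at 1
# with width 0.5, so the true log-evidence is roughly log(1/10) ≈ -2.3.
def log_likelihood(theta):
    return -0.5 * ((theta - 1.0) / 0.5) ** 2 - np.log(0.5 * np.sqrt(2.0 * np.pi))

def sample_prior(n):
    return rng.uniform(-5.0, 5.0, size=n)

n_live, n_iter = 100, 800
live = sample_prior(n_live)
live_logL = log_likelihood(live)

log_Z = -np.inf   # running log-evidence
log_X = 0.0       # log prior volume enclosed by the current likelihood contour
for i in range(n_iter):
    worst = np.argmin(live_logL)
    log_X_new = -(i + 1) / n_live                       # expected shrinkage per iteration
    log_w = np.log(np.exp(log_X) - np.exp(log_X_new))   # width of this prior-volume shell
    log_Z = np.logaddexp(log_Z, live_logL[worst] + log_w)
    log_X = log_X_new

    # Replace the discarded point with a prior draw of higher likelihood.
    # Naive rejection sampling is fine for this toy; MultiNest does this step efficiently.
    while True:
        candidate = sample_prior(1)[0]
        if log_likelihood(candidate) > live_logL[worst]:
            live[worst] = candidate
            live_logL[worst] = log_likelihood(candidate)
            break

print("estimated log-evidence:", log_Z)   # remaining live-point contribution ignored
```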
A Comprehensive guide to Bayesian Convolutional Neural Network with Variational Inference
In this paper, a Bayesian Convolutional Neural Network (BayesCNN) using Variational Inference is proposed, which introduces a probability distribution over the weights.
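A minimal sketch of the core idea, assuming a factorized Gaussian over the weights of a single dense layer (the paper applies this to convolutional layers); sizes and initial values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

# Variational parameters of a factorized Gaussian over one layer's weights.
in_dim, out_dim = 4, 3
w_mu = rng.normal(scale=0.1, size=(in_dim, out_dim))   # means
w_rho = np.full((in_dim, out_dim), -3.0)               # softplus(rho) = std. dev.

def sample_weights():
    """Reparameterized draw W = mu + softplus(rho) * eps, with eps ~ N(0, I)."""
    sigma = np.log1p(np.exp(w_rho))                     # softplus keeps sigma > 0
    eps = rng.standard_normal(w_mu.shape)
    return w_mu + sigma * eps

def forward(x):
    """One stochastic forward pass; repeated passes give predictive uncertainty."""
    return x @ sample_weights()

x = rng.normal(size=(2, in_dim))                        # dummy mini-batch
outputs = np.stack([forward(x) for _ in range(20)])     # 20 Monte Carlo passes
print("predictive mean:\n", outputs.mean(axis=0))
print("predictive std:\n", outputs.std(axis=0))
```

During training, such Monte Carlo passes feed an ELBO objective that trades off the data likelihood against the KL divergence from a weight prior.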
TabPFN: A Transformer That Solves Small Tabular Classification Problems in a Second
We present TabPFN, a trained Transformer that can do supervised classification for small tabular datasets in less than a second, needs no hyperparameter tuning and is competitive with state-of-the-art classification methods.
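A rough usage sketch, assuming the `tabpfn` package and its scikit-learn-style `TabPFNClassifier`; constructor arguments differ between releases, and the dataset is just an example of a small tabular problem:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from tabpfn import TabPFNClassifier

X, y = load_breast_cancer(return_X_y=True)              # 569 rows, 30 features
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

clf = TabPFNClassifier()          # no hyperparameter tuning, per the paper's claim
clf.fit(X_train, y_train)         # "fitting" just stores the data as context
print(clf.score(X_test, y_test))  # a single forward pass performs the classification
```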
Stochastic Backpropagation and Approximate Inference in Deep Generative Models
We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed generative models, endowed with a new algorithm for scalable inference and learning.
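The "stochastic backpropagation" at the heart of the paper is the reparameterization of Gaussian latent variables; the sketch below shows that step and the analytic KL term of the ELBO, with made-up recognition-network outputs:

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, log_var):
    """Stochastic backpropagation trick: z = mu + sigma * eps with eps ~ N(0, I),
    so gradients can flow through mu and log_var despite the sampling step."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def gaussian_kl(mu, log_var):
    """Analytic KL( N(mu, sigma^2) || N(0, 1) ), summed over latent dimensions."""
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=-1)

# Made-up recognition-network outputs for a batch of 2 data points, 3 latents.
mu = np.array([[0.1, -0.2, 0.0], [0.5, 0.3, -0.1]])
log_var = np.full((2, 3), -1.0)

z = reparameterize(mu, log_var)          # latent samples fed to the generator
print("z:", z)
print("KL term of the ELBO:", gaussian_kl(mu, log_var))
```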
Uncertainty Estimations by Softplus normalization in Bayesian Convolutional Neural Networks with Variational Inference
On multiple datasets in supervised learning settings (MNIST, CIFAR-10, CIFAR-100), this variational inference method achieves performance equivalent to frequentist inference with identical architectures, while the two desiderata, a measure of uncertainty and regularization, are incorporated naturally.
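A sketch of the softplus-normalization idea, with made-up logits standing in for repeated stochastic forward passes of a Bayesian CNN; the per-class spread across passes is read off here as a simple uncertainty signal:

```python
import numpy as np

rng = np.random.default_rng(0)

def softplus(x):
    return np.log1p(np.exp(x))

# Made-up logits from T stochastic forward passes (weights resampled each pass),
# for one input and 3 classes.
T, n_classes = 50, 3
logits = rng.normal(loc=[2.0, 0.5, -1.0], scale=0.7, size=(T, n_classes))

# Softplus normalization: map logits to positive values, then normalize so each
# pass yields a valid probability vector (used here in place of a softmax).
probs = softplus(logits)
probs /= probs.sum(axis=1, keepdims=True)

pred_mean = probs.mean(axis=0)    # predictive class probabilities
pred_std = probs.std(axis=0)      # spread across passes as an uncertainty estimate

print("predictive probabilities:", pred_mean)
print("per-class uncertainty:   ", pred_std)
```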
Neural Clustering Processes
Probabilistic clustering models (or equivalently, mixture models) are basic building blocks in countless statistical models and involve latent random variables over discrete spaces.
Variational Bayesian Monte Carlo
We introduce here a novel sample-efficient inference framework, Variational Bayesian Monte Carlo (VBMC).
Cyclical Stochastic Gradient MCMC for Bayesian Deep Learning
The posteriors over neural network weights are high dimensional and multimodal.
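A toy sketch of the cyclical step-size idea, using a two-mode 1-D "posterior" in place of a network's weight posterior and full gradients in place of minibatch gradients; the schedule and constants are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy bimodal "posterior": an equal mixture of two Gaussians at -2 and +2,
# standing in for a high-dimensional, multimodal weight posterior.
centers, width = np.array([-2.0, 2.0]), 1.0

def grad_log_post(theta):
    resp = np.exp(-0.5 * ((theta - centers) / width) ** 2)
    resp /= resp.sum()                                   # mixture responsibilities
    return np.sum(resp * -(theta - centers) / width**2)

# Cyclical step-size schedule: each cycle starts with large steps (exploration,
# jumping between modes) and anneals to small steps (sampling within a mode).
def cyclical_stepsize(k, steps_per_cycle, alpha0=0.5):
    r = (k % steps_per_cycle) / steps_per_cycle
    return 0.5 * alpha0 * (np.cos(np.pi * r) + 1.0)

total_steps, n_cycles = 4000, 8
steps_per_cycle = total_steps // n_cycles
theta, samples = 0.0, []
for k in range(total_steps):
    alpha = cyclical_stepsize(k, steps_per_cycle)
    # Langevin update; a real SG-MCMC run would use noisy minibatch gradients.
    theta += alpha * grad_log_post(theta) + np.sqrt(2.0 * alpha) * rng.standard_normal()
    if (k % steps_per_cycle) > 0.8 * steps_per_cycle:    # keep only low-step-size samples
        samples.append(theta)

samples = np.array(samples)
print("fraction of samples in each mode:", np.mean(samples < 0.0), np.mean(samples > 0.0))
```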