
Bayesian Inference is a methodology that employs Bayes' rule to estimate model parameters (and their full posterior distribution).


Recently, casting probabilistic regression as a multi-task learning problem in terms of conditional latent variable (CLV) models such as the Neural Process (NP) has shown promising results.

Compared with conventional BNP models for arbitrary RPs, the proposed model is simpler and has a high affinity with Bayesian inference.

Standard Bayesian inference algorithms are prohibitively expensive in the regime of modern large-scale data.

Incorporating the natural document-sentence-word structure into hierarchical Bayesian modeling, we propose convolutional Poisson gamma dynamical systems (PGDS) that introduce not only word-level probabilistic convolutions, but also sentence-level stochastic temporal transitions.

A statistical analysis of the observed perturbations in the density of stellar streams can in principle set stringent constraints on the mass function of dark matter subhaloes, which in turn can be used to constrain the mass of the dark matter particle.

We present algorithms (a) for nested neural likelihood-to-evidence ratio estimation, and (b) for simulation reuse via an inhomogeneous Poisson point process cache of parameters and corresponding simulations.

The Human Similarity Judgments extension to ImageNet (ImageNet-HSJ) is composed of human similarity judgments that supplement the ILSVRC validation set.

Bayesian inference without access to the likelihood, called likelihood-free inference, is highlighted in simulation-based settings as a way to yield more realistic simulation results.

Mathematical models in epidemiology strive to describe the dynamics and important characteristics of infectious diseases.

This paper introduces Distributed Stein Variational Gradient Descent (DSVGD), a non-parametric generalized Bayesian inference framework for federated learning.
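DSVGD builds on standard (non-distributed) Stein Variational Gradient Descent, which transports a set of particles toward a target posterior p by combining a kernel-weighted score term with a repulsive term that keeps particles spread out. The sketch below shows only this core single-node SVGD update with an RBF kernel on a 1-D standard-normal target; the federated/distributed machinery of DSVGD is not reproduced here, and the function names and hyperparameters are illustrative.

```python
import numpy as np

def svgd_step(particles, grad_log_p, h=1.0, eps=0.05):
    """One SVGD update for 1-D particles with an RBF kernel of bandwidth h."""
    diff = particles[:, None] - particles[None, :]   # diff[i, j] = x_i - x_j
    k = np.exp(-diff**2 / (2 * h))                   # kernel matrix k(x_i, x_j)
    grad_k = -diff / h * k                           # d k(x_i, x_j) / d x_i
    scores = grad_log_p(particles)                   # score at each particle
    # phi[j] = mean_i [ k(x_i, x_j) * score(x_i) + d/dx_i k(x_i, x_j) ]
    phi = (k.T @ scores + grad_k.sum(axis=0)) / len(particles)
    return particles + eps * phi

# Target: standard normal, so grad log p(x) = -x.
rng = np.random.default_rng(0)
x = rng.uniform(-5.0, 5.0, size=100)
for _ in range(500):
    x = svgd_step(x, lambda p: -p)
# After the updates the particle cloud approximates N(0, 1):
# mean near 0, standard deviation near 1.
```

The attractive term (kernel-weighted scores) pulls particles toward high-density regions, while the kernel-gradient term pushes them apart, so the final particles approximate the posterior rather than collapsing onto its mode.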