no code implementations • 23 Dec 2020 • Jon Cockayne, Matthew M. Graham, Chris J. Oates, T. J. Sullivan
A learning procedure takes as input a dataset and performs inference for the parameters $\theta$ of a model that is assumed to have given rise to the dataset.
Bayesian Inference Statistics Theory
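As a concrete illustration of a "learning procedure" in the sense above, here is a minimal sketch: exact conjugate Bayesian inference for the mean $\theta$ of a Gaussian model assumed to have generated the data. The function name, prior, and noise scale are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def learning_procedure(dataset, prior_mean=0.0, prior_var=1.0, noise_var=1.0):
    """Hypothetical learning procedure: given data assumed drawn as
    y_i ~ N(theta, noise_var) with prior theta ~ N(prior_mean, prior_var),
    return the exact Gaussian posterior over the parameter theta."""
    n = len(dataset)
    post_var = 1.0 / (1.0 / prior_var + n / noise_var)
    post_mean = post_var * (prior_mean / prior_var + dataset.sum() / noise_var)
    return post_mean, post_var

# Data generated from the assumed model with true parameter theta = 1.5
rng = np.random.default_rng(0)
data = rng.normal(loc=1.5, scale=1.0, size=50)
print(learning_procedure(data))
```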
1 code implementation • 22 Oct 2020 • Matthew A. Fisher, Tui Nolan, Matthew M. Graham, Dennis Prangle, Chris J. Oates
Measure transport underpins several recent algorithms for posterior approximation in the Bayesian context, wherein a transport map is sought to minimise the Kullback–Leibler divergence (KLD) from the posterior to the approximation.
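A minimal sketch of the KLD-driven transport idea this abstract refers to, not necessarily the paper's own construction: an affine map $T(z) = \mu + \sigma z$ pushes a standard normal forward onto a toy unnormalised target, and the reverse KLD is minimised by stochastic gradient descent with the reparameterisation trick. The target, batch size, and step size are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Unnormalised toy target, standing in for a posterior: here N(2, 0.5^2)
def grad_log_p(x):
    return -(x - 2.0) / 0.25

# Affine transport map T(z) = mu + exp(log_s) * z applied to z ~ N(0, 1)
mu, log_s = 0.0, 0.0
lr = 0.05
for step in range(2000):
    z = rng.standard_normal(64)
    s = np.exp(log_s)
    x = mu + s * z
    g = grad_log_p(x)
    # Monte Carlo gradients of the reverse KLD via the reparameterisation trick
    grad_mu = np.mean(-g)
    grad_log_s = -1.0 + np.mean(-g * s * z)
    mu -= lr * grad_mu
    log_s -= lr * grad_log_s

print(mu, np.exp(log_s))  # should approach the target's (2.0, 0.5)
```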
1 code implementation • 9 Mar 2020 • Khai Xiang Au, Matthew M. Graham, Alexandre H. Thiery
Standard Markov chain Monte Carlo methods struggle to explore distributions that are concentrated in the neighbourhood of low-dimensional structures.
Computation
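To see the failure mode the abstract describes, consider a toy density concentrated in a narrow band around the unit circle, a simple low-dimensional structure. The random-walk Metropolis sampler below is a hypothetical stand-in for "standard MCMC", not the paper's algorithm: large proposals fall off the band and are rejected, while small proposals crawl along it.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma = 0.01  # noise scale: density concentrates on the circle as sigma -> 0

def log_p(x):
    # Unnormalised density concentrated near the unit circle
    return -(np.linalg.norm(x) - 1.0) ** 2 / (2 * sigma ** 2)

def rwm_acceptance(step_size, n_steps=5000):
    """Run random-walk Metropolis and report the acceptance rate."""
    x = np.array([1.0, 0.0])
    accepts = 0
    for _ in range(n_steps):
        prop = x + step_size * rng.standard_normal(2)
        if np.log(rng.uniform()) < log_p(prop) - log_p(x):
            x, accepts = prop, accepts + 1
    return accepts / n_steps

# Large steps: near-zero acceptance. Tiny steps: accepted but barely moving.
print(rwm_acceptance(step_size=0.5), rwm_acceptance(step_size=0.01))
```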
1 code implementation • 6 Dec 2019 • Matthew M. Graham, Alexandre H. Thiery, Alexandros Beskos
Bayesian inference for nonlinear diffusions, observed at discrete times, is a challenging task that has prompted the development of a number of algorithms, mainly within the computational statistics community.
Computation Methodology
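The sketch below sets up the kind of problem the abstract refers to: a hypothetical nonlinear diffusion simulated on a fine time grid with the Euler–Maruyama scheme and observed with noise at sparse discrete times. The SDE, observation model, and parameter values are illustrative assumptions; recovering the drift parameters from the coarse observations alone is what makes the inference task hard.

```python
import numpy as np

rng = np.random.default_rng(3)

def euler_maruyama(x0, drift, diffusion, dt, n_steps):
    """Simulate a scalar diffusion on a fine grid with Euler-Maruyama."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        dw = rng.normal(scale=np.sqrt(dt))
        x[k + 1] = x[k] + drift(x[k]) * dt + diffusion(x[k]) * dw
    return x

# Hypothetical nonlinear diffusion: dX_t = theta (mu - X_t) dt + sqrt(1 + X_t^2) dW_t
theta, mu = 2.0, 1.0
path = euler_maruyama(0.0, lambda x: theta * (mu - x),
                      lambda x: np.sqrt(1.0 + x ** 2), dt=0.001, n_steps=10000)

# Noisy observations at discrete times (every 1000 fine steps)
coarse = path[::1000]
obs = coarse + rng.normal(scale=0.1, size=coarse.shape)
```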
2 code implementations • 11 Apr 2017 • Matthew M. Graham, Amos J. Storkey
Hamiltonian Monte Carlo (HMC) is a powerful Markov chain Monte Carlo (MCMC) method for performing approximate inference in complex probabilistic models of continuous variables.
Computation
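For reference, here is a minimal self-contained HMC transition in the textbook form the abstract alludes to: an identity mass matrix, a leapfrog integrator, and a Metropolis accept/reject step, applied to a standard Gaussian target. The step size and trajectory length are arbitrary tuning choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def hmc_step(x, log_p, log_p_grad, step_size=0.1, n_leapfrog=20):
    """One HMC transition with identity mass matrix and leapfrog integration."""
    p = rng.standard_normal(x.shape)  # resample momentum
    x_new, p_new = x.copy(), p.copy()
    # Leapfrog: half momentum step, alternating full steps, half momentum step
    p_new += 0.5 * step_size * log_p_grad(x_new)
    for _ in range(n_leapfrog - 1):
        x_new += step_size * p_new
        p_new += step_size * log_p_grad(x_new)
    x_new += step_size * p_new
    p_new += 0.5 * step_size * log_p_grad(x_new)
    # Metropolis correction using the change in the Hamiltonian
    h_old = -log_p(x) + 0.5 * p @ p
    h_new = -log_p(x_new) + 0.5 * p_new @ p_new
    return x_new if np.log(rng.uniform()) < h_old - h_new else x

# Example: sampling a standard 2D Gaussian
log_p = lambda x: -0.5 * x @ x
log_p_grad = lambda x: -x
x = np.zeros(2)
samples = [x := hmc_step(x, log_p, log_p_grad) for _ in range(1000)]
```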
1 code implementation • 25 May 2016 • Matthew M. Graham, Amos J. Storkey
We use the intuition that inference corresponds to integrating a density over the manifold formed by the set of inputs consistent with the observed outputs.
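A toy illustration of the objects involved, with all names and the generator itself made up for the example: a differentiable generator maps inputs (a parameter and a noise variate) to an output, and the manifold in question is the zero set of the constraint generator(u) - y_obs. The sketch locates one point on that manifold by root-finding, e.g. to initialise a constrained sampler; it is not the paper's implementation.

```python
import numpy as np
from scipy import optimize

# Hypothetical differentiable generator: u = (parameter, noise) -> output
def generator(u):
    theta, noise = u
    return theta ** 2 + 0.1 * noise  # toy nonlinear simulator

y_obs = 1.3  # observed output

# The manifold is {u : generator(u) = y_obs}, the zero set of this constraint
def constraint(u):
    return np.atleast_1d(generator(u) - y_obs)

# Find one point on the manifold from a random initialisation
rng = np.random.default_rng(5)
u0 = rng.standard_normal(2)
sol = optimize.least_squares(constraint, u0)  # 'trf' handles the underdetermined system
print(sol.x, generator(sol.x))  # generator(sol.x) should be close to y_obs
```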