1 code implementation • 8 Mar 2023 • Zhuo Sun, Chris J. Oates, François-Xavier Briol
Control variates are a powerful tool for reducing the variance of Monte Carlo estimators, but constructing effective control variates is challenging when the number of samples is small.
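As a minimal sketch of the control-variate idea referred to above (not the paper's method): to estimate E[f(X)], subtract a scaled version of a function g(X) whose mean is known, choosing the scale to minimise variance. Here f, g, and the target are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)

f = np.exp(x)  # integrand; true mean is exp(1/2) since X ~ N(0, 1)
g = x          # control variate with known mean E[g(X)] = 0

# near-optimal coefficient beta = Cov(f, g) / Var(g), estimated from samples
beta = np.cov(f, g)[0, 1] / np.var(g)
cv_estimate = np.mean(f - beta * (g - 0.0))

# the control-variate estimator has lower variance than the plain sample mean
plain_var = np.var(f) / len(x)
cv_var = np.var(f - beta * g) / len(x)
```

With a well-correlated control variate the variance drops substantially; the small-sample regime the paper targets is precisely where the plug-in estimate of `beta` becomes unreliable.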
no code implementations • 17 Mar 2022 • Toni Karvonen, Chris J. Oates
Gaussian process regression underpins countless academic and industrial applications of machine learning and statistics, with maximum likelihood estimation routinely used to select appropriate parameters for the covariance kernel.
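A minimal sketch of the maximum likelihood selection of kernel parameters mentioned above, assuming a squared-exponential kernel and using a grid search in place of gradient-based optimisation (all numerical choices here are illustrative):

```python
import numpy as np

def rbf_kernel(x, y, lengthscale):
    # squared-exponential covariance k(x, y) = exp(-(x - y)^2 / (2 l^2))
    return np.exp(-0.5 * (x[:, None] - y[None, :]) ** 2 / lengthscale ** 2)

def log_marginal_likelihood(x, y_obs, lengthscale, jitter=1e-4):
    # Gaussian process log marginal likelihood via a Cholesky factorisation
    K = rbf_kernel(x, x, lengthscale) + jitter * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_obs))
    return (-0.5 * y_obs @ alpha
            - np.sum(np.log(np.diag(L)))
            - 0.5 * len(x) * np.log(2 * np.pi))

x = np.linspace(0.0, 1.0, 20)
y_obs = np.sin(2 * np.pi * x)  # smooth toy target

grid = np.linspace(0.01, 1.0, 50)
best_l = max(grid, key=lambda l: log_marginal_likelihood(x, y_obs, l))
```

The paper's question is what this routinely used procedure actually delivers; the sketch only shows the objective being maximised.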
no code implementations • 23 Dec 2020 • Jon Cockayne, Matthew M. Graham, Chris J. Oates, T. J. Sullivan
A learning procedure takes as input a dataset and performs inference for the parameters $\theta$ of a model that is assumed to have given rise to the dataset.
Bayesian Inference • Statistics Theory
no code implementations • 23 Dec 2020 • Jon Cockayne, Ilse C. F. Ipsen, Chris J. Oates, Tim W. Reid
This paper presents a probabilistic perspective on iterative methods for approximating the solution $\mathbf{x}_* \in \mathbb{R}^d$ of a nonsingular linear system $\mathbf{A} \mathbf{x}_* = \mathbf{b}$.
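For context, a classical iterative method of the kind the probabilistic perspective above is built on is the conjugate gradient method; this sketch is the standard deterministic algorithm, not the paper's probabilistic version, and the test problem is an arbitrary symmetric positive-definite system:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    # classical CG for a symmetric positive-definite system A x = b;
    # probabilistic numerical methods place a distribution over the iterates
    n = len(b)
    max_iter = n if max_iter is None else max_iter
    x = np.zeros(n)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

rng = np.random.default_rng(2)
M = rng.standard_normal((5, 5))
A = M @ M.T + 5 * np.eye(5)  # symmetric positive definite by construction
b = rng.standard_normal(5)
x = conjugate_gradient(A, b)
```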
1 code implementation • 22 Oct 2020 • Matthew A. Fisher, Tui Nolan, Matthew M. Graham, Dennis Prangle, Chris J. Oates
Measure transport underpins several recent algorithms for posterior approximation in the Bayesian context, wherein a transport map is sought to minimise the Kullback-Leibler divergence (KLD) from the posterior to the approximation.
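A toy illustration of the transport-map idea, under strong simplifying assumptions: an affine map T(z) = a z + b pushes a N(0, 1) reference forward to N(b, a^2), and the map is chosen to minimise a KLD to a Gaussian "posterior" (closed-form here; the KL direction and grid search are illustrative choices, not the paper's algorithm):

```python
import numpy as np

def kl_gaussians(mq, sq, mp, sp):
    # closed-form KL(q || p) for univariate Gaussians q = N(mq, sq^2), p = N(mp, sp^2)
    return np.log(sp / sq) + (sq**2 + (mq - mp)**2) / (2 * sp**2) - 0.5

# toy target "posterior": N(2, 3^2); the pushforward of N(0, 1) under
# T(z) = a z + b is N(b, a^2), so the exact minimiser is a = 3, b = 2
mp, sp = 2.0, 3.0
grid_a = np.linspace(0.5, 5.0, 46)
grid_b = np.linspace(-1.0, 5.0, 61)
best = min(((a, b) for a in grid_a for b in grid_b),
           key=lambda ab: kl_gaussians(ab[1], ab[0], mp, sp))
```

In this Gaussian-to-Gaussian case the optimisation recovers the exact transport; the general problem, with flexible map parameterisations and intractable posteriors, is what the algorithms above address.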
1 code implementation • 16 Oct 2020 • Takuo Matsubara, Chris J. Oates, François-Xavier Briol
Our approach constructs a prior distribution for the parameters of the network, called a ridgelet prior, that approximates the posited Gaussian process in the output space of the network.