Laplace Matching for fast Approximate Inference in Latent Gaussian Models

7 May 2021 · Marius Hobbhahn, Philipp Hennig

Bayesian inference on non-Gaussian data is often non-analytic and requires computationally expensive approximations such as sampling or variational inference. We propose an approximate inference framework primarily designed to be computationally cheap while still achieving high approximation quality. The concept, which we call Laplace Matching, involves closed-form, approximate, bi-directional transformations between the parameter spaces of exponential families. These are constructed from Laplace approximations under custom-designed basis transformations. The mappings can then be leveraged to effectively turn a latent Gaussian distribution into an approximate conjugate prior for a rich class of observable variables. This allows us to train latent Gaussian models such as Gaussian Processes on non-Gaussian data at nearly no additional cost. The method can be thought of as a pre-processing step that can be implemented in <5 lines of code and runs in less than a second. Furthermore, Laplace Matching yields a simple way to group similar data points together, e.g., to produce inducing points for GPs. We empirically evaluate the method with experiments on four different exponential families, namely the Beta, Gamma, Dirichlet, and inverse Wishart distributions, showing approximation quality comparable to state-of-the-art approximate inference techniques at a drastic reduction in computational cost.
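As a concrete illustration of the kind of closed-form mapping the abstract describes, consider the Gamma distribution in the log basis: a direct change-of-variables Laplace approximation of Gamma(α, β) under y = log x yields N(log(α/β), 1/α), and this map can be inverted in closed form. The sketch below is our own illustration of this bi-directional transformation, not code from the paper (function names are hypothetical); the paper derives analogous maps for the Beta, Dirichlet, and inverse Wishart families.

```python
import numpy as np

def gamma_to_gaussian(alpha, beta):
    # Laplace approximation of Gamma(alpha, beta) in the log basis y = log x:
    # the transformed log-density alpha*y - beta*exp(y) has its mode at
    # log(alpha/beta) with curvature -alpha there, giving N(mu, sigma2).
    mu = np.log(alpha / beta)
    sigma2 = 1.0 / alpha
    return mu, sigma2

def gaussian_to_gamma(mu, sigma2):
    # Closed-form inverse of the map above: recover Gamma parameters
    # from a Gaussian over log x.
    alpha = 1.0 / sigma2
    beta = np.exp(-mu) / sigma2
    return alpha, beta

# Round trip: Gamma(3, 2) -> N(log(1.5), 1/3) -> Gamma(3, 2).
mu, sigma2 = gamma_to_gaussian(3.0, 2.0)
print(gaussian_to_gamma(mu, sigma2))
```

Under this reading, gaussian_to_gamma turns a latent Gaussian marginal N(μ, σ²), e.g. from a GP, into an approximate Gamma posterior over a positive observable, while gamma_to_gaussian converts Gamma-distributed observations into Gaussian pseudo-observations the GP can be trained on directly, which is the cheap pre-processing step described above.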
