Stein Variational Gradient Descent for Approximate Bayesian Computation

Approximate Bayesian Computation (ABC) provides a generic framework for Bayesian inference in likelihood-free models, but sampling-based posterior approximation is often time-consuming and its convergence is difficult to assess. Stochastic variational inference casts posterior inference as an optimization problem and makes ABC scalable to large datasets. However, the complex simulation models involved in ABC often lead to complex posteriors that are hard to approximate with simple parametric variational distributions. We draw upon recent advances in implicit variational distributions and introduce the Stein variational gradient descent (SVGD) approach, which approximates the posterior with nonparametric particles. We also find that the kernel in the SVGD algorithm helps reduce the large variance of the gradient estimators of the ABC likelihood. Moreover, we propose the energy distance as the statistic for evaluating the ABC likelihood, which reduces the difficulty of selecting proper summary statistics. Simulation studies demonstrate the correctness and efficiency of our algorithm.
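
For context, the sketch below illustrates the generic SVGD particle update with an RBF kernel and the median-heuristic bandwidth, run on a toy Gaussian target. It is not the paper's implementation: the authors would plug in gradient estimators of the ABC likelihood based on the energy distance, whereas here an exact log-density gradient is assumed, and the function names and step size are illustrative choices.

import numpy as np

def svgd_step(X, grad_logp, stepsize=0.05, h=None):
    # One generic SVGD update. X: particles (n, d); grad_logp: grad of log
    # target density at each particle (n, d). In the paper's setting this
    # gradient would come from an ABC likelihood estimator, not a closed form.
    n = X.shape[0]
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    if h is None:
        # Median heuristic for the RBF bandwidth
        h = max(np.median(sq_dists) / np.log(n + 1.0), 1e-8)
    K = np.exp(-sq_dists / h)  # kernel matrix k(x_i, x_j)
    # Repulsive term: sum_j grad_{x_j} k(x_j, x_i), closed form for the RBF kernel
    grad_K = (X * K.sum(axis=1, keepdims=True) - K @ X) * (2.0 / h)
    phi = (K @ grad_logp + grad_K) / n  # Stein variational direction
    return X + stepsize * phi

# Toy usage: push particles toward a 1-D standard normal target
rng = np.random.default_rng(0)
X = rng.normal(loc=5.0, scale=1.0, size=(100, 1))  # initial particles
for _ in range(500):
    grad_logp = -X  # grad log N(0, 1)
    X = svgd_step(X, grad_logp)
print(X.mean(), X.std())  # should be close to 0 and 1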
