Mean-Square Analysis with An Application to Optimal Dimension Dependence of Langevin Monte Carlo

NeurIPS 2021 · Ruilin Li, Hongyuan Zha, Molei Tao

Sampling algorithms based on discretizations of Stochastic Differential Equations (SDEs) constitute a rich and popular subset of MCMC methods. This work provides a general framework for the non-asymptotic analysis of sampling error in the 2-Wasserstein distance, which also yields a bound on mixing time. The method applies to any consistent discretization of contractive SDEs. Applied to the Langevin Monte Carlo algorithm, it establishes an $\widetilde{\mathcal{O}}\left(\frac{\sqrt{d}}{\epsilon}\right)$ mixing time, without warm start, under the standard log-smooth and log-strongly-convex conditions, plus a growth condition at infinity on the potential of the target measure. This bound improves on the previously best-known $\widetilde{\mathcal{O}}\left(\frac{d}{\epsilon}\right)$ result and is optimal in both dimension $d$ and accuracy tolerance $\epsilon$ for log-smooth and log-strongly-convex target measures. Our theoretical analysis is further validated by numerical experiments.
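
For context, Langevin Monte Carlo is the Euler-Maruyama discretization of the overdamped Langevin diffusion $dX_t = -\nabla f(X_t)\,dt + \sqrt{2}\,dW_t$, whose stationary distribution is proportional to $e^{-f}$. Below is a minimal illustrative sketch of the iteration the paper analyzes; the function name, parameters, and the Gaussian example target are illustrative choices, not the authors' code.

```python
import numpy as np

def lmc_sample(grad_f, x0, step_size, n_steps, rng=None):
    """Langevin Monte Carlo (unadjusted Langevin algorithm).

    Iterates x_{k+1} = x_k - h * grad_f(x_k) + sqrt(2h) * xi_k,
    with xi_k ~ N(0, I), the Euler-Maruyama discretization of the
    overdamped Langevin SDE targeting exp(-f) (up to normalization).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x - step_size * grad_f(x) + np.sqrt(2.0 * step_size) * noise
    return x

# Example: standard Gaussian target, f(x) = ||x||^2 / 2, so grad_f(x) = x.
# This f is 1-log-smooth and 1-log-strongly-convex, matching the
# assumptions under which the paper's mixing-time bound applies.
d = 10
sample = lmc_sample(grad_f=lambda x: x, x0=np.zeros(d),
                    step_size=0.05, n_steps=1000)
```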
