Sampling algorithms based on discretizations of Stochastic Differential Equations (SDEs) form a rich and popular subset of MCMC methods. This work provides a general framework for the non-asymptotic analysis of sampling error in 2-Wasserstein distance, which also leads to a bound on the mixing time. The method applies to any consistent discretization of contractive SDEs. When applied to the Langevin Monte Carlo algorithm, it establishes an $\tilde{\mathcal{O}}\left( \frac{\sqrt{d}}{\epsilon} \right)$ mixing time, without a warm start, under the common log-smooth and log-strongly-convex conditions, plus a growth condition on the third-order derivative of the potential of the target measure at infinity. This bound improves on the best previously known $\tilde{\mathcal{O}}\left( \frac{d}{\epsilon} \right)$ result and is optimal (in terms of order) in both the dimension $d$ and the accuracy tolerance $\epsilon$ for target measures satisfying the aforementioned assumptions. Our theoretical analysis is further validated by numerical experiments.
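For context, the Langevin Monte Carlo iteration referred to above is, in its standard textbook form (stated here purely for illustration, with step size $\eta$ and potential $f$ of the target measure $\pi \propto e^{-f}$), the Euler–Maruyama discretization of the overdamped Langevin SDE $\mathrm{d}X_t = -\nabla f(X_t)\,\mathrm{d}t + \sqrt{2}\,\mathrm{d}B_t$:
$$
x_{k+1} = x_k - \eta \nabla f(x_k) + \sqrt{2\eta}\,\xi_k, \qquad \xi_k \sim \mathcal{N}(0, I_d),
$$
so that the mixing-time bounds above count the number of such iterations needed to bring the law of $x_k$ within 2-Wasserstein distance $\epsilon$ of $\pi$.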