We consider stochastic approximations of sampling algorithms, such as Stochastic Gradient Langevin Dynamics (SGLD) and the Random Batch Method (RBM) for Interacting Particle Dynamics (IPD). We observe that the noise introduced by the stochastic approximation is nearly Gaussian due to the Central Limit Theorem (CLT), while the driving Brownian motion is exactly Gaussian. We harness this structure to absorb the stochastic approximation error inside the diffusion process and obtain improved convergence guarantees for these algorithms. For SGLD, we prove the first stable convergence rate in KL divergence without requiring a uniform warm start, assuming the target density satisfies a Log-Sobolev Inequality. Our result implies superior first-order oracle complexity compared to prior works, under significantly milder assumptions. We also prove the first guarantees for SGLD under even weaker conditions, such as H\"{o}lder smoothness and a Poincar\'{e} Inequality, thus bridging the gap between the state-of-the-art guarantees for Langevin Monte Carlo (LMC) and SGLD. Our analysis motivates a new algorithm, called covariance correction, which corrects for the additional noise introduced by the stochastic approximation by rescaling the strength of the diffusion. Finally, we apply our techniques to analyze RBM, and we significantly improve upon the guarantees in prior works (such as removing the exponential dependence on the time horizon) under minimal assumptions.
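To make the correction concrete, here is a minimal sketch in illustrative notation; the symbols $\eta$ (step size), $\hat{g}$ (stochastic gradient estimate), and $\hat{\Sigma}$ (an estimate of its covariance) are our own, and this is one plausible instantiation of the rescaling described above rather than necessarily the exact scheme of the paper. A plain SGLD step is
\[
x_{k+1} = x_k - \eta\,\hat{g}(x_k) + \zeta_k, \qquad \zeta_k \sim \mathcal{N}(0,\, 2\eta I),
\]
where $\hat{g}(x_k)$ is an unbiased estimate of $\nabla U(x_k)$ with covariance $\Sigma(x_k)$. By the CLT, the gradient noise $\eta\bigl(\hat{g}(x_k) - \nabla U(x_k)\bigr)$ is nearly Gaussian with covariance $\eta^2 \Sigma(x_k)$, so the total one-step noise covariance is approximately $2\eta I + \eta^2 \Sigma(x_k)$ rather than the $2\eta I$ of exact Langevin dynamics. Covariance correction rescales the injected diffusion, drawing
\[
\zeta_k \sim \mathcal{N}\!\bigl(0,\; 2\eta I - \eta^2 \hat{\Sigma}(x_k)\bigr),
\]
which brings the total noise covariance back to approximately $2\eta I$, provided $\eta$ is small enough that $2\eta I - \eta^2 \hat{\Sigma}(x_k) \succeq 0$.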