This paper addresses the challenging computational problem of estimating intractable expectations over discrete domains. Existing approaches, including Monte Carlo and Russian Roulette estimators, are consistent but often require a large number of samples to achieve accurate results. We propose a novel estimator, \emph{BayesSum}, which extends Bayesian quadrature to discrete domains. It is more sample-efficient than these alternatives because it can exploit prior information about the integrand through a Gaussian process. We support this claim theoretically, deriving a convergence rate that is significantly faster than that of Monte Carlo in a broad range of settings. We also demonstrate empirically that our method requires fewer samples in several synthetic settings as well as for parameter estimation in Conway-Maxwell-Poisson and Potts models.
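To make the underlying construction concrete, the following is a minimal sketch of Bayesian quadrature on a finite discrete domain, not the paper's BayesSum estimator itself: the squared-exponential kernel, the truncated Poisson weights \texttt{p}, the integrand \texttt{f}, and the node set are all illustrative assumptions. A Gaussian-process posterior mean, fitted from a handful of evaluations, stands in for the integrand and is summed against the probability mass function.

\begin{verbatim}
# Minimal sketch (not the authors' implementation) of Bayesian quadrature
# on a finite discrete domain: estimate E_p[f] = sum_x p(x) f(x) from a few
# evaluations of f, using a GP posterior mean as a surrogate for f.
import numpy as np

def rbf_kernel(a, b, lengthscale=2.0):
    """Squared-exponential kernel on 1-D inputs (illustrative choice)."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

# Illustrative discrete domain and truncated Poisson(5) mass function.
domain = np.arange(0, 50)
lam = 5.0
log_fact = np.array([np.sum(np.log(np.arange(1, x + 1))) for x in domain])
p = np.exp(domain * np.log(lam) - lam - log_fact)
p /= p.sum()                                   # renormalise after truncation

f = lambda x: np.sqrt(x + 1.0)                 # integrand whose expectation we want

# A handful of evaluation nodes, far fewer than the domain size.
nodes = np.array([0.0, 2.0, 5.0, 10.0, 20.0, 40.0])
y = f(nodes)

# Zero-mean GP posterior mean over the whole domain given the node evaluations.
K = rbf_kernel(nodes, nodes) + 1e-8 * np.eye(len(nodes))
k_star = rbf_kernel(domain.astype(float), nodes)
post_mean = k_star @ np.linalg.solve(K, y)

bq_estimate = p @ post_mean                    # sum_x p(x) * posterior mean of f(x)
exact = p @ f(domain.astype(float))
print(f"BQ-style estimate: {bq_estimate:.4f}, exact truncated sum: {exact:.4f}")
\end{verbatim}

The sample efficiency claimed above comes from this structure: the Gaussian-process posterior mean propagates the few observed values of the integrand across the whole domain via the kernel, so far fewer evaluations are needed than for a Monte Carlo average.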