Importance Sampling (IS) is a widely used variance reduction technique for enhancing the efficiency of Monte Carlo methods, particularly in rare-event simulation and related applications. Despite its effectiveness, the performance of IS is highly sensitive to the choice of the proposal distribution and often requires stochastic calibration. While the design and analysis of IS have been extensively studied in estimation settings, applying IS within stochastic optimization introduces a fundamental challenge: the decision variable and the importance sampling distribution are mutually dependent, creating a circular optimization structure. This interdependence complicates both convergence analysis and variance control. We consider convex stochastic optimization problems with linear constraints and propose a single-loop stochastic approximation algorithm, based on a joint variant of Nesterov's dual averaging, that jointly updates the decision variable and the importance sampling distribution, without time-scale separation or nested optimization. The method is globally convergent and achieves minimal asymptotic variance among stochastic gradient schemes, matching the performance of an oracle sampler adapted to the optimal solution.
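The variance-reduction idea the abstract opens with can be made concrete with a minimal sketch (an illustration, not the paper's method): estimating the rare-event probability P(X > 4) for X ~ N(0, 1). Plain Monte Carlo almost never observes the event, while sampling from a shifted proposal N(4, 1) and reweighting by the likelihood ratio recovers an unbiased, low-variance estimate. The threshold, proposal, and sample size below are illustrative choices.

```python
import math
import random

random.seed(0)
N = 100_000
t = 4.0  # rare-event threshold; the true P(X > 4) under N(0, 1) is about 3.17e-5

# Plain Monte Carlo: only ~3 hits are expected in 100k draws, so the
# estimate is extremely noisy (often exactly zero).
plain = sum(random.gauss(0.0, 1.0) > t for _ in range(N)) / N

# Importance sampling: draw from the shifted proposal N(4, 1) and reweight
# each hit by the likelihood ratio phi(x; 0, 1) / phi(x; 4, 1) = exp(8 - 4x).
total = 0.0
for _ in range(N):
    x = random.gauss(t, 1.0)  # sample from the proposal, not the nominal law
    if x > t:
        total += math.exp(8.0 - 4.0 * x)
is_est = total / N

print(f"plain MC: {plain:.2e}, importance sampling: {is_est:.2e}")
```

Under the shifted proposal roughly half the draws land in the event, so nearly every sample contributes to the estimate; this is the variance reduction. Shifting the proposal too far (or not far enough) inflates the weights' variance instead, which is the calibration sensitivity the abstract refers to.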