Searching large and complex design spaces for a global optimum can be infeasible and unnecessary. A practical alternative is to iteratively refine an initial design within its local neighborhood using local optimization methods such as gradient descent. We propose local entropy search (LES), a Bayesian optimization paradigm that explicitly targets the solutions reachable by the descent sequences of iterative optimizers. The algorithm propagates the posterior belief over the objective through the optimizer, resulting in a probability distribution over descent sequences. It then selects the next evaluation by maximizing the mutual information between the new observation and that distribution, using a combination of analytic entropy calculations and Monte-Carlo sampling of descent sequences. Empirical results on high-complexity synthetic objectives and benchmark problems show that LES achieves strong sample efficiency compared to existing local and global Bayesian optimization methods.
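To make the central idea concrete, the sketch below illustrates one ingredient of an LES-style procedure: propagating posterior uncertainty over the objective through a gradient-descent optimizer to obtain a Monte Carlo distribution over descent sequences. This is not the paper's implementation; it is a minimal, self-contained Python illustration that assumes a random-Fourier-feature approximation to draw differentiable posterior samples from a Gaussian-process surrogate, and it uses a crude histogram entropy over terminal points only to show how posterior uncertainty induces uncertainty over where descent ends up. The full mutual-information acquisition described in the abstract is not reproduced here, and all names and hyperparameters are illustrative assumptions.

```python
# Illustrative sketch (not the authors' code): propagate a GP posterior through
# gradient descent to obtain a Monte Carlo distribution over descent sequences.
import numpy as np

rng = np.random.default_rng(0)

# --- toy 1-D objective and a few observations ---
def f_true(x):
    return np.sin(3.0 * x) + 0.1 * x**2

X_obs = np.array([[-2.0], [-0.5], [1.0], [2.5]])
y_obs = f_true(X_obs[:, 0])

# --- random Fourier features give differentiable approximate GP posterior samples ---
num_features, lengthscale, signal_var, noise_var = 200, 1.0, 1.0, 1e-4
W = rng.normal(0.0, 1.0 / lengthscale, size=(num_features, 1))  # RBF spectral frequencies
b = rng.uniform(0.0, 2.0 * np.pi, size=num_features)

def phi(X):
    # feature map with k(x, x') approximately phi(x) @ phi(x')
    return np.sqrt(2.0 * signal_var / num_features) * np.cos(X @ W.T + b)

Phi = phi(X_obs)                                            # (n, m)
A = Phi.T @ Phi + noise_var * np.eye(num_features)
mean_w = np.linalg.solve(A, Phi.T @ y_obs)                  # posterior mean of feature weights
cov_w = noise_var * np.linalg.inv(A)                        # posterior covariance of feature weights

def sample_posterior_function():
    """Draw one approximate posterior sample as a differentiable function and its gradient."""
    w = rng.multivariate_normal(mean_w, cov_w)
    f = lambda x: phi(np.atleast_2d(x)) @ w
    def grad(x):
        Z = np.atleast_2d(x) @ W.T + b
        dphi = -np.sqrt(2.0 * signal_var / num_features) * np.sin(Z)[..., None] * W[None, :, :]
        return np.einsum("m,nmd->nd", w, dphi)
    return f, grad

def descent_sequence(grad, x0, lr=0.1, steps=30):
    """Run plain gradient descent on one posterior sample, returning the visited iterates."""
    x, seq = np.atleast_2d(x0).astype(float), []
    for _ in range(steps):
        seq.append(x.copy())
        x = x - lr * grad(x)
    seq.append(x.copy())
    return np.concatenate(seq, axis=0)

# --- Monte Carlo distribution over descent sequences from a fixed starting design ---
x_start = np.array([0.3])
terminals = []
for _ in range(64):
    _, grad_s = sample_posterior_function()
    seq = descent_sequence(grad_s, x_start)
    terminals.append(seq[-1])
terminals = np.array(terminals)

# crude histogram-based entropy proxy over terminal points: posterior uncertainty in the
# objective translates into uncertainty over the solutions reachable by the optimizer
hist, _ = np.histogram(terminals[:, 0], bins=20)
p = hist[hist > 0] / hist.sum()
print("terminal-point spread:", terminals[:, 0].std(), " entropy proxy:", -(p * np.log(p)).sum())
```

In an LES-style acquisition, an estimate of this kind would feed into a mutual-information criterion that scores candidate evaluation points by how much they are expected to reduce uncertainty about the descent-sequence distribution; that outer loop is omitted here for brevity.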