High-dimensional black-box optimization has broad applications but remains a challenging problem to solve. Given a set of samples $\{\vx_i, y_i\}$, building a global model (as in Bayesian Optimization (BO)) suffers from the curse of dimensionality in the high-dimensional search space, while a greedy search may lead to sub-optimality. By recursively splitting the search space into regions with high/low function values, recent works such as LaNAS show good performance in Neural Architecture Search (NAS), empirically reducing the sample complexity. In this paper, we propose LA-MCTS, which extends LaNAS to other domains. Unlike previous approaches, LA-MCTS learns the partition of the search space from a few samples and their function values in an online fashion. While LaNAS uses a linear partition and samples uniformly within each region, LA-MCTS adopts a nonlinear decision boundary and learns a local model to pick good candidates. If the nonlinear partition function and the local model fit the ground-truth black-box function well, then good partitions and candidates can be reached with far fewer samples. LA-MCTS serves as a \emph{meta-algorithm} that uses existing black-box optimizers (e.g., BO, TuRBO) as its local models, achieving strong performance on general black-box optimization and reinforcement learning benchmarks, in particular on high-dimensional problems.
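To make the partition-then-optimize loop concrete, the following is a heavily simplified sketch in Python. It is our own illustration, not the paper's implementation: the median split and the centroid-based proposal step are hypothetical stand-ins for the learned nonlinear boundary and for the local optimizers (e.g., BO, TuRBO) that LA-MCTS would plug in.

```python
import random

def lamcts_sketch(f, dim, n_init=10, n_iters=40, seed=0):
    """Illustrative loop in the spirit of LA-MCTS (minimization):
    1) collect samples; 2) split them into good/bad halves by function value;
    3) summarize the good region (here: by its centroid, a crude stand-in for
       a learned nonlinear boundary); 4) let a 'local model' propose candidates
       inside that region (here: Gaussian perturbations of the centroid)."""
    rng = random.Random(seed)
    X = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n_init)]
    Y = [f(x) for x in X]
    for _ in range(n_iters):
        # Partition: the "good" half contains the samples with lower f-values.
        order = sorted(range(len(X)), key=lambda i: Y[i])
        good = [X[i] for i in order[: len(order) // 2]]
        centroid = [sum(x[d] for x in good) / len(good) for d in range(dim)]
        # Local proposal step: sample a few candidates near the good region
        # and keep the best one (a placeholder for BO/TuRBO in that region).
        cands = [[c + rng.gauss(0.0, 0.5) for c in centroid] for _ in range(8)]
        x_new = min(cands, key=f)
        X.append(x_new)
        Y.append(f(x_new))
    best = min(range(len(X)), key=lambda i: Y[i])
    return X[best], Y[best]
```

For example, running the sketch on a 5-dimensional sphere function (`f(x) = sum(v*v for v in x)`) drives the search toward the origin, since each split concentrates subsequent proposals around the lower-valued half of the samples.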