We study bilevel optimization problems where the lower-level problems are strongly convex and have coupled linear constraints. To overcome the potential non-smoothness of the hyper-objective and the computational challenges associated with the Hessian matrix, we utilize penalty and augmented Lagrangian methods to reformulate the original problem as a single-level one. In particular, we establish a strong theoretical connection between the reformulated function and the original hyper-objective by characterizing the closeness of their values and derivatives. Based on this reformulation, we propose a single-loop, first-order algorithm for linearly constrained bilevel optimization (SFLCB). We provide rigorous analyses of its non-asymptotic convergence rates, showing an improvement over prior double-loop algorithms -- from $O(\epsilon^{-3}\log(\epsilon^{-1}))$ to $O(\epsilon^{-3})$. The experiments corroborate our theoretical findings and demonstrate the practical efficiency of the proposed SFLCB algorithm. Simulation code is provided at https://github.com/ShenGroup/SFLCB.
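To fix notation, the problem class described above can be sketched as follows; the symbols $f$, $g$, $A$, $B$, $b$, $\lambda$, and $\rho$ are illustrative and not taken from the paper itself:

```latex
% Bilevel problem with a strongly convex lower level and coupled
% linear constraints (illustrative notation, not the paper's own):
%   min_x  f(x, y^*(x))
%   s.t.   y^*(x) = argmin_{y : Ax + By <= b}  g(x, y),
% where g(x, .) is strongly convex for each x.
%
% The lower-level constraint can be handled via an augmented
% Lagrangian with multiplier lambda and penalty parameter rho > 0:
%   L_rho(x, y, lambda) = g(x, y) + lambda^T (Ax + By - b)
%                         + (rho / 2) || max(0, Ax + By - b) ||^2,
% yielding a single-level reformulation whose values and gradients
% approximate those of the hyper-objective f(x, y^*(x)) as the
% penalty parameter grows, without forming Hessian matrices.
```

This is only a generic sketch of the penalty/augmented Lagrangian approach the abstract names; the paper's precise reformulation and its approximation guarantees are given in the main text.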