Some continuous optimization methods can be connected to ordinary differential equations (ODEs) by taking continuous limits, and their convergence rates can be explained by the ODEs. However, since such ODEs can achieve any convergence rate by time scaling, the correspondence is not as straightforward as is usually expected, and deriving new methods through ODEs is not entirely direct. In this letter, we focus on the stability restriction that arises in discretizing ODEs and show that acceleration by time scaling essentially implies deceleration in discretization; the two effects balance out, so that we can define an attainable, unique convergence rate, which we call the "essential convergence rate".
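The balancing phenomenon can be seen already in the simplest case. This is a minimal sketch, not the letter's construction: for the toy gradient flow x'(t) = -x(t) of f(x) = x²/2, the time scaling t → ct yields x'(t) = -c·x(t), which decays like exp(-ct), i.e. any rate via scaling. But forward Euler applied to it, x_{k+1} = x_k - hc·x_k, is stable only for hc < 2, so the per-step contraction |1 - hc| is unchanged by c. The helper `euler_rate` below is hypothetical, introduced only for this illustration.

```python
def euler_rate(c, h, x0=1.0, steps=50):
    """Per-step contraction factor of forward Euler applied to x'(t) = -c*x(t)."""
    x = x0
    for _ in range(steps):
        x = x - h * c * x  # forward Euler step; stable only when h*c < 2
    return abs(x / x0) ** (1 / steps)

# Scaling c up by 10 forces the stable step h down by 10:
# the product h*c, and hence the per-step rate |1 - h*c|, is unchanged.
r1 = euler_rate(c=1.0, h=0.5)    # h*c = 0.5
r2 = euler_rate(c=10.0, h=0.05)  # h*c = 0.5
print(r1, r2)  # both 0.5
```

In this toy setting the "essential" per-step rate is the invariant |1 - hc|, independent of the time scaling c, mirroring the abstract's claim that continuous-time acceleration and discretization deceleration cancel.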