We propose Local Momentum Tracking (LMT), a novel distributed stochastic gradient method for solving optimization problems over networks. To reduce communication overhead, LMT enables each agent to perform multiple local updates between consecutive communication rounds. Specifically, LMT integrates local updates with the momentum tracking strategy and the Loopless Chebyshev Acceleration (LCA) technique. We demonstrate that LMT achieves linear speedup with respect to both the number of local updates and the number of agents when minimizing smooth objective functions, with or without the Polyak-{\L}ojasiewicz (PL) condition. Notably, with sufficiently many local updates $Q\geq Q^*$, LMT attains the optimal communication complexity, while for a moderate number of local updates $Q\in[1,Q^*]$ it achieves the optimal iteration complexity. To our knowledge, LMT is the first distributed stochastic gradient method with local updates to enjoy these properties.
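The abstract's core communication pattern, each agent running $Q$ local updates between consecutive averaging rounds, can be illustrated by a generic local-SGD-with-momentum sketch. This is an assumption-laden toy, not the LMT algorithm itself (LMT's exact update rules, its momentum tracking correction, and the LCA step are not given here); the function name and hyperparameters are hypothetical.

```python
import numpy as np

def local_updates_round(x, grads, Q, lr=0.1, beta=0.9):
    """One communication round in the generic local-update pattern:
    each agent performs Q local momentum-SGD steps without communicating,
    then all agents average their iterates (one communication).
    NOTE: illustrative sketch only; NOT the authors' LMT method."""
    n = len(x)
    m = [np.zeros_like(xi) for xi in x]   # local momentum buffers
    for _ in range(Q):                    # Q local updates, no communication
        for i in range(n):
            g = grads[i](x[i])            # agent i's local (stochastic) gradient
            m[i] = beta * m[i] + g        # momentum accumulation
            x[i] = x[i] - lr * m[i]       # local descent step
    avg = sum(x) / n                      # single communication: global averaging
    return [avg.copy() for _ in range(n)]
```

For example, with two agents holding quadratics $f_i(x)=(x-c_i)^2$, repeated rounds drive the shared iterate toward the mean of the $c_i$; raising $Q$ trades more local computation for fewer communication rounds, which is the overhead reduction the abstract refers to.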