Federated learning is inherently hampered by data heterogeneity: non-i.i.d. training data across local clients. We propose FLex&Chill, a novel model training approach for federated learning that exploits the Logit Chilling method. Through extensive evaluations, we demonstrate that, in the presence of the non-i.i.d. data characteristics inherent in federated learning systems, this approach can expedite model convergence and improve inference accuracy. Quantitatively, our experiments show up to a 6× improvement in global federated learning model convergence time and up to a 3.37% improvement in inference accuracy.
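As a point of reference, temperature scaling of logits offers one way to picture what Logit Chilling does. The minimal sketch below assumes it amounts to dividing logits by a temperature below 1 before the softmax during local training; the function name `chilled_cross_entropy` and the temperature value 0.5 are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def chilled_cross_entropy(logits: torch.Tensor,
                          targets: torch.Tensor,
                          temperature: float = 0.5) -> torch.Tensor:
    """Cross-entropy over temperature-scaled logits.

    A temperature below 1 ("chilling") sharpens the softmax
    distribution; the value 0.5 is an illustrative assumption,
    not a setting taken from the paper.
    """
    return F.cross_entropy(logits / temperature, targets)

# Usage: a batch of 4 samples over 10 classes.
logits = torch.randn(4, 10)
targets = torch.randint(0, 10, (4,))
loss = chilled_cross_entropy(logits, targets)
```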