We study the problem of estimating the mean of a distribution in high dimensions when either the samples are adversarially corrupted or the distribution is heavy-tailed. Recent developments in robust statistics have established efficient and (near) optimal procedures for both settings. However, the algorithms developed on each side tend to be sophisticated and do not directly transfer to the other, with many of them having ad-hoc or complicated analyses. In this paper, we provide a meta-problem and a duality theorem that lead to a new unified view on robust and heavy-tailed mean estimation in high dimensions. We show that the meta-problem can be solved either by a variant of the Filter algorithm from the recent literature on robust estimation or by the quantum entropy scoring scheme (QUE), due to Dong, Hopkins and Li (NeurIPS '19). By leveraging our duality theorem, these results translate into simple and efficient algorithms for both robust and heavy-tailed settings. Furthermore, the QUE-based procedure has run-time that matches the fastest known algorithms on both fronts. Our analysis of Filter is through the classic regret bound of the multiplicative weights update method. This connection allows us to avoid the technical complications in previous works and improve upon the run-time analysis of a gradient-descent-based algorithm for robust mean estimation by Cheng, Diakonikolas, Ge and Soltanolkotabi (ICML '20).
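The abstract's analysis of Filter rests on the classic regret bound of the multiplicative weights update (MWU) method. As a minimal illustration of that bound in isolation (a hypothetical toy setup over abstract "experts", not the paper's estimation algorithm), the following sketch runs MWU on random losses in [0, 1] and checks that the algorithm's cumulative loss stays within eta*T + ln(n)/eta of the best fixed expert:

```python
import math
import random

def mwu(losses, eta):
    """Multiplicative weights update over n experts.

    losses: T rounds, each a list of n losses in [0, 1].
    Returns (algorithm's cumulative expected loss, best expert's loss).
    """
    n = len(losses[0])
    w = [1.0] * n
    alg_loss = 0.0
    for m in losses:
        total = sum(w)
        p = [wi / total for wi in w]          # play the normalized weights
        alg_loss += sum(pi * mi for pi, mi in zip(p, m))
        w = [wi * (1.0 - eta * mi) for wi, mi in zip(w, m)]  # penalize lossy experts
    best = min(sum(m[i] for m in losses) for i in range(n))
    return alg_loss, best

random.seed(0)
T, n = 200, 10
losses = [[random.random() for _ in range(n)] for _ in range(T)]
eta = min(0.5, math.sqrt(math.log(n) / T))    # standard tuning of the step size
alg, best = mwu(losses, eta)
# Classic regret bound: alg_loss <= best + eta*T + ln(n)/eta
assert alg <= best + eta * T + math.log(n) / eta
```

In the robust-estimation context, the "experts" correspond to sample points and the losses to outlier scores; the regret bound is what lets the abstract's analysis avoid the ad-hoc potential arguments of earlier Filter analyses.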