This paper examines the performance of ridge regression in reproducing kernel Hilbert spaces in the presence of noise that possesses only a finite number of higher moments. Building on the well-known integral operator framework, we establish excess risk bounds consisting of a subgaussian term and a polynomial term. The dominant subgaussian component allows us to achieve convergence rates that were previously derived only under subexponential noise, a prevalent assumption in related work over the last two decades. These rates are optimal under standard eigenvalue decay conditions, demonstrating the asymptotic robustness of regularized least squares against heavy-tailed noise. Our derivations rest on a Fuk-Nagaev inequality for Hilbert-space-valued random variables.
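To make the object of study concrete, the following is a minimal sketch of the kernel ridge regression estimator the abstract refers to, i.e. regularized least squares in an RKHS. The Gaussian kernel, the Student-t noise used to mimic a heavy-tailed (finite-moment) setting, and all parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def kernel_ridge_fit_predict(X_train, y_train, X_test, lam=1e-2, gamma=1.0):
    """Kernel ridge regression with a Gaussian kernel.

    Solves the regularized least-squares problem in the RKHS via its
    finite-sample representer form: alpha = (K + n*lam*I)^{-1} y,
    then predicts f(x) = sum_i alpha_i k(x, x_i).
    """
    def gauss_kernel(A, B):
        # Pairwise squared distances, then Gaussian kernel values.
        sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * sq)

    n = len(X_train)
    K = gauss_kernel(X_train, X_train)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y_train)
    return gauss_kernel(X_test, X_train) @ alpha

# Illustrative use: heavy-tailed (Student-t, 3 degrees of freedom) noise,
# which has finite variance but no exponential moments.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, (200, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_t(df=3, size=200)
X_test = np.linspace(-1, 1, 50)[:, None]
pred = kernel_ridge_fit_predict(X, y, X_test, lam=1e-3, gamma=5.0)
```

The sketch only illustrates the estimator itself; the paper's contribution is the excess risk analysis of this estimator under such finite-moment noise, not the algorithm.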