We study least squares linear regression over $N$ uncorrelated Gaussian features that are selected in order of decreasing variance. When the number of selected features $p$ is at most the sample size $n$, the estimator under consideration coincides with the principal component regression estimator; when $p>n$, the estimator is the least $\ell_2$ norm solution over the selected features. We give an average-case analysis of the out-of-sample prediction error as $p,n,N \to \infty$ with $p/N \to \alpha$ and $n/N \to \beta$, for some constants $\alpha \in [0,1]$ and $\beta \in (0,1)$. In this average-case setting, the prediction error exhibits a "double descent" shape as a function of $p$. We also establish conditions under which the minimum risk is achieved in the interpolating ($p>n$) regime.
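The estimator under study can be sketched in a few lines of NumPy: select the first $p$ features (assumed pre-ordered by decreasing variance) and apply the pseudoinverse, which yields ordinary least squares when $p \le n$ and the minimum-$\ell_2$-norm interpolator when $p > n$. This is an illustrative sketch, not the paper's code; the function name and simulation parameters are hypothetical.

```python
import numpy as np

def selected_feature_ls(X, y, p):
    """Least squares over the first p columns of X (features ordered by
    decreasing variance). For p <= n this is ordinary least squares
    (principal component regression when features are uncorrelated);
    for p > n the pseudoinverse gives the minimum-l2-norm solution."""
    beta_p = np.linalg.pinv(X[:, :p]) @ y
    beta = np.zeros(X.shape[1])
    beta[:p] = beta_p
    return beta

# Illustrative simulation: n = 50 samples, N = 100 uncorrelated Gaussian
# features with decreasing variances.
rng = np.random.default_rng(0)
n, N = 50, 100
variances = np.linspace(2.0, 0.1, N)            # decreasing feature variances
X = rng.normal(size=(n, N)) * np.sqrt(variances)
beta_true = rng.normal(size=N)
y = X @ beta_true + rng.normal(size=n)

beta_under = selected_feature_ls(X, y, p=20)    # p < n: ordinary least squares
beta_over = selected_feature_ls(X, y, p=80)     # p > n: min-norm interpolator
```

In the overparameterized case ($p > n$), the selected design matrix has full row rank almost surely, so the min-norm solution interpolates the training data exactly.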